Will on 21 Dec 2018 14:57:55 -0800



Re: [PLUG] Git: net time gain or loss?


This discussion just went into the heart of devops land... and that is where things get scary.

-Will

On Fri, Dec 21, 2018, 17:55 Rich Freeman <r-plug@thefreemanclan.net> wrote:
On Fri, Dec 21, 2018 at 5:43 PM Fred Stluka <fred@bristle.com> wrote:
>
> On 12/21/18 4:44 PM, Rich Freeman wrote:
> >
> > Normally you want to check to make sure that your change doesn't
> > introduce a regression.  Maybe somebody changed an API for a function
> > you're using, in a file you didn't modify - git has no way of
> > detecting that, and since the call to that function only appears in
> > the changes you're committing, the other committer's change wouldn't
> > have touched your code or caused a merge conflict in your own file.
> >
>
> But it's not just a problem with automatic merges.  The same
> situations can occur with strictly sequential updates to a code
> base by multiple people, unless they all communicate very well
> with each other about the intent of every change they make.  So
> it can happen with either a distributed or a non-distributed VCS.

Sure, simply using git doesn't automatically protect you against these
sorts of issues if you're not actively looking for them.

However, the fact that git commits are atomic at the repository level
is a very useful property for catching these issues, in conjunction
with whatever other QA you impose.

Git doesn't guarantee that your commit works.  However, it does
guarantee that when somebody else checks out your commit they get the
exact state of the entire repository that you had when you made the
commit, no matter what else is going on, as long as you stick to
fast-forward pushes only.  If you and all your other committers are
doing systematic QA on each commit, then anybody bisecting the
repository should always land on working commits (at least as far as
your QA goes).

Now, you can of course achieve this with CI tools on other types of
repositories as an additional layer - have developers push to one
place, and then have CI test and publish known-good
snapshots/commits/whatever to a more controlled repository.
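A minimal sketch of that promotion flow (both repository names, the
branch name, and the qa.sh check are hypothetical): developers push
to a staging repo, and a "CI" step republishes a commit to a curated
repo only after QA passes:

```shell
#!/bin/sh
# Two bare repos: staging (developers push here), curated (CI publishes here).
set -e
work=$(mktemp -d); cd "$work"
git init -q --bare staging.git
git init -q --bare curated.git
git clone -q staging.git dev 2>/dev/null
cd dev
git config user.email dev@example.com; git config user.name dev
git checkout -qb main
printf 'exit 0\n' > qa.sh; chmod +x qa.sh   # stand-in QA check that always passes
git add .; git commit -qm "feature"; git push -q origin main
cd ..
git -C staging.git symbolic-ref HEAD refs/heads/main
# The "CI" step: check out what was pushed, run QA, publish only if green.
git clone -q staging.git ci 2>/dev/null
cd ci
./qa.sh && git push -q ../curated.git HEAD:refs/heads/main
```

Anyone pulling from the curated repo then only ever sees commits that
passed the checks.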

The cost of this is time - if your QA checks take time and you have a
lot of commits, then it becomes more likely that, by the time you go
to push, somebody else has pushed first.  On the flip side, you can do
a fetch and diff to see exactly what has changed, and potentially
simplify your QA.
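That fetch-and-diff step can be sketched like this (the shared repo,
the two clones, and the branch name are all invented for the example):

```shell
#!/bin/sh
# A shared bare repo and two clones: Bob inspects exactly what Alice
# pushed before re-running his QA and pushing himself.
set -e
base=$(mktemp -d); cd "$base"
git init -q --bare origin.git
git clone -q origin.git alice 2>/dev/null
cd alice
git config user.email alice@example.com; git config user.name alice
git checkout -qb main
echo base > file.txt; git add .; git commit -qm "base"; git push -q origin main
cd ..
git -C origin.git symbolic-ref HEAD refs/heads/main
git clone -q origin.git bob
cd bob
git config user.email bob@example.com; git config user.name bob
echo bob >> file.txt; git commit -qam "bob work"
# Meanwhile Alice lands another commit upstream:
( cd ../alice; echo alice > other.txt; git add .; git commit -qm "alice work"; git push -q origin main )
# Bob, before pushing: fetch and see exactly what changed since his base.
git fetch -q origin
git log --oneline HEAD..origin/main     # the commits others pushed meanwhile
git diff --stat HEAD origin/main        # the content delta worth re-QAing
git rebase -q origin/main               # replay local work on top...
git push -q origin main                 # ...so the push stays a fast-forward
```

Since the diff is against the exact upstream state, Bob can limit his
re-testing to what actually changed.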

--
Rich
___________________________________________________________________________
Philadelphia Linux Users Group         --        http://www.phillylinux.org
Announcements - http://lists.phillylinux.org/mailman/listinfo/plug-announce
General Discussion  --   http://lists.phillylinux.org/mailman/listinfo/plug
___________________________________________________________________________