On Wed, Feb 6, 2019 at 12:25, Simone Tiraboschi <stirabos(a)redhat.com> wrote:
>
>
> On Wed, Feb 6, 2019 at 11:17 AM Barak Korren <bkorren(a)redhat.com> wrote:
>
>>
>>
>> On Wed, 6 Feb 2019 at 11:57, Simone Tiraboschi <stirabos(a)redhat.com>
>> wrote:
>>
>>>
>>>
>>> On Wed, Feb 6, 2019 at 10:44 AM Barak Korren <bkorren(a)redhat.com>
>>> wrote:
>>>
>>>>
>>>>
>>>> On Wed, 6 Feb 2019 at 11:34, Simone Tiraboschi <stirabos(a)redhat.com>
>>>> wrote:
>>>>
>>>>>
>>>>>
>>>>> On Wed, Feb 6, 2019 at 10:23 AM Barak Korren <bkorren(a)redhat.com>
>>>>> wrote:
>>>>>
>>>>>>
>>>>>>
>>>>>> On Wed, 6 Feb 2019 at 11:15, Simone Tiraboschi <stirabos(a)redhat.com>
>>>>>> wrote:
>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Feb 6, 2019 at 10:00 AM Dan Kenigsberg <danken(a)redhat.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> On Wed, Feb 6, 2019 at 10:54 AM Simone Tiraboschi <
>>>>>>>> stirabos(a)redhat.com> wrote:
>>>>>>>> >
>>>>>>>> >
>>>>>>>> >
>>>>>>>> > On Wed, Feb 6, 2019 at 9:45 AM Dan Kenigsberg <danken(a)redhat.com>
>>>>>>>> > wrote:
>>>>>>>> >>
>>>>>>>> >> On Wed, Feb 6, 2019 at 10:16 AM Simone Tiraboschi <
>>>>>>>> >> stirabos(a)redhat.com> wrote:
>>>>>>>> >> >
>>>>>>>> >> >
>>>>>>>> >> >
>>>>>>>> >> > On Tue, Feb 5, 2019 at 7:07 PM Dafna Ron <dron(a)redhat.com>
>>>>>>>> >> > wrote:
>>>>>>>> >> >>
>>>>>>>> >> >> Hi,
>>>>>>>> >> >>
>>>>>>>> >> >> Please note that ovirt-ansible-hosted-engine-setup has a
>>>>>>>> >> >> versioning problem with the package, which is causing
>>>>>>>> >> >> bootstrap to fail for the upgrade suite [1].
>>>>>>>> >> >>
>>>>>>>> >> >> This is affecting all projects; it's been reported to the
>>>>>>>> >> >> developers and should be fixed as soon as possible.
>>>>>>>> >> >>
>>>>>>>> >> >> You can view CQ status here:
>>>>>>>> >> >>
>>>>>>>> >> >> https://jenkins.ovirt.org/view/Change%20queue%20jobs/job/ovirt-master_cha...
>>>>>>>> >> >>
>>>>>>>> >> >> [1] http://pastebin.test.redhat.com/708086
>>>>>>>> >>
>>>>>>>> >> It is unfair to refer to an internal pastebin here. It is also
>>>>>>>> >> not very sensible, as it is quite short.
>>>>>>>> >>
>>>>>>>> >> 2019-02-05 11:23:51,390-0500 ERROR
>>>>>>>> >> otopi.plugins.otopi.packagers.yumpackager yumpackager.error:85 Yum
>>>>>>>> >> [u'ovirt-hosted-engine-setup-2.3.5-0.0.master.20190205110929.gitfdbc215.el7.noarch
>>>>>>>> >> requires ovirt-ansible-hosted-engine-setup >= 1.0.10']
>>>>>>>> >> 2019-02-05 11:23:51,390-0500 DEBUG otopi.context
>>>>>>>> >> context._executeMethod:142 method exception
>>>>>>>> >> Traceback (most recent call last):
>>>>>>>> >>   File "/tmp/ovirt-6fV8LBWX5i/pythonlib/otopi/context.py", line 132,
>>>>>>>> >> in _executeMethod
>>>>>>>> >>     method['method']()
>>>>>>>> >>   File "/tmp/ovirt-6fV8LBWX5i/otopi-plugins/otopi/packagers/yumpackager.py",
>>>>>>>> >> line 248, in _packages
>>>>>>>> >>     self.processTransaction()
>>>>>>>> >>   File "/tmp/ovirt-6fV8LBWX5i/otopi-plugins/otopi/packagers/yumpackager.py",
>>>>>>>> >> line 262, in processTransaction
>>>>>>>> >>     if self._miniyum.buildTransaction():
>>>>>>>> >>   File "/tmp/ovirt-6fV8LBWX5i/pythonlib/otopi/miniyum.py", line 920,
>>>>>>>> >> in buildTransaction
>>>>>>>> >>     raise yum.Errors.YumBaseError(msg)
>>>>>>>> >> YumBaseError:
>>>>>>>> >> [u'ovirt-hosted-engine-setup-2.3.5-0.0.master.20190205110929.gitfdbc215.el7.noarch
>>>>>>>> >> requires ovirt-ansible-hosted-engine-setup >= 1.0.10']
>>>>>>>> >> 2019-02-05 11:23:51,391-0500 ERROR otopi.context
>>>>>>>> >> context._executeMethod:151 Failed to execute stage 'Package
>>>>>>>> >> installation':
>>>>>>>> >> [u'ovirt-hosted-engine-setup-2.3.5-0.0.master.20190205110929.gitfdbc215.el7.noarch
>>>>>>>> >> requires ovirt-ansible-hosted-engine-setup >= 1.0.10']
>>>>>>>> >> 2019-02-05 11:23:51,413-0500 DEBUG
>>>>>>>> >> otopi.plugins.otopi.debug.debug_failure.debug_failure
>>>>>>>> >> debug_failure._notification:100 tcp connections:
>>>>>>>> >>
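>>>>>>>> >> For reference, a minimal way to cross-check such a failure from a
>>>>>>>> >> shell on the host (a sketch; the repo id "ovirt-master-snapshot"
>>>>>>>> >> below is an assumption, adjust it to the repos actually configured):
>>>>>>>> >>
>>>>>>>> >> # what the freshly built hosted-engine-setup RPM requires
>>>>>>>> >> rpm -qp --requires ovirt-hosted-engine-setup-2.3.5-*.el7.noarch.rpm \
>>>>>>>> >>     | grep ovirt-ansible-hosted-engine-setup
>>>>>>>> >> # what the configured repos can actually provide
>>>>>>>> >> yum --disablerepo='*' --enablerepo=ovirt-master-snapshot \
>>>>>>>> >>     list available ovirt-ansible-hosted-engine-setup
>>>>>>>> >>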
>>>>>>>> >> >>
>>>>>>>> >> >
>>>>>>>> >> > The issue is that on GitHub we already have
>>>>>>>> >> > VERSION="1.0.10"
>>>>>>>> >> > as we can see in
>>>>>>>> >> >
>>>>>>>> >> > https://github.com/oVirt/ovirt-ansible-hosted-engine-setup/blob/master/bu...
>>>>>>>> >> >
>>>>>>>> >> > And this was bumped before the commit that is now reported as
>>>>>>>> >> > broken.
>>>>>>>> >> >
>>>>>>>> >> > CI instead is still building the package as 1.0.9, ignoring
>>>>>>>> >> > the commit that bumped the version.
>>>>>>>> >> > Honestly, I don't know how I can fix it if the version value
>>>>>>>> >> > is already the desired one in the source code.
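>>>>>>>> >> >
>>>>>>>> >> > A quick way to confirm which commit introduced the bump (a
>>>>>>>> >> > sketch, run from a clone of the repo):
>>>>>>>> >> >
>>>>>>>> >> > # find the commit(s) that changed the VERSION line in build.sh
>>>>>>>> >> > git log --oneline -S 'VERSION="1.0.10"' -- build.sh
>>>>>>>> >> > # show what build.sh declares at the current master HEAD
>>>>>>>> >> > git show origin/master:build.sh | grep '^VERSION'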
>>>>>>>> >>
>>>>>>>> >> I don't see your ovirt-ansible-hosted-engine-setup-1.0.10, only
>>>>>>>> >> https://plain.resources.ovirt.org/pub/ovirt-master-snapshot/rpm/el7/noarc...
>>>>>>>> >> Not even under "tested":
>>>>>>>> >> https://plain.resources.ovirt.org/repos/ovirt/tested/master/rpm/el7/noarc...
>>>>>>>> >>
>>>>>>>> >> Simone, can you double-check that its artifacts have been built
>>>>>>>> >> and have been accepted by the change queue?
>>>>>>>> >
>>>>>>>> >
>>>>>>>> > It has been built here once as 1.0.10:
>>>>>>>> >
>>>>>>>> > https://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_sta...
>>>>>>>> >
>>>>>>>> > Then, on the next commit, CI started building it again as 1.0.9,
>>>>>>>> > although in the source code we have 1.0.10; hence this issue.
>>>>>>>>
>>>>>>>> I don't understand the issue yet (that's not surprising, as I do
>>>>>>>> not know what that "ghpush" job is). Which CI job has built the
>>>>>>>> wrong version? Can you share its logs? Who owns it?
>>>>>>>>
>>>>>>>
>>>>>>> In the git log I see:
>>>>>>> commit b5a6c1db135d81d75f3330160e7ef4a84c97fd60 (HEAD -> master,
>>>>>>> upstream/master, origin/master, origin/HEAD, nolog)
>>>>>>> Author: Simone Tiraboschi <stirabos(a)redhat.com>
>>>>>>> Date: Tue Feb 5 10:56:58 2019 +0100
>>>>>>>
>>>>>>> Avoid using no_log when we have to pass back values to otopi
>>>>>>>
>>>>>>> commit 96974fad1ee6aee33f8183e49240f8a2a7a617d4
>>>>>>> Author: Simone Tiraboschi <stirabos(a)redhat.com>
>>>>>>> Date: Thu Jan 31 16:39:58 2019 +0100
>>>>>>>
>>>>>>> use dynamic inclusion to avoid tag inheritance
>>>>>>>
>>>>>>> commit 4a9a23fb8e88acba5af4febed43d9e4b02e7a2c5
>>>>>>> Author: Simone Tiraboschi <stirabos(a)redhat.com>
>>>>>>> Date: Thu Jan 31 15:15:04 2019 +0100
>>>>>>>
>>>>>>> Force facts gathering on partial executions
>>>>>>>
>>>>>>> commit 7428b54a5ba8458379b1a27d116f9504bb830e69
>>>>>>> Author: Simone Tiraboschi <stirabos(a)redhat.com>
>>>>>>> Date: Wed Jan 30 17:01:01 2019 +0100
>>>>>>>
>>>>>>> Use static imports and tags
>>>>>>>
>>>>>>> Fixes
>>>>>>> https://github.com/oVirt/ovirt-ansible-hosted-engine-setup/issues/20
>>>>>>> Requires
>>>>>>> https://github.com/oVirt/ovirt-ansible-engine-setup/pull/39
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> The version has been bumped to 1.0.10 in commit
>>>>>>> 7428b54a5ba8458379b1a27d116f9504bb830e69, since it introduces a
>>>>>>> backward-incompatible change and we need to track it.
>>>>>>>
>>>>>>> 7428b54a5ba8458379b1a27d116f9504bb830e69 failed CI tests due to an
>>>>>>> issue on a different package, found yesterday.
>>>>>>>
>>>>>>> So 7428b54a5ba8458379b1a27d116f9504bb830e69 got ignored, and now CI
>>>>>>> is building from commit b5a6c1db135d81d75f3330160e7ef4a84c97fd60 (the
>>>>>>> last one) rebased on something before
>>>>>>> 7428b54a5ba8458379b1a27d116f9504bb830e69, which is not what we have
>>>>>>> in git. So now, after b5a6c1db135d81d75f3330160e7ef4a84c97fd60 (the
>>>>>>> last commit), the package builds in CI as 1.0.9 although in the code
>>>>>>> we have 1.0.10; hence the issue.
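>>>>>>>
>>>>>>> A minimal sketch to check the ancestry claim from a clone (assuming
>>>>>>> origin points at the oVirt repository):
>>>>>>>
>>>>>>> # exits 0 if the version-bump commit is an ancestor of the tested one
>>>>>>> git merge-base --is-ancestor \
>>>>>>>     7428b54a5ba8458379b1a27d116f9504bb830e69 \
>>>>>>>     b5a6c1db135d81d75f3330160e7ef4a84c97fd60 && echo ancestor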
>>>>>>>
>>>>>>>
>>>>>> We never ignore commits, certainly not merged ones...
>>>>>>
>>>>>> We can fall back to older builds on system test failures and
throw
>>>>>> away newer build if we suspect they cause the failure, if which
case the
>>>>>> builds need to be resubmitted, but this logic happens at the
build leve not
>>>>>> the commit level, there is no commit reordering or dropping
anywhere.
>>>>>>
>>>>>>
I'd double-check the CI code for that:
>>>>>
>>>>> On the git side, in build.sh at commit
>>>>> b5a6c1db135d81d75f3330160e7ef4a84c97fd60 we have
>>>>> VERSION="1.0.10"
>>>>> as you can see in
>>>>>
>>>>> https://github.com/oVirt/ovirt-ansible-hosted-engine-setup/blob/b5a6c1db1...
>>>>>
>>>>> but, as you can see in
>>>>>
>>>>> https://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_sta...
>>>>>
>>>>> CI builds it as 1.0.9, ignoring commit
>>>>> 7428b54a5ba8458379b1a27d116f9504bb830e69,
>>>>> which was merged before b5a6c1db135d81d75f3330160e7ef4a84c97fd60.
>>>>>
>>>>>
I'm not sure if build 138 ran before or after
>>>> 7428b54a5ba8458379b1a27d116f9504bb830e69 was merged.
>>>> Since you merged the PR already, we cannot check what will happen if
>>>> we rerun the job now.
>>>>
>>>> I suspect you're simply looking at the output of a job that started
>>>> running before 7428b54a5ba8458379b1a27d116f9504bb830e69 was merged.
>>>>
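>>>> One way to check the timing (a sketch, run from a clone; note the
>>>> committer date only approximates the merge time if PRs get rebased):
>>>>
>>>> git show -s --format='%ci %h %s' 7428b54a5ba8458379b1a27d116f9504bb830e69
>>>> # then compare against the build's start time on its Jenkins page
>>>>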
>>>
On the master branch of
>>>
>>> https://github.com/oVirt/ovirt-ansible-hosted-engine-setup I have:
>>>
>>> b5a6c1d (HEAD -> master, upstream/master, origin/master, origin/HEAD,
>>> nolog) Avoid using no_log when we have to pass back values to otopi
>>> 96974fa use dynamic inclusion to avoid tag inheritance
>>> 4a9a23f Force facts gathering on partial executions
>>> 7428b54 Use static imports and tags
>>> 9267118 Fix rhbz1654697 (#108)
>>> 5e928a1 build: post ovirt-ansible-hosted-engine-setup-1.0.8
>>> 8607d90 (tag: 1.0.8) build: ovirt-ansible-hosted-engine-setup-1.0.8
>>>
In the console output of
>>>
>>> https://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_sta...
>>>
>>> I see
>>>
>>> *10:59:42* 2019-02-05 09:59:42,448:INFO:__main__:File
'automation/upstream_sources.yaml' cannot be opened*10:59:42* 2019-02-05
09:59:42,449:INFO:__main__:Executing command: 'git
--git-dir=/home/jenkins/workspace/oVirt_ovirt-ansible-hosted-engine-setup_standard-check-pr/ovirt-ansible-hosted-engine-setup/.git
--work-tree=/home/jenkins/workspace/oVirt_ovirt-ansible-hosted-engine-setup_standard-check-pr/ovirt-ansible-hosted-engine-setup
reset --hard'*10:59:42* 2019-02-05 09:59:42,472:DEBUG:__main__:Git exited with status:
0*10:59:42* 2019-02-05 09:59:42,472:DEBUG:__main__: ---- stderr ----*10:59:42* 2019-02-05
09:59:42,472:DEBUG:__main__: ---- stdout ----*10:59:42* 2019-02-05
09:59:42,472:DEBUG:__main__: HEAD is now at af00824 Merge
ceca0a852d94bd5af7226b15d15c9e3ff917cd5c into 9267118f456de46b9059e76b63bc85c18fcab5dd
>>>
>>>
Now, ceca0a852d94bd5af7226b15d15c9e3ff917cd5c is the last commit on my
>>> development branch on
>>>
>>> https://github.com/tiraboschi/ovirt-ansible-hosted-engine-setup
>>>
>>>
>>> commit ceca0a852d94bd5af7226b15d15c9e3ff917cd5c (origin/nolog)
>>> Author: Simone Tiraboschi <stirabos(a)redhat.com>
>>> Date: Tue Feb 5 10:56:58 2019 +0100
>>>
>>> Avoid using no_log when we have to pass back values to otopi
>>>
>>>
which corresponds to b5a6c1d on master on
>>>
>>> https://github.com/oVirt/ovirt-ansible-hosted-engine-setup
>>>
>>> So "Avoid using no_log when we have to pass back values to otopi" got
>>> tested on top of 9267118 - Fix rhbz1654697 (#108), skipping
>>> 96974fa use dynamic inclusion to avoid tag inheritance
>>> 4a9a23f Force facts gathering on partial executions
>>> 7428b54 Use static imports and tags
>>> in the middle.
>>>
>>> The version bump is in 7428b54,
>>> so it builds as 1.0.9 and not as 1.0.10 as expected.
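>>>
>>> A minimal sketch of what the check-pr merge effectively produced,
>>> reproducible from a clone (assuming both the oVirt repo and the PR head
>>> commit have been fetched; the branch name "pr-test" is just an
>>> illustration):
>>>
>>> git checkout -b pr-test 9267118f456de46b9059e76b63bc85c18fcab5dd
>>> git merge ceca0a852d94bd5af7226b15d15c9e3ff917cd5c
>>> grep '^VERSION' build.sh   # per the above, still shows VERSION="1.0.9"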
>>>
>>>
>> This is a matter of timing - as I already said - I suspect the build
>> you're looking at happened before b5a6c1d was merged.
>>
>> In any case, the build you're seeing of the `check-pr` job has nothing to
>> do with what gets sent into system testing and checked with OST. The build
>> that gets checked is the one generated by the `on-ghpush` job, and it is
>> always built from merged commits in the branch itself. You can see the
>> last successful build has the right version:
>>
>> https://jenkins.ovirt.org/job/oVirt_ovirt-ansible-hosted-engine-setup_sta...
>>
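>> A quick way to verify what the last successful `on-ghpush` build
>> produced, using the standard Jenkins JSON API (the job path below is a
>> placeholder, since the full URL is truncated above):
>>
>> curl -s 'https://jenkins.ovirt.org/job/<on-ghpush-job>/lastSuccessfulBuild/api/json?tree=artifacts[fileName]'
>>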
>
> OK, and there we have
> ovirt-ansible-hosted-engine-setup-1.0.10-0.1.master.20190205155347.el7.noarch.rpm
> as expected.
> Now the issue is why the CI job fails due to the lack of
> ovirt-ansible-hosted-engine-setup >= 1.0.10
>
> Now I merged another patch to bump the version again to 1.0.11 just to
> re-trigger the whole flow.
> Honestly, I don't know what else I can do.
>
I don't think solving this was up to you. If the 1.0.11 build was removed
from CQ because of an infra issue, Dafna or the infra owner should have
simply re-added it.
If it's a matter of repo ordering or whitelisting that causes an older
version to be made available during the run, it needs to be fixed in the
OST code.
It was available - that was not the issue.
Since the issue was in the upgrade suite only, it may have been the same
version with different hashing in our repos.
For now we have a different issue which is blocking this one, but if upping
the version does not help, I will delete the packages in the repo and
re-add them to CQ.
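
For what it's worth, a minimal sketch to spot a same-NVR/different-content
mismatch between two repos (the local paths "repo-a" and "repo-b" below are
hypothetical):

# compare checksums of the "same" RPM as served by each repo
sha256sum repo-a/ovirt-ansible-hosted-engine-setup-1.0.10-*.noarch.rpm \
          repo-b/ovirt-ansible-hosted-engine-setup-1.0.10-*.noarch.rpm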
>
>>
>>
>>
>>
>>>
>>>
>>>>
>>>>
>>>>>
>>>>>> --
>>>>>> Barak Korren
>>>>>> RHV DevOps team, RHCE, RHCi
>>>>>> Red Hat EMEA
>>>>>>
>>>>>> redhat.com | TRIED. TESTED. TRUSTED. | redhat.com/trusted
>>>>>>
>>>>>
>>>>
>>>> --
>>>> Barak Korren
>>>> RHV DevOps team, RHCE, RHCi
>>>> Red Hat EMEA
>>>>
>>>> redhat.com | TRIED. TESTED. TRUSTED. | redhat.com/trusted
>>>>
>>>
>>
>> --
>> Barak Korren
>> RHV DevOps team, RHCE, RHCi
>> Red Hat EMEA
>>
>> redhat.com | TRIED. TESTED. TRUSTED. | redhat.com/trusted
>>
>