[ovirt-devel] OST Failure - Weekly update [/04/2018-20/04/2018]

Dafna Ron dron at redhat.com
Mon Apr 30 16:59:07 UTC 2018


On Mon, Apr 30, 2018 at 3:55 PM, Michal Skrivanek <
michal.skrivanek at redhat.com> wrote:

>
>
> On 30 Apr 2018, at 15:29, Arik Hadas <ahadas at redhat.com> wrote:
>
>
>
> On Mon, Apr 30, 2018 at 4:15 PM, Dafna Ron <dron at redhat.com> wrote:
>
>>
>>
>> On Fri, Apr 27, 2018 at 5:57 PM, Yaniv Kaul <ykaul at redhat.com> wrote:
>>
>>>
>>>
>>> On Fri, Apr 27, 2018 at 7:34 PM, Dafna Ron <dron at redhat.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> I wanted to give a short status on this week's failures and OST current
>>>> status.
>>>>
>>>> I am glad to report that the issue with CQ alerts was resolved thanks
>>>> to Barak and Evgheni.
>>>> You can read more about the issue and how it was resolved here:
>>>> https://ovirt-jira.atlassian.net/browse/OVIRT-1974
>>>>
>>>
>>> How was the VM2 high-performance issue with vNUMA and pinning to more than
>>> one host (a race) solved?
>>>
>>
>> I was not seeing it all week; however, we just had a failure for that
>> today: http://jenkins.ovirt.org/job/ovirt-master_change-queue-tester/7169/
>>
>>
>>
>>>
>>>
>>>>
>>>>
>>>> Currently we have one ongoing possible regression, which was reported
>>>> to the list and to Arik.
>>>>
>>>
> Dafna,
> how can we see that the error is consistent and triggered by this patch?
> Are there other builds passing after this failure?
> I see some green builds afterwards, with "(ovirt-engine)"; does it mean
> the error is not happening all the time?
>
> Thanks,
> michal
>


Hi Michal,

If I think a change may be related, I report it to the developer. If the
developer thinks his change did not cause the failure, one way to debug it is
to re-add the change to CQ; if it fails again on the same test, the developer
should take a closer look.
CQ tries to isolate failures so that they do not break subsequent tests, which
is why you are seeing green builds after the failure.

I re-added this change today at 11:30 and indeed it passed:
http://jenkins.ovirt.org/view/Change%20queue%20jobs/job/ovirt-master_change-queue-tester/7190/
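The isolation behavior described above (CQ testing batches of changes and narrowing a failure down to a single suspect) can be sketched as a simple bisection. This is a hypothetical illustration only, not the actual oVirt CQ implementation; `suite_passes` stands in for a real OST suite run:

```python
# Hypothetical sketch of how a change queue (CQ) could isolate a failing
# change by bisection. Function names are illustrative, not the real CQ API.
from typing import Callable, List


def isolate_failure(changes: List[str],
                    suite_passes: Callable[[List[str]], bool]) -> List[str]:
    """Bisect a batch of queued changes down to the culprit(s).

    suite_passes(batch) runs the test suite against a batch of changes
    and returns True when the suite is green.
    """
    if suite_passes(changes):
        return []            # whole batch is green: no culprit here
    if len(changes) == 1:
        return changes       # single failing change: the culprit
    mid = len(changes) // 2
    # Test each half independently. Note that a flaky (racy) failure may
    # vanish during bisection, which is why re-adding a suspect change,
    # as described above, is a useful way to confirm or clear it.
    return (isolate_failure(changes[:mid], suite_passes)
            + isolate_failure(changes[mid:], suite_passes))
```

A sketch like this also shows the limitation discussed in this thread: if the failure is intermittent, the suite may pass during bisection and the change gets cleared even though it is related.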

Thanks,
Dafna





>
>
> the change reported: https://gerrit.ovirt.org/#/c/89852/ - examples:
>>>> upload ova as a virtual machine template.
>>>> You can view the details in this Jira: https://ovirt-jira.atlassian.net/browse/OFT-648
>>>>
>>>
>>> I don't understand how that *example* script could have caused this
>>> regression - in a completely different scenario (virt-sparsify fails
>>> because of a libguestfs issue).
>>> Y.
>>>
>>
>> I noticed the "example", but since it was reported as a failure I wanted
>> to make sure nothing in the "example" caused the failure, which is why I
>> sent it to the list and asked Arik to have a look.
>>
>
> No, that sdk-example could not have caused it.
>
>
>>
>>
>>>
>>>>
>>>>
>>>> The majority of issues we had this week were failed-build artifacts for
>>>> fc27. There were two different cases: one was reported to Francesco, who
>>>> was already working on a fix, and the second started and was resolved
>>>> during the evening/night of Apr 26-27.
>>>> You can see the Jira to these two issues here:
>>>>
>>>> https://ovirt-jira.atlassian.net/browse/OFT-605
>>>> https://ovirt-jira.atlassian.net/browse/OFT-612
>>>>
>>>> There was an infra issue with mirrors not being available for a few
>>>> minutes. The issue was momentary and resolved on its own.
>>>>
>>>> https://ovirt-jira.atlassian.net/browse/OFT-606
>>>>
>>>>
>>>>
>>>> *Below you can see the chart for this week's resolved issues by cause
>>>> of failure:*
>>>> *Code* = regression of working components/functionalities
>>>> *Infra* = infrastructure/OST infrastructure/Lago related issues/power
>>>> outages
>>>> *OST Tests* = package related issues, failed build artifacts
>>>>
>>>>
>>>>
>>>> [chart image scrubbed]
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> *Below is a chart showing failures by suite type:*
>>>> [chart image scrubbed]
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> *Below is a chart showing failures by version type:*
>>>> [chart image scrubbed]
>>>>
>>>>
>>>>
>>>>
>>>> *Below you can see the number of reported failures by resolution
>>>> status:*
>>>> [chart image scrubbed]
>>>> Thanks,
>>>> Dafna
>>>>
>>>>
>>>> _______________________________________________
>>>> Devel mailing list
>>>> Devel at ovirt.org
>>>> http://lists.ovirt.org/mailman/listinfo/devel
>>>>
>>>
>>>
>>
>>
>
>
>
>

