On 02/03, Sandro Bonazzola wrote:
On 20/01/2015 16:50, David Caro wrote:
>
> Hi everyone!
>
> After talking a bit with some of you, I think that we can start
> planning a common build and dependency declaration for tests for the
> ovirt products, to improve and automate most of the ci process and
> maintenance.
>
>
>
> Current status:
>
> == Dependencies
>
> Right now we have 4 types of dependencies:
>
> * test dependencies
> * tarball/srcrpm build dependencies
> * rpm build dependencies
> * installation dependencies
>
> The last two are managed from the spec files through the rpm/yum
> dependency systems, but the first two are managed manually in the
> jenkins jobs or puppet manifests. That separates them from the code
> that actually requires them and adds an extra layer of maintenance and
> synchronization between the code, the jenkins jobs and the puppet
> repository.
>
>
> == Builds
>
> We started using autotools to build most of the projects, but it's
> not a global methodology, and even on the projects that use it you
> need to tune the run for each of them, specifying different variables
> and running some side scripts.
>
>
> == Tests
>
> Some projects use make check to run some of the tests; other tests are
> totally outside the code and run only in jenkins jobs.
>
>
>
> Some possible improvements:
>
> == Tests/builds
>
> Using shell scripts:
> We talked before in another thread about creating some generic scripts
> to build the artifacts for the product and to run the tests; namely, we
> talked about having 3 executables (bash scripts probably) at the root
> of each project, that should not require any parameters:
>
> ./build-artifacts
> This should generate any artifacts to be archived (isos, rpms,
> debs, tarballs, ...) and leave them in the ./exported-artifacts/
> directory for the build system to collect, removing any
> previous artifacts if needed.
>
> ./check_patch
> Runs all the tests required for any new patchset in gerrit, for
> non-merged changes; should be fast to run, to get feedback easily.
>
> ./check_merge
> Runs all the tests for any change that is going to be merged
> (right now we are not using gates, so it actually runs after merge,
> but the idea is to use this as a gate for any merge to the
> repo). This can be more resource hungry than check_patch.
>
> That way it will easily let you use any framework that you want
> for your project, but still let it be easy to maintain in the global
> ci/build system (you can use pip, tox, maven, gradle, autotools, make,
> rake, ...). This will not at first allow running tests in parallel in
> jenkins, but we can add that possibility in the future (for example,
> allowing the user to define more than one check script, like
> check_patch.mytest1 and check_patch.mytest2, and making jenkins run them
> in parallel).
> I started a POC of this process here [1].
>
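[A sketch of what such a build-artifacts script might do, following the contract above; the project name, the build step (a plain source tarball) and the dummy source tree are assumptions for illustration, demonstrated in a throwaway directory rather than a real project root:]

```shell
#!/bin/bash
# Hypothetical build-artifacts sketch: regenerate ./exported-artifacts/
# and drop the build products there for the CI system to collect.
set -e

cd "$(mktemp -d)"                 # stand-in for the project root
mkdir src && echo 'main' > src/main.c

ARTIFACTS_DIR="exported-artifacts"

# Remove anything left over from a previous run, so the build system
# only picks up artifacts produced by this run.
rm -rf "$ARTIFACTS_DIR"
mkdir -p "$ARTIFACTS_DIR"

# Build whatever the project ships; here a plain source tarball.
tar czf "$ARTIFACTS_DIR/myproject.tar.gz" --exclude="./$ARTIFACTS_DIR" .

ls -1 "$ARTIFACTS_DIR"            # prints: myproject.tar.gz
```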
> Using travis yaml files:
> Using a travis compliant yaml file [2]. That will be less flexible
> than the above solution and will not allow you to run tests in
> parallel, though it will let you use travis at any point to offload
> our ci if needed.
>
>
> == Dependencies
>
> Using plain text files:
> Similar to the above scripts solution, I thought of adding an extra
> file with the same name to declare the dependencies needed to run that
> script, adding a suffix in case of different requirements for
> different distros (matching facter fact strings), for example:
>
> ./build-artifacts.req
> Main requirements file, used if no more specific one is
> found. A newline separated list of packages to install in
> the environment (jenkins will take care of which package
> manager to use).
>
> ./build-artifacts.req.fc20
> Specific requirements for the fc20 environment; replaces the
> general one if present.
>
> And the same for the other scripts (check_patch and check_merge).
>
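[A minimal sketch of how a CI runner could pick the right requirements file, preferring the distro-specific suffix over the generic one; the helper name and the fc20 value are assumptions, not part of any real runner:]

```shell
#!/bin/bash
# Hypothetical selection logic: given a script name and a distro string
# (e.g. from a facter fact), prefer <script>.req.<distro> over <script>.req.

pick_req_file() {
    local script="$1" distro="$2"
    if [ -f "${script}.req.${distro}" ]; then
        echo "${script}.req.${distro}"
    else
        echo "${script}.req"
    fi
}

# Demo: generic file only, then add a distro-specific override.
cd "$(mktemp -d)"
printf 'pyflakes\npython-nose\n' > build-artifacts.req
pick_req_file build-artifacts fc20     # -> build-artifacts.req
printf 'pyflakes\npython-nose\n' > build-artifacts.req.fc20
pick_req_file build-artifacts fc20     # -> build-artifacts.req.fc20

# The runner would then feed the newline separated list to the distro's
# package manager, e.g.:
#   xargs yum install -y < "$(pick_req_file build-artifacts fc20)"
```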
> Using travis yaml file:
> Using a travis compliant yaml file with some extensions to declare
> the different dependencies for each type and os/distro. That will
> allow you to have only one extra file in your repo, though you'd have
> to duplicate some of the requirements as travis only has ubuntu and
> forces you to run scripts to install dependencies.
>
>
> What do you think? Do you have any better idea?


I would prefer to have the above scripts in something like a jenkins,
automation or build sub-directory.
Other than that, no better idea.

As long as all the projects use the same path and name for them, I have no
issue with putting them somewhere else.
I don't think jenkins is a good name, but automation is a nice one.
build is too common, used by a lot of different tools, better avoid it.
So can we agree to use an automation directory?


>
>
> ps. About using an external repository to store the
> scripts/requirements for the code: the issue with this is that it
> forces you to bind a code change in the code repo to a
> script/dependency change in the scripts repo, and that adds a lot of
> extra maintenance and sources of issues and failures. If you know a way
> of doing it like that without all the fuss, I'd love to hear it.
>
> For example, imagine that you have vdsm and the dependencies are in
> another repo. Now you send a patch to vdsm that requires a
> specific pep8 version to pass the patch tests, so you have to change
> the scripts repo to add that dependency, but doing that you will break
> all the other patches' tests because they require the older pep8
> version, so you have to somehow specify in the vdsm patch that you
> require a specific commit from the scripts repo to be tested with...
>
> Having both in the same repo allows you to do the code change and the
> dependency/script change in the same patchset, and test it right away
> with the correct scripts/deps.
>
> It also binds together code and tests to some point, which is nice to
> have from a product view, because you know for each version which tests
> it passed, and you have a better idea of the possible failures for that
> version.
>
>
> [1] http://gerrit.ovirt.org/#/admin/projects/repoman
> [2] http://docs.travis-ci.com/
>
>
>
> _______________________________________________
> Infra mailing list
> Infra(a)ovirt.org
> http://lists.ovirt.org/mailman/listinfo/infra
>

-- 
Sandro Bonazzola
Better technology. Faster innovation. Powered by community collaboration.
See how it works at redhat.com
-- 
David Caro
Red Hat S.L.
Continuous Integration Engineer - EMEA ENG Virtualization R&D
Tel.: +420 532 294 605
Email: dcaro(a)redhat.com
Web: www.redhat.com
RHT Global #: 82-62605