[Scons-users] collecting groups of files as sources
Dirk Bächle
tshortik at gmx.de
Sun Jul 25 09:44:40 EDT 2021
Hi Gabe,
did you have a look at "Parts" already (https://pypi.org/project/scons-parts/)? I'm not saying that it will solve all your problems,
but it might offer a lot more structure to your builds and let you think in a more "module"-oriented way.
I'm not a user of "Parts", but if I remember correctly, building "modules" with different parameters (similar to SCons's own
"variant_dir" option) is very easy, and that seems to be what you're trying to accomplish.
Just my 2 cents.
Best regards,
Dirk
On 24.07.21 12:01, Gabe Black wrote:
> Hi, I'm a core developer on a large project which uses SCons as our build system (gem5.org), and I'm in the process
> of refactoring our build so that it has less custom code and defers more work to SCons itself.
>
> The biggest sticking points so far have generally been around gathering up collections of source files to be built into an
> executable. This problem crops up in at least two different flavors.
>
>
> ** Lists of sources **
>
> The first, simplest version is that our project is quite large and has many, many source files. It would be very impractical to try
> to put them all in one huge list to supply to a Program builder, and doing so would also design the modularity out of our build system:
> source files would no longer be handled locally, and would instead all have to be handled in one central clearing house. This
> would also break a mechanism we have which lets users add new collections of sources to the project to add their own components. An
> approximate analogy would be kernel modules. It would not be possible to add these extra directories without modifying the main
> project's source, since extending the code base would, by definition, mean modifying the huge list of source files.
>
> To solve this problem, we currently declare python objects which represent the source files but don't actually mean anything to
> SCons itself. These objects also carry "tags" so that they can say they should only be included if the project includes its built-in
> python interpreter, or for certain unit tests, or if it's being built as a complete executable instead of as a library.
>
> Then later, after all the SConscript files in subdirectories have been processed, we have a mechanism to generate lists of source
> files with and/or without various tags, and then we feed those into the actual Program, SharedLibrary, or StaticLibrary builders.
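>
> As a rough illustration (the class and helper names here are made up for the sketch, not our actual implementation), the
> mechanism amounts to something like:
>
> # Hypothetical sketch of the tag-carrying source registry described above.
> source_registry = []
>
> class SourceFile:
>     def __init__(self, path, tags=()):
>         # Record the path and its tags; nothing is handed to SCons yet.
>         self.path = path
>         self.tags = frozenset(tags)
>         source_registry.append(self)
>
> def sources_with_any_tags(*tags):
>     # Return the paths of all registered sources carrying any of the given tags.
>     wanted = frozenset(tags)
>     return [s.path for s in source_registry if s.tags & wanted]
>
> # In a subdirectory SConscript:
> SourceFile('cpu/simple_cpu.cc', tags=('main', 'lib'))
> SourceFile('cpu/simple_cpu_test.cc', tags=('unittest',))
>
> # Later, once every SConscript has been read, the real builders run, e.g.:
> # env.Program('foo.bin', sources_with_any_tags('main', 'lib', 'python'))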
>
> I would really rather have something more like this:
>
> Program('foo.bin', '${SOURCES.with_any_tags("main", "lib", "python")}')
>
> and have that construction variable be expanded *after* all the sources have been collected. It's relatively easy to centralize the
> declaration of our main binary or libraries, but it's not really possible for things like unit tests, which are scattered
> throughout the code base (near what they test) and can even come in through user additions.
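>
> For the centralized binaries, that deferral can be approximated today by reading every SConscript first and only invoking the
> builders afterwards in the top-level SConstruct, since SCons doesn't start building until all the SConscript files have been read.
> A sketch, building on the hypothetical registry above (the directory names are just illustrative):
>
> # SConstruct (sketch): read every SConscript first, then build from
> # whatever the subdirectories registered.
> env = Environment()
>
> # Each SConscript registers SourceFile objects as a side effect of being read.
> SConscript('src/cpu/SConscript', exports='env')
> SConscript('src/mem/SConscript', exports='env')
> # ... user-added component directories can be read here as well.
>
> # Only now, with the registry complete, are the real builders invoked.
> env.Program('foo.bin', sources_with_any_tags('main', 'lib', 'python'))
> env.StaticLibrary('foo', sources_with_any_tags('lib'))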
>
>
> ** Dependencies based on build products **
>
> Another, more complicated problem is that our project is a simulator, and the objects in the simulator are described using python.
> Our build actually imports these modules and then checks to see which different classes of simulation objects have been set up.
> Then each of these is used to generate additional c++ files which act as the glue between those python classes and the c++ classes
> that underlie them.
>
> The problem here is similar, in that we need to collect all the python modules, import them, generate a list of simulation
> objects, and then, based on that, generate a collection of .cc files which will be built into the simulator. Collecting the
> .py files is like the problem above, but beyond that we also need to run a build step and then, based on what happens there, add
> some number of .cc files to the build.
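>
> When the set of generated files is already known by the time the SConscripts are read (this sketch assumes one glue .cc per
> SimObject .py, and the generator body and file names are placeholders), the generation step can at least be expressed as an
> ordinary builder so SCons tracks the generated files like any other nodes:
>
> env = Environment()
>
> def generate_glue(target, source, env):
>     # A real implementation would import source[0] and emit the wrapper code;
>     # here we just write a stub so the sketch is self-contained.
>     for t in target:
>         with open(str(t), 'w') as f:
>             f.write('// generated glue code\n')
>     return 0
>
> glue_builder = Builder(action=generate_glue, suffix='.cc', src_suffix='.py')
> env.Append(BUILDERS={'SimObjectGlue': glue_builder})
>
> # One generated .cc per SimObject .py; the resulting nodes feed straight
> # into Program/StaticLibrary like any other source.
> glue = [env.SimObjectGlue(py) for py in ('objects/Cpu.py', 'objects/Cache.py')]
> env.Program('simulator.bin', ['main.cc'] + glue)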
>
> What we're doing now is that we just have a step in the SConscript which does all of that inline, and then we set up additional
> objects representing source files, like the ones I described above. This adds a decent amount of complex custom code to our build
> scripts, and another sequential element to the build process.
>
> Is there a way I can run the build step of importing all these .py files, and then add the extra .cc files to the build? I wasn't able to
> think of any way to do that given how SCons works, and this Stack Overflow post was the closest I've found from anybody else,
> although it looks like it's abusing internal interfaces, and I'm pretty reluctant to do anything like that in our production code:
>
> https://stackoverflow.com/questions/24671859/scons-how-to-generate-dependencies-after-some-targets-have-been-built
>
> Are scanners the right way to do this somehow? Does SCons have any sort of mechanism where it can re-scan nodes that have changed
> since the build started?
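>
> My understanding so far is that a scanner only reports extra *dependencies* for a node that is already in the build graph; it
> can't add new targets, which is the part I actually need. A toy example of the shape of one (not a real import resolver), in
> case I'm missing something:
>
> # Toy scanner: treat "import foo" lines in a .py file as dependencies on foo.py.
> import re
>
> def python_import_scan(node, env, path):
>     contents = node.get_text_contents()
>     modules = re.findall(r'^import\s+(\w+)', contents, re.MULTILINE)
>     return env.File(['%s.py' % m for m in modules])
>
> py_scanner = Scanner(function=python_import_scan, skeys=['.py'])
> env = Environment()
> env.Append(SCANNERS=py_scanner)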
>
>
> ** Multiple invocations? **
>
> Another big-hammer approach I'm considering is to write some sort of wrapper script which will just invoke SCons multiple times,
> once for each layer of dependencies in the build, and have it record the intermediate results someplace for it to rediscover between runs.
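>
> Roughly, the wrapper I have in mind would look something like this (the target names are just placeholders):
>
> #!/usr/bin/env python3
> # Sketch of the multi-invocation wrapper: run SCons once to produce the
> # generated sources, then a second time for the final binaries.
> import subprocess
> import sys
>
> def run(*targets):
>     result = subprocess.run(['scons'] + list(targets))
>     if result.returncode != 0:
>         sys.exit(result.returncode)
>
> run('generated-sources')  # first pass: import the .py files, emit the .cc glue
> run('simulator.bin')      # second pass: the new .cc files now exist on disk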
>
> This feels pretty clunky to me, and like something SCons should be handling for me. Is there a better way?
>
> Gabe
>
> _______________________________________________
> Scons-users mailing list
> Scons-users at scons.org
> https://pairlist4.pair.net/mailman/listinfo/scons-users
>