
Comment by adrian_b

1 day ago

For a static library it does not matter whether it is huge and complex, because you will typically link only a small number of its functions into your embedded application.

I have used a part of newlib with many different kinds of microcontrollers, and its build process has remained essentially the same as it was a quarter of a century ago, so the script that I wrote the first time, before 2000, has always worked without problems, regardless of the target CPU.

The only tricky part that I had to figure out the first time was how to split the compilation of the gcc cross-compiler into a part that is built before newlib and a part that is built after it.

However, that is not specific to newlib: it is the method that must be used when compiling a cross-gcc with any standard C library, and it has been simplified over the years, so that now there is little more to it than choosing the appropriate make targets when running the make commands.
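The split described above can be sketched roughly as follows. The target triple, install prefix, and source paths are placeholders, and the exact configure options vary between gcc and newlib versions, but the ordering and the make targets are the standard ones:

```shell
# Sketch of the classic two-stage cross-gcc + newlib build order.
# TARGET, PREFIX and the *-src paths are placeholders.
TARGET=arm-none-eabi
PREFIX=$HOME/cross

# 1. binutils first, so gcc can find a cross assembler and linker.
mkdir build-binutils && cd build-binutils
../binutils-src/configure --target=$TARGET --prefix=$PREFIX
make && make install
cd ..

# 2. The part of gcc built BEFORE newlib: the compiler proper and
#    libgcc, configured without any C library headers yet.
mkdir build-gcc && cd build-gcc
../gcc-src/configure --target=$TARGET --prefix=$PREFIX \
    --with-newlib --without-headers --enable-languages=c
make all-gcc all-target-libgcc
make install-gcc install-target-libgcc
cd ..

# 3. newlib itself, compiled with the freshly installed cross-gcc.
export PATH=$PREFIX/bin:$PATH
mkdir build-newlib && cd build-newlib
../newlib-src/configure --target=$TARGET --prefix=$PREFIX
make && make install
cd ..

# 4. The part of gcc built AFTER newlib: the remaining target
#    libraries, which need the installed C library headers.
cd build-gcc
make && make install
```

The key point is step 2: the `all-gcc` and `all-target-libgcc` targets build just enough of the compiler to compile newlib, after which the rest of gcc can be finished normally.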

I have never needed to change the build process of newlib for a new system; I have only needed to replace a few functions, for things like I/O peripherals or memory allocation. However, I have never used much of newlib, mostly only stdio and the memory/string functions.
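The "few functions to replace" are typically newlib's low-level stubs, such as `_write` (which stdio output goes through) and `_sbrk` (which malloc obtains memory through). A minimal host-runnable sketch, with the UART and the heap simulated by plain arrays so it can run anywhere; on real hardware `uart_putc` would write a memory-mapped data register and the heap bounds would come from linker-script symbols:

```c
#include <errno.h>
#include <stddef.h>

/* Stand-ins for a memory-mapped UART and a linker-provided heap region. */
static char   uart_log[256];
static size_t uart_pos;
static char   heap[4096];
static char  *heap_end = heap;

static void uart_putc(char c)
{
    /* Real code would poll a TX-ready flag, then write the data register. */
    if (uart_pos < sizeof uart_log)
        uart_log[uart_pos++] = c;
}

/* newlib routes printf()/puts() through _write(). */
int _write(int file, char *ptr, int len)
{
    (void)file;
    for (int i = 0; i < len; i++)
        uart_putc(ptr[i]);
    return len;
}

/* newlib's malloc() obtains raw memory through _sbrk(). */
void *_sbrk(int incr)
{
    if (heap_end + incr > heap + sizeof heap) {
        errno = ENOMEM;
        return (void *)-1;
    }
    char *prev = heap_end;
    heap_end += incr;
    return prev;
}
```

The stub names and signatures follow the retargeting list in newlib's documentation; everything else here is a simplified stand-in for board-specific code.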

> it does not matter whether it is huge and complex

I was talking about the migration effort and usage complexity, not about what the compiler or linker actually sees. It may well be that Newlib can be configured for every conceivable application, but it was more important to me not to have such a behemoth, a bag full of surprises, in the project, with preprocessor rules and dependencies that a single developer can hardly understand or keep track of. My solution is lean, complete, and works with standard-conforming compilers on every platform I need it on.

  • The standard C library does not belong in any project; it is normally shared, together with the cross-compilers, linkers and other tools, by all projects that target a certain kind of hardware architecture.

    So whatever preprocessor rules and dependencies may be needed to build the tool chain, they have no influence on the build process of the software projects that use the tool chain to develop applications.

    The tool chain is rebuilt only when new tool versions become available, not during the development of applications.

    I assume that you have encountered problems because you wanted to build newlib with something other than gcc + binutils, with which it can be built immediately, as delivered.

    Even if for some weird reason the use of gcc must be avoided for the intended application, that should not have required a newlib compiled with something other than gcc, as the gcc-built newlib should link without problems with any other ELF object files.
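    For instance, an object file produced by clang for the same bare-metal target can be linked against the gcc-built newlib without rebuilding anything; the target triple and CPU below are placeholders:

    ```shell
    # Compile the application with clang, targeting the bare-metal triple.
    clang --target=arm-none-eabi -mcpu=cortex-m4 -ffreestanding -c app.c -o app.o

    # Link with the gcc cross-toolchain, which pulls in its own newlib.
    arm-none-eabi-gcc app.o -o app.elf --specs=nosys.specs
    ```

    Since both tools emit standard ELF objects for the same ABI, the linker does not care which compiler produced which file.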