List: kde-commits
Subject: [kdepim] unsermake
From: Allen Winter <allen.winter () kdab ! com>
Date: 2011-01-19 23:45:55
Message-ID: 20110119234555.879D4A6092 () git ! kde ! org
Git commit 81aeacdd8561b0ed2ba42f62fb603b822b14af7f by Allen Winter
Pushed by winterz into branch e3_polish
Resurrect unsermake to see if we can make it work again here
unsermake should generate a faster set of Makefiles and it might also
obviate the need for automake at all.
A +340 -0 unsermake/COPYING [License: GPL (v2+)]
A +117 -0 unsermake/README [License: UNKNOWN]
A +23 -0 unsermake/TODO [License: UNKNOWN]
A +1370 -0 unsermake/__init__.py [License: BSD]
A +1474 -0 unsermake/amfile.py [License: GENERATED FILE]
A +42 -0 unsermake/assembler.um [License: UNKNOWN]
A +142 -0 unsermake/autodeps.um [License: UNKNOWN]
A +29 -0 unsermake/autodirs.um [License: UNKNOWN]
A +104 -0 unsermake/check.um [License: UNKNOWN]
A +40 -0 unsermake/checkrcs.um [License: UNKNOWN]
A +80 -0 unsermake/dcopidl.um [License: UNKNOWN]
A +182 -0 unsermake/designer.um [License: UNKNOWN]
A +35 -0 unsermake/destdir.um [License: UNKNOWN]
A +50 -0 unsermake/dist.um [License: UNKNOWN]
A +239 -0 unsermake/doc/example.obj [License: UNKNOWN]
A +144 -0 unsermake/doc/unsermake-talk.txt [License: UNKNOWN]
A +157 -0 unsermake/docs.um [License: UNKNOWN]
A +44 -0 unsermake/flex.um [License: UNKNOWN]
A +115 -0 unsermake/handlerbase.py [License: UNKNOWN]
A +163 -0 unsermake/icons.um [License: UNKNOWN]
A +57 -0 unsermake/kcfg.um [License: UNKNOWN]
A +97 -0 unsermake/kdeinit.um [License: UNKNOWN]
A +110 -0 unsermake/manpages.um [License: UNKNOWN]
A +302 -0 unsermake/mocs.um [License: UNKNOWN]
A +116 -0 unsermake/pch.um [License: UNKNOWN]
A +125 -0 unsermake/pofiles.um [License: UNKNOWN]
A +559 -0 unsermake/program.py [License: GENERATED FILE]
A +45 -0 unsermake/rcc.um [License: UNKNOWN]
A +141 -0 unsermake/sourcehash.py [License: UNKNOWN]
A +293 -0 unsermake/target.py [License: UNKNOWN]
A +17 -0 unsermake/texinfo.um [License: UNKNOWN]
A +16 -0 unsermake/unsermake [License: UNKNOWN]
A +169 -0 unsermake/utilities.py [License: UNKNOWN]
A +61 -0 unsermake/yacc.um
http://commits.kde.org/ee6cc38b/81aeacdd8561b0ed2ba42f62fb603b822b14af7f
diff --git a/unsermake/COPYING b/unsermake/COPYING
new file mode 100644
index 0000000..3912109
--- /dev/null
+++ b/unsermake/COPYING
@@ -0,0 +1,340 @@
+ GNU GENERAL PUBLIC LICENSE
+ Version 2, June 1991
+
+ Copyright (C) 1989, 1991 Free Software Foundation, Inc.
+ 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+ Preamble
+
+ The licenses for most software are designed to take away your
+freedom to share and change it. By contrast, the GNU General Public
+License is intended to guarantee your freedom to share and change free
+software--to make sure the software is free for all its users. This
+General Public License applies to most of the Free Software
+Foundation's software and to any other program whose authors commit to
+using it. (Some other Free Software Foundation software is covered by
+the GNU Library General Public License instead.) You can apply it to
+your programs, too.
+
+ When we speak of free software, we are referring to freedom, not
+price. Our General Public Licenses are designed to make sure that you
+have the freedom to distribute copies of free software (and charge for
+this service if you wish), that you receive source code or can get it
+if you want it, that you can change the software or use pieces of it
+in new free programs; and that you know you can do these things.
+
+ To protect your rights, we need to make restrictions that forbid
+anyone to deny you these rights or to ask you to surrender the rights.
+These restrictions translate to certain responsibilities for you if you
+distribute copies of the software, or if you modify it.
+
+ For example, if you distribute copies of such a program, whether
+gratis or for a fee, you must give the recipients all the rights that
+you have. You must make sure that they, too, receive or can get the
+source code. And you must show them these terms so they know their
+rights.
+
+ We protect your rights with two steps: (1) copyright the software, and
+(2) offer you this license which gives you legal permission to copy,
+distribute and/or modify the software.
+
+ Also, for each author's protection and ours, we want to make certain
+that everyone understands that there is no warranty for this free
+software. If the software is modified by someone else and passed on, we
+want its recipients to know that what they have is not the original, so
+that any problems introduced by others will not reflect on the original
+authors' reputations.
+
+ Finally, any free program is threatened constantly by software
+patents. We wish to avoid the danger that redistributors of a free
+program will individually obtain patent licenses, in effect making the
+program proprietary. To prevent this, we have made it clear that any
+patent must be licensed for everyone's free use or not licensed at all.
+
+ The precise terms and conditions for copying, distribution and
+modification follow.
+
+ GNU GENERAL PUBLIC LICENSE
+ TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
+
+ 0. This License applies to any program or other work which contains
+a notice placed by the copyright holder saying it may be distributed
+under the terms of this General Public License. The "Program", below,
+refers to any such program or work, and a "work based on the Program"
+means either the Program or any derivative work under copyright law:
+that is to say, a work containing the Program or a portion of it,
+either verbatim or with modifications and/or translated into another
+language. (Hereinafter, translation is included without limitation in
+the term "modification".) Each licensee is addressed as "you".
+
+Activities other than copying, distribution and modification are not
+covered by this License; they are outside its scope. The act of
+running the Program is not restricted, and the output from the Program
+is covered only if its contents constitute a work based on the
+Program (independent of having been made by running the Program).
+Whether that is true depends on what the Program does.
+
+ 1. You may copy and distribute verbatim copies of the Program's
+source code as you receive it, in any medium, provided that you
+conspicuously and appropriately publish on each copy an appropriate
+copyright notice and disclaimer of warranty; keep intact all the
+notices that refer to this License and to the absence of any warranty;
+and give any other recipients of the Program a copy of this License
+along with the Program.
+
+You may charge a fee for the physical act of transferring a copy, and
+you may at your option offer warranty protection in exchange for a fee.
+
+ 2. You may modify your copy or copies of the Program or any portion
+of it, thus forming a work based on the Program, and copy and
+distribute such modifications or work under the terms of Section 1
+above, provided that you also meet all of these conditions:
+
+ a) You must cause the modified files to carry prominent notices
+ stating that you changed the files and the date of any change.
+
+ b) You must cause any work that you distribute or publish, that in
+ whole or in part contains or is derived from the Program or any
+ part thereof, to be licensed as a whole at no charge to all third
+ parties under the terms of this License.
+
+ c) If the modified program normally reads commands interactively
+ when run, you must cause it, when started running for such
+ interactive use in the most ordinary way, to print or display an
+ announcement including an appropriate copyright notice and a
+ notice that there is no warranty (or else, saying that you provide
+ a warranty) and that users may redistribute the program under
+ these conditions, and telling the user how to view a copy of this
+ License. (Exception: if the Program itself is interactive but
+ does not normally print such an announcement, your work based on
+ the Program is not required to print an announcement.)
+
+These requirements apply to the modified work as a whole. If
+identifiable sections of that work are not derived from the Program,
+and can be reasonably considered independent and separate works in
+themselves, then this License, and its terms, do not apply to those
+sections when you distribute them as separate works. But when you
+distribute the same sections as part of a whole which is a work based
+on the Program, the distribution of the whole must be on the terms of
+this License, whose permissions for other licensees extend to the
+entire whole, and thus to each and every part regardless of who wrote it.
+
+Thus, it is not the intent of this section to claim rights or contest
+your rights to work written entirely by you; rather, the intent is to
+exercise the right to control the distribution of derivative or
+collective works based on the Program.
+
+In addition, mere aggregation of another work not based on the Program
+with the Program (or with a work based on the Program) on a volume of
+a storage or distribution medium does not bring the other work under
+the scope of this License.
+
+ 3. You may copy and distribute the Program (or a work based on it,
+under Section 2) in object code or executable form under the terms of
+Sections 1 and 2 above provided that you also do one of the following:
+
+ a) Accompany it with the complete corresponding machine-readable
+ source code, which must be distributed under the terms of Sections
+ 1 and 2 above on a medium customarily used for software interchange; or,
+
+ b) Accompany it with a written offer, valid for at least three
+ years, to give any third party, for a charge no more than your
+ cost of physically performing source distribution, a complete
+ machine-readable copy of the corresponding source code, to be
+ distributed under the terms of Sections 1 and 2 above on a medium
+ customarily used for software interchange; or,
+
+ c) Accompany it with the information you received as to the offer
+ to distribute corresponding source code. (This alternative is
+ allowed only for noncommercial distribution and only if you
+ received the program in object code or executable form with such
+ an offer, in accord with Subsection b above.)
+
+The source code for a work means the preferred form of the work for
+making modifications to it. For an executable work, complete source
+code means all the source code for all modules it contains, plus any
+associated interface definition files, plus the scripts used to
+control compilation and installation of the executable. However, as a
+special exception, the source code distributed need not include
+anything that is normally distributed (in either source or binary
+form) with the major components (compiler, kernel, and so on) of the
+operating system on which the executable runs, unless that component
+itself accompanies the executable.
+
+If distribution of executable or object code is made by offering
+access to copy from a designated place, then offering equivalent
+access to copy the source code from the same place counts as
+distribution of the source code, even though third parties are not
+compelled to copy the source along with the object code.
+
+ 4. You may not copy, modify, sublicense, or distribute the Program
+except as expressly provided under this License. Any attempt
+otherwise to copy, modify, sublicense or distribute the Program is
+void, and will automatically terminate your rights under this License.
+However, parties who have received copies, or rights, from you under
+this License will not have their licenses terminated so long as such
+parties remain in full compliance.
+
+ 5. You are not required to accept this License, since you have not
+signed it. However, nothing else grants you permission to modify or
+distribute the Program or its derivative works. These actions are
+prohibited by law if you do not accept this License. Therefore, by
+modifying or distributing the Program (or any work based on the
+Program), you indicate your acceptance of this License to do so, and
+all its terms and conditions for copying, distributing or modifying
+the Program or works based on it.
+
+ 6. Each time you redistribute the Program (or any work based on the
+Program), the recipient automatically receives a license from the
+original licensor to copy, distribute or modify the Program subject to
+these terms and conditions. You may not impose any further
+restrictions on the recipients' exercise of the rights granted herein.
+You are not responsible for enforcing compliance by third parties to
+this License.
+
+ 7. If, as a consequence of a court judgment or allegation of patent
+infringement or for any other reason (not limited to patent issues),
+conditions are imposed on you (whether by court order, agreement or
+otherwise) that contradict the conditions of this License, they do not
+excuse you from the conditions of this License. If you cannot
+distribute so as to satisfy simultaneously your obligations under this
+License and any other pertinent obligations, then as a consequence you
+may not distribute the Program at all. For example, if a patent
+license would not permit royalty-free redistribution of the Program by
+all those who receive copies directly or indirectly through you, then
+the only way you could satisfy both it and this License would be to
+refrain entirely from distribution of the Program.
+
+If any portion of this section is held invalid or unenforceable under
+any particular circumstance, the balance of the section is intended to
+apply and the section as a whole is intended to apply in other
+circumstances.
+
+It is not the purpose of this section to induce you to infringe any
+patents or other property right claims or to contest validity of any
+such claims; this section has the sole purpose of protecting the
+integrity of the free software distribution system, which is
+implemented by public license practices. Many people have made
+generous contributions to the wide range of software distributed
+through that system in reliance on consistent application of that
+system; it is up to the author/donor to decide if he or she is willing
+to distribute software through any other system and a licensee cannot
+impose that choice.
+
+This section is intended to make thoroughly clear what is believed to
+be a consequence of the rest of this License.
+
+ 8. If the distribution and/or use of the Program is restricted in
+certain countries either by patents or by copyrighted interfaces, the
+original copyright holder who places the Program under this License
+may add an explicit geographical distribution limitation excluding
+those countries, so that distribution is permitted only in or among
+countries not thus excluded. In such case, this License incorporates
+the limitation as if written in the body of this License.
+
+ 9. The Free Software Foundation may publish revised and/or new versions
+of the General Public License from time to time. Such new versions will
+be similar in spirit to the present version, but may differ in detail to
+address new problems or concerns.
+
+Each version is given a distinguishing version number. If the Program
+specifies a version number of this License which applies to it and "any
+later version", you have the option of following the terms and conditions
+either of that version or of any later version published by the Free
+Software Foundation. If the Program does not specify a version number of
+this License, you may choose any version ever published by the Free Software
+Foundation.
+
+ 10. If you wish to incorporate parts of the Program into other free
+programs whose distribution conditions are different, write to the author
+to ask for permission. For software which is copyrighted by the Free
+Software Foundation, write to the Free Software Foundation; we sometimes
+make exceptions for this. Our decision will be guided by the two goals
+of preserving the free status of all derivatives of our free software and
+of promoting the sharing and reuse of software generally.
+
+ NO WARRANTY
+
+ 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
+FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
+OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
+PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
+OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
+MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
+TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
+PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
+REPAIR OR CORRECTION.
+
+ 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
+WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
+REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
+INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
+OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
+TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
+YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
+PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
+POSSIBILITY OF SUCH DAMAGES.
+
+ END OF TERMS AND CONDITIONS
+
+ How to Apply These Terms to Your New Programs
+
+ If you develop a new program, and you want it to be of the greatest
+possible use to the public, the best way to achieve this is to make it
+free software which everyone can redistribute and change under these terms.
+
+ To do so, attach the following notices to the program. It is safest
+to attach them to the start of each source file to most effectively
+convey the exclusion of warranty; and each file should have at least
+the "copyright" line and a pointer to where the full notice is found.
+
+ <one line to give the program's name and a brief idea of what it does.>
+ Copyright (C) <year> <name of author>
+
+ This program is free software; you can redistribute it and/or modify
+ it under the terms of the GNU General Public License as published by
+ the Free Software Foundation; either version 2 of the License, or
+ (at your option) any later version.
+
+ This program is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ GNU General Public License for more details.
+
+ You should have received a copy of the GNU General Public License
+ along with this program; if not, write to the Free Software
+ Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
+
+
+Also add information on how to contact you by electronic and paper mail.
+
+If the program is interactive, make it output a short notice like this
+when it starts in an interactive mode:
+
+ Gnomovision version 69, Copyright (C) year name of author
+ Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
+ This is free software, and you are welcome to redistribute it
+ under certain conditions; type `show c' for details.
+
+The hypothetical commands `show w' and `show c' should show the appropriate
+parts of the General Public License. Of course, the commands you use may
+be called something other than `show w' and `show c'; they could even be
+mouse-clicks or menu items--whatever suits your program.
+
+You should also get your employer (if you work as a programmer) or your
+school, if any, to sign a "copyright disclaimer" for the program, if
+necessary. Here is a sample; alter the names:
+
+ Yoyodyne, Inc., hereby disclaims all copyright interest in the program
+ `Gnomovision' (which makes passes at compilers) written by James Hacker.
+
+ <signature of Ty Coon>, 1 April 1989
+ Ty Coon, President of Vice
+
+This General Public License does not permit incorporating your program into
+proprietary programs. If your program is a subroutine library, you may
+consider it more useful to permit linking proprietary applications with the
+library. If this is what you want to do, use the GNU Library General
+Public License instead of this License.
diff --git a/unsermake/README b/unsermake/README
new file mode 100644
index 0000000..e86dcc0
--- /dev/null
+++ b/unsermake/README
@@ -0,0 +1,117 @@
+unsermake is written in Python, so it does not have to be compiled.
+To use unsermake, simply ensure that unsermake is in $PATH
+by setting something like:
+
+export PATH=~/projects/kde/kdenonbeta/unsermake:$PATH
+
+At this point building KDE modules as usual, starting with
+"unsermake -f Makefile.cvs" should work. You should see the text:
+
+*** YOU'RE USING UNSERMAKE.
+*** GOOD LUCK!! :)
+
+Keep in mind that unsermake is still unstable (but hey, you were brave enough
+to check out kdenonbeta, so you must like punishment) and may not work for
+all modules / projects at all times.
+
+There are some target forwards in the generated Makefiles that make it possible
+to call make install or make clean. Keep in mind that as soon as you need to
+pass options to unsermake, you can't rely on them and need to call unsermake
+directly. In kdesdk/scripts there is a makeobj script that makes this
+transparent if you alias make to makeobj.
+
+If you are compiling KDE 3.3 (or earlier) packages you should use the old
+unsermake. This is in cvs branch "old_unsermake".
+
+You can disable the use of unsermake in KDE modules by using
+UNSERMAKE=no make -f Makefile.cvs
+
+You can set the make variable VERBOSE (make VERBOSE=...) to modify the output
+of the generated Makefiles:
+- if it is not set (the default), messages are accumulated while the
+  process is running. If an error occurs, the command is output, followed
+  by the accumulated messages. If no error occurs, only the messages are
+  output.
+- if it is set, the behaviour is like regular automake output: the
+  command is output before it is started and messages appear as they
+  are generated. This mode of operation is the most natural one, but is
+  highly inappropriate for parallel builds and generally clutters the output.
+
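The default (VERBOSE unset) behaviour described above amounts to buffering a command's output and revealing the command line itself only on failure. A minimal Python sketch of that scheme, assuming nothing about unsermake's real internals (run_quiet is a hypothetical name, not unsermake code):

```python
import subprocess

def run_quiet(command):
    """Buffer-then-flush sketch: accumulate the command's messages while
    it runs; on error, print the command line first, then the accumulated
    messages; on success, print only the messages."""
    proc = subprocess.run(command, shell=True, capture_output=True, text=True)
    if proc.returncode != 0:
        print(command)                        # shown only when the command fails
    print(proc.stdout + proc.stderr, end="")  # the accumulated messages
    return proc.returncode
```

With VERBOSE set, the equivalent would be to print the command before starting it and stream its output directly, which is why that mode clutters parallel builds.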
+If you want to compile everything in a subdir but not in its subdirs, you
+used to do make -C amor all-am. This doesn't work with non-recursive
+make. But you can do unsermake -C amor SUBDIRS=
+
+If you want to hot fix something, you used to patch the generated Makefiles.
+This doesn't work because the generated Makefiles do not clone the Makefile.am
+variables. But for a hot fix you can copy the variables over from Makefile.am
+to Makefile and hot fix there (e.g. SUBDIRS line).
+
+
+Precompiled Header support:
+===========================
+
+Requirements: gcc >= 3.4
+
+The support is modelled after how it works in Qt's qmake. It works per
+target, by specifying a variable that says which header file to precompile
+and then forcibly include in all sources of the target. For example:
+
+myproject.h:
+#if defined(__cplusplus)
+#include <aheaderfilethatsioftenused.h>
+#include <anotherone.h>
+#include <noproblemtouseacoupleofthem.h>
+#include <usuallythemorethebetter.h>
+#include <althoughtoomuchofthemcanhurtperformance.h>
+#endif
+
+Assume this is part of a library called libfoo.la. Then in addition to variables
+like libfoo_la_SOURCES you specify libfoo_la_PCH = myproject.h. If you're lazy
+you can also specify libfoo_la_PCH = AUTO, which will make unsermake pick up
+header files for which a corresponding source file exists. For example:
+
+libfoo_la_SOURCES = blah.cpp hey.cpp
+libfoo_la_PCH = AUTO
+
+That'll make it precompile blah.h and hey.h (if they exist) and include them
+when compiling the _SOURCES. Note that AUTO may not always give you the best
+performance, though. Best is a wise choice of the most often used/included
+header files.
+
+To actually enable the pch support you need to have the unsermake_enable_pch
+automake conditional set. If you use KDE's admin/ infrastructure you get that
+by specifying --enable-pch with configure.
+
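The AUTO behaviour described above (precompile each header for which a matching source file exists) can be sketched like this; pick_auto_pch_headers is a hypothetical illustration, not a function from unsermake:

```python
import os

def pick_auto_pch_headers(sources, srcdir="."):
    """For every .cpp file in a _SOURCES list, select the corresponding .h
    for precompilation if it exists (mirrors libfoo_la_PCH = AUTO)."""
    headers = []
    for source in sources:
        base, ext = os.path.splitext(source)
        if ext != ".cpp":
            continue
        header = base + ".h"
        if os.path.exists(os.path.join(srcdir, header)):
            headers.append(header)
    return headers
```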
+Caveats:
+
+1) Keep in mind that by the inclusion of the precompiled headers some preprocessor
+ tricks simply don't work. kdelibs's -DQT_NO_TRANSLATION on the commandline and
+   the unsetting of it in some .cpp files is an example of that.
+
+2) If you have the same source file in multiple _SOURCES in a Makefile.am then you
+ can't use unsermake's PCH support for targets this source is compiled for. That
+   is because you would end up with two rules for the source with different dependencies
+   (one depending on the pch of the one target and the other on another pch (or none)).
+   A workaround is to use two different sources:
+
+ libfoo_la_SOURCES = yoyo.cpp
+ libblah_la_SOURCES = yoyo.cpp
+ libblah_la_PCH = AUTO
+
+ That has to become:
+
+ yoyo_foo.cpp: $(srcdir)/yoyo.cpp
+ -rm -f yoyo_foo.cpp
+ $(LN_S) $(srcdir)/yoyo.cpp yoyo_foo.cpp
+
+ yoyo_bar.cpp: $(srcdir)/yoyo.cpp
+ -rm -f yoyo_bar.cpp
+ $(LN_S) $(srcdir)/yoyo.cpp yoyo_bar.cpp
+
+ libfoo_la_SOURCES = yoyo_foo.cpp
+   libblah_la_SOURCES = yoyo_bar.cpp
+ libblah_la_PCH = AUTO
+
+   Note that this workaround breaks with srcdir == builddir when yoyo.cpp
+   includes yoyo.moc! :(
+
diff --git a/unsermake/TODO b/unsermake/TODO
new file mode 100644
index 0000000..989f1ed
--- /dev/null
+++ b/unsermake/TODO
@@ -0,0 +1,23 @@
+- kdebindings: various problems, including
+dcopjava/tests/Makefile.am: unknown source extension .java for test.class
+kdeobjc/kangaroo/Makefile.am: unknown source extension .m for kangaroo
+
+- kde-i18n got top level build trees as subdirs - not yet supported
+
+- comments aren't stripped out correctly. At least in INCLUDES
+
+- pics_DATA = *.png does not work - as it does in automake + srcdir=builddir, it's
+  nice to support it too
+
+- make nofinal
+
+- add better error message for recursion problems
+
+- add support for 'export'. Info:make Node: Variables/Recursion
+
+- Possible code cleanups:
+ * Command line parsing can be ported to optparse, perhaps
+
+- automake allows specifying nodist_ and dist_ prefixes (not used in
+ KDE fortunately). unsermake should at least ignore these:
+ dist_sbin_SCRIPTS = my_script
+
diff --git a/unsermake/__init__.py b/unsermake/__init__.py
new file mode 100644
index 0000000..748d68e
--- /dev/null
+++ b/unsermake/__init__.py
@@ -0,0 +1,1370 @@
+import os
+#os.environ['PYCHECKER'] = 'no-shadowbuiltin'
+#import pychecker.checker
+
+import string, re, os, types, stat
+
+import utilities, amfile
+import handlerbase
+from target import Target, PhonyConflict
+from sourcehash import SourceHash
+import target
+import program, time
+import sys, getopt, os
+
+
+verbose = False
+ignore_most = False
+max_children = 1
+max_compile_jobs = 0
+
+def create_makefiles(given_files):
+ given_dirs = []
+ for index in range(len(given_files)):
+ filename = given_files[index]
+ if not os.path.exists(filename + '.am'):
+ sys.stderr.write('%s.am does not exist!\n' % filename)
+ sys.exit(1)
+ given_files[index] = filename
+ filename = os.path.dirname(filename)
+ if not len(filename):
+ filename = '.'
+ given_dirs.append(filename)
+
+ makefiles = []
+ utilities.subst_vars, files, headers, auxdir = utilities.parse_autoconf()
+
+    # the remaining files, not being automake files
+ # these need to be remade with config.status
+ config_files = {}
+ config_headers = {}
+
+ for file in given_files:
+ if not file in files:
+ files.append(file)
+
+ for file in files:
+ dir = os.path.dirname(file)
+ if not len(dir):
+ dir = '.'
+
+ if len(given_dirs) and not dir in given_dirs:
+ continue
+
+ if os.path.exists( file + '.am'):
+ makefiles.append( file + '.am' )
+ elif os.path.exists( file + '.in') or file == 'MakeVars':
+ file = os.path.basename( file )
+ if config_files.has_key(dir):
+ config_files[dir].append( file )
+ else:
+ config_files[dir] = [ file ]
+ else:
+            utilities.print_error('%s appears in configure files, but has no .in file\n' % file)
+
+ for file in headers:
+ dir = os.path.dirname(file)
+ if not len(dir):
+ dir = '.'
+
+ file = os.path.basename( file )
+ if config_headers.has_key(dir):
+ config_headers[dir].append( file )
+ else:
+ config_headers[dir] = [ file ]
+
+ # the backslash is very bad to have as end of line ;/
+ if utilities.subst_vars.has_key("AMDEPBACKSLASH"):
+ del utilities.subst_vars["AMDEPBACKSLASH"]
+
+ if not utilities.subst_vars.has_key("DEPDIR"):
+ utilities.subst_vars['DEPDIR'] = '.deps'
+
+ utilities.subst_vars['depcomp'] = '$(SHELL) %s/depcomp\n' % auxdir
+ utilities.subst_vars['mkinstalldirs'] = '$(SHELL) %s/mkinstalldirs' % auxdir
+
+ # the following is a several pass thing. First we have to parse
+ # all, then we have to do overall checks and inserting of targets
+ # and then we can print the makefiles out
+ amfiles = []
+
+ stamp_cnt = 1
+ for file in makefiles:
+ makefile = amfile.AMFile(os.path.normcase(os.path.abspath(file)))
+ if config_files.has_key(makefile.subdir):
+ makefile.set_configure_files( config_files[ makefile.subdir] )
+ else:
+ makefile.set_configure_files( [] )
+ if config_headers.has_key(makefile.subdir):
+            stamp_cnt = makefile.set_configure_headers( config_headers[ makefile.subdir], stamp_cnt )
+
+ amfiles.append(makefile)
+
+ for targ in makefile.targets.values():
+ for line in targ._rules:
+ if string.find(line, '$<') >= 0:
+                    utilities.print_error("%s: Do not use $< rules. They are not portable (%s).\n" %
+                                          (makefile.filename, targ.target))
+
+ for makefile in amfiles:
+ makefile.create()
+ makefile.print_out(makefile.subdir in given_dirs)
+
+def overwrite_makefile(dest):
+ for var, define in utilities.environment_vars.items():
+ dest.del_define(var)
+ dest.add_define(var, define)
+
+def merged_makefiles(builddir, srcdir):
+
+ path = os.path.abspath(builddir + "/Makefile")
+ if not os.path.exists(path):
+ if os.path.exists(srcdir + "/Makefile.am"):
+ src = amfile.AMFile(path, create_empty=True)
+ # ignore configure content
+ src.add_makefile_rules()
+            if getmtime(srcdir + "/Makefile.am") > getmtime(utilities._topsrcdir + "/configure.in"):
+                raise OldConfigureIn(srcdir)
+        else:
+            if utilities.top_builddir_abs and builddir.startswith(utilities.top_builddir_abs):
+                path = builddir[ len(utilities.top_builddir_abs) + 1 : ]
+            utilities.print_error("No Makefile present in: %s!\n" % path)
+            return None
+ else:
+ src = amfile.AMFile(path)
+ src.del_define("top_srcdir")
+
+ if not os.path.exists(srcdir + "/Makefile.am"):
+ overwrite_makefile(src)
+ return src
+
+ dest = amfile.AMFile(os.path.abspath(srcdir + "/Makefile.am"))
+ dest.defs["subdir"] = [dest.subdir]
+ dest.defs["srcdir"] = ['$(top_srcdir)/' + dest.subdir]
+
+ for var, define in src.defs.items():
+ if var in ['subdir', 'srcdir']: # the better paths are in srcdir
+ continue
+ if var in utilities.environment_vars.keys():
+ continue
+ dest.del_define(var)
+ if len(define):
+ dest.add_define(var, define)
+
+ overwrite_makefile(dest)
+
+ for targ in dest.targets.values():
+ # expand the variables to the local scope
+ targ.expand_target(dest)
+ targ.expand_deps(dest)
+ targ.expand_rules(dest, replace=1)
+
+ for targ in src.targets.values():
+ targ.user_specified = 0
+ dest.addTarget(targ)
+
+ # we need to delete the cache as the context changes
+ dest.cached_defs = {}
+ return dest
+
+def append_all_defines(all_defines, makefile):
+ values = makefile.targets.values()
+ for targ in values:
+ if not targ.expanded:
+ targ.expand_target(makefile)
+
+ # expand_target moves/merges targets
+ values = makefile.targets.values()
+ for targ in values:
+ try:
+ all_defines[targ.target].append(makefile)
+ except KeyError:
+ all_defines[targ.target] = [makefile]
+
+ return all_defines
+
+def remove_makefile(all_defines, makefile):
+ values = makefile.targets.values()
+ for targ in values:
+ all_defines[targ.target].remove(makefile)
+ if not len(all_defines[targ.target]):
+ del all_defines[targ.target]
+ return all_defines
+
+class NoSuchTarget(Exception):
+ def __init__(self, targ, orig = "", filename = ""):
+ Exception.__init__(self)
+ self.target = targ
+ self.orig = orig
+ self.filename = filename
+
+class MakefileChanged(Exception):
+ def __init__(self, filename):
+ Exception.__init__(self)
+ self.filename = filename
+
+class OldConfigureIn(Exception):
+ def __init__(self, filename):
+ Exception.__init__(self)
+ self.filename = filename
+
+def read_subdirs(makefile, all_defines, src_prefix, prefix, ignore_makefile_updates=False):
+
+ makefile.check()
+ makefile.collect()
+ makefile.read_deps()
+
+ all_defines = append_all_defines(all_defines, makefile)
+
+ makefile_filename = makefile.expand(makefile.build + "Makefile")
+ if makefile.targets.has_key(makefile_filename):
+ try:
+ if finish(all_defines, makefile_filename) and not ignore_makefile_updates:
+ raise MakefileChanged(makefile_filename)
+ except NoSuchTarget, t:
+ print "No such target", t.target
+ if not ignore_most:
+ utilities.sys_exit_code = 1
+ return all_defines
+
+ if not makefile.is_defined("SUBDIRS"):
+ return all_defines
+
+ subdirs = []
+
+ for subdir in makefile.definition_rec("SUBDIRS"):
+ subdir = makefile.expand(subdir)
+ if subdir == '.':
+ continue
+ subdirs.extend(string.split(subdir))
+
+ for subdir in subdirs:
+
+ nprefix = prefix + subdir + "/"
+ nsrc_prefix = src_prefix + subdir + "/"
+ submakefile = merged_makefiles(nprefix, nsrc_prefix)
+ if not submakefile:
+ continue
+ submakefile.del_define("top_srcdir")
+ submakefile.del_define("top_builddir")
+ try:
+ all_defines = read_subdirs(submakefile, all_defines, nsrc_prefix, nprefix)
+ except MakefileChanged, m:
+ all_defines = remove_makefile(all_defines, submakefile)
+ submakefile = merged_makefiles(nprefix, nsrc_prefix)
+ submakefile.del_define("top_srcdir")
+ submakefile.del_define("top_builddir")
+ all_defines = read_subdirs(submakefile, all_defines, nsrc_prefix,
+ nprefix, ignore_makefile_updates=True)
+ return all_defines
+
+stat_cache = {}
+
+def call(obj, makefile, fork):
+ if verbose:
+ print "updating", obj.target
+
+ if fork:
+ pid = os.fork()
+ if pid:
+ return pid
+
+ try:
+ ret = obj.call_command(makefile)
+ except KeyboardInterrupt, ki:
+ if not fork:
+ raise ki
+ os._exit(130)
+
+ if ret:
+ sys.stderr.write("Error creating %s. Exit status %d.\n" % (obj.target, ret))
+ if not obj.is_phony:
+ try:
+ os.unlink(obj.target)
+ except OSError:
+ pass
+ if amfile.ignore_errors:
+ utilities.sys_exit_code = ret
+ ret = 0
+ if fork:
+ os._exit(ret)
+ else:
+ return ret
+
+sourcehash = None
+def getsourcemtime(file, real):
+ if sourcehash is None:
+ return real
+ for ending in ['.h','.cpp','.cc','.c']:
+ if file.endswith(ending):
+ return sourcehash.mtime(file,real)
+ return real
+
+def getrealmtime(file, force=0):
+ if not force and stat_cache.has_key(file):
+ m = stat_cache[file]
+ return m
+
+ try:
+ m = os.stat(file)[stat.ST_MTIME]
+ except OSError, e:
+ m = 0
+
+ stat_cache[file] = m
+ return m
+
+def getmtime(file, force = 0):
+ return getsourcemtime(file, getrealmtime(file, force))
+
+def exists(file):
+ if stat_cache.has_key(file):
+ m = stat_cache[file]
+ return m != 0
+
+ return getmtime(file) != 0
+
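[Editor's aside: `getrealmtime` above memoizes `os.stat` results in `stat_cache`, with an mtime of 0 doubling as "file does not exist" (which is what `exists()` exploits). A minimal, self-contained sketch of the same memoization pattern in modern Python — illustrative only; the name `cached_mtime` is not part of this commit:]

```python
import os
import stat

_stat_cache = {}  # path -> mtime (0 means the file does not exist)

def cached_mtime(path, force=False):
    """Return a file's mtime, memoized; pass force=True after the file
    may have been rewritten, mirroring getrealmtime(file, force=1)."""
    if not force and path in _stat_cache:
        return _stat_cache[path]
    try:
        m = os.stat(path)[stat.ST_MTIME]
    except OSError:
        m = 0  # missing file: 0 is the sentinel, as in unsermake
    _stat_cache[path] = m
    return m
```

The cache trades correctness for speed: after a command rewrites a target, the caller must pass `force=True` (as `finish_internal` does with `getmtime(obj.target, force=1)`) or it will see the stale value.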
+def add_subst_var(var, define):
+ if not utilities.environment_vars.has_key(var):
+ utilities.subst_vars[var] = define
+
+def export_var(var):
+ if utilities.environment_vars.has_key(var):
+ os.environ[var] = utilities.environment_vars[var]
+ elif utilities.subst_vars.has_key(var):
+ os.environ[var] = utilities.subst_vars[var]
+ else:
+ print 'Variable %s does not exist.' % var
+
+def two_spaces_per(depth):
+ ret = ""
+ for i in range(0, depth):
+ ret += " "
+ return ret
+
+def fix_target_depth(targ, depth):
+ if targ.depth > depth:
+ return
+ targ.depth = depth + 1
+ for dep in targ.dep_objs:
+ if type(dep) != types.StringType:
+ fix_target_depth(dep, targ.depth)
+
+class PrioQueue:
+ def __init__(self):
+ self.data = {}
+ self.prios = []
+ self.length = 0
+
+ def add(self, item):
+ if item.depth in self.prios:
+ self.data[item.depth].append(item)
+ else:
+ self.data[item.depth] = [item]
+ self.prios.append(item.depth)
+ self.prios.sort()
+ self.prios.reverse()
+ self.length += 1
+
+ def first(self):
+ self.length -= 1
+ depth = self.prios[0]
+ obj = self.data[depth].pop(0)
+ if not len(self.data[depth]):
+ # clean up
+ self.prios.pop(0)
+ if verbose:
+ print "first", obj.target, depth, len(self.data[depth])
+ return obj
+
+ def size(self):
+ return self.length
+
+ def first_prio(self):
+ return self.prios[0]
+
+ def fix_items(self):
+ prios = self.prios[:]
+ prios.reverse() # assuming the depths only go up
+ for depth in prios:
+ items = self.data[depth]
+ index = 0
+ while index < len(items):
+ if items[index].depth != depth:
+ self.add(items[index])
+ del items[index]
+ self.length -= 1
+ else:
+ index += 1
+ if len(items):
+ self.data[depth] = items
+ else:
+ del self.data[depth]
+ self.prios = self.data.keys()
+ self.prios.sort()
+ self.prios.reverse()
+
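[Editor's aside: the `PrioQueue` class above keeps per-depth buckets plus a reverse-sorted priority list so the deepest pending targets are handed out first. The same ordering can be sketched with the standard `heapq` module — a modern, illustrative equivalent; `DepthQueue` is a made-up name, not part of this commit:]

```python
import heapq

class DepthQueue:
    """Max-priority queue keyed on depth (deepest first), mirroring
    what unsermake's PrioQueue does with sorted per-depth buckets."""
    def __init__(self):
        self._heap = []  # entries: (-depth, seq, item); negate depth for max-first
        self._seq = 0    # tie-breaker keeps insertion order stable within a depth

    def add(self, item, depth):
        heapq.heappush(self._heap, (-depth, self._seq, item))
        self._seq += 1

    def first(self):
        return heapq.heappop(self._heap)[2]

    def size(self):
        return len(self._heap)

q = DepthQueue()
q.add("link", 1)
q.add("compile", 3)
q.add("moc", 2)
# deepest targets (the leaves of the dependency tree) come out first
order = [q.first() for _ in range(q.size())]
```

Handing out the deepest targets first is what lets the scheduler start leaf compilations before their parents become runnable.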
+def find_todo(targ, all_defines, todo_list, com_todo, non_todo, depth):
+
+ if verbose:
+ print two_spaces_per(depth), "find_todo", targ
+
+ if todo_list.has_key(targ):
+ return todo_list[targ]
+
+ try:
+ makefiles = all_defines[targ]
+ except KeyError:
+ # if the target is not defined, then we take the string
+ # as target and are done
+ todo_list[targ] = targ
+ return targ
+
+ rule=0
+ first_tar = None
+
+ # first we collect the target in all makefiles it's in
+ for makefile in makefiles:
+ tar = makefile.targets[targ]
+ if not first_tar:
+ tar.makefile = makefile
+ first_tar = tar
+ if not first_tar.expanded:
+ first_tar.expand_deps(makefile)
+ continue
+ if tar.user_specified:
+ # BTW: already expanded deps+rules (esp. the $@)
+ del makefile.targets[targ]
+ tar.target = targ + "_" + makefile.canon_subdir
+ makefile.addTarget(tar)
+ first_tar.deps.append(tar.target)
+ assert(not all_defines.has_key(tar.target))
+ all_defines[tar.target] = [makefile]
+ continue
+ if not tar.expanded:
+ tar.expand_deps(makefile)
+ if tar.has_rules():
+ if first_tar.has_rules():
+ continue # ignoring
+ for dep in first_tar.deps:
+ # TODO: profile
+ if not dep in tar.deps:
+ tar.deps.append(dep)
+ tar.makefile = makefile
+ first_tar = tar
+ else:
+ for dep in tar.deps:
+ if not dep in first_tar.deps:
+ first_tar.deps.append(dep)
+
+ all_defines[targ] = [first_tar.makefile]
+
+ # now some caching
+ todo_list[targ] = first_tar
+
+ # it might have been changed by earlier finish runs. Then it's
+ # up to date and we can return the file name
+ if first_tar.changed:
+ todo_list[first_tar.target] = first_tar.target
+ return first_tar.target
+
+ if first_tar.is_phony:
+ first_tar.mtime = 0
+ else:
+ first_tar.mtime = getmtime(first_tar.target)
+
+ deps = 0
+ first_tar.dep_objs = []
+ first_tar.needs = []
+ first_tar.deps_count = 0
+ first_tar.is_todo = False
+ first_tar.depth = depth
+
+ for dep in first_tar.deps:
+ if todo_list.has_key(dep):
+ dep_obj = todo_list[dep]
+ else:
+ dep_obj = find_todo( dep, all_defines,
+ todo_list, com_todo,
+ non_todo, depth + 1)
+
+ if type(dep_obj) == types.StringType:
+ dep_mtime = getmtime(dep_obj)
+ dep_str = dep_obj + "(str)"
+
+ if not dep_mtime:
+ if dep_obj in amfile.maybe_missing:
+ continue
+ if not ignore_most:
+ raise NoSuchTarget(dep_obj, first_tar.target, first_tar.makefile.filename)
+ else:
+ utilities.sys_exit_code = 1
+ continue
+ if first_tar.mtime >= dep_mtime:
+ continue
+ else:
+ dep_mtime = dep_obj.mtime
+ dep_str = dep_obj.target
+ if not dep_obj.mtime: # propagate
+ first_tar.mtime = 0
+
+ if verbose:
+ print two_spaces_per(depth), "first_tar", first_tar.target, first_tar.mtime, dep_str, dep_mtime
+
+ deps += 1
+ if type(dep_obj) != types.StringType:
+ dep_obj.needs.append(first_tar)
+ if first_tar.depth >= dep_obj.depth:
+ fix_target_depth(dep_obj, first_tar.depth)
+ first_tar.deps_count += 1
+ first_tar.dep_objs.append(dep_obj)
+
+ # if the target is up to date, just return the path
+ if first_tar.mtime and not deps:
+ # if we ignore this target, don't take the full
+ # effort in future calls
+ first_tar.changed = True
+ todo_list[targ] = targ
+ return targ
+
+ if not first_tar.is_phony:
+ target.num_todo += 1
+
+ if not first_tar.deps_count:
+ if first_tar.compile_target:
+ com_todo.add(first_tar)
+ else:
+ non_todo.add(first_tar)
+
+ return first_tar
+
+def print_dfa(targ, depth = 0):
+ if type(targ) == types.StringType:
+ print two_spaces_per(depth) + targ
+ return
+
+ try:
+ if target.printed:
+ print targ.depth, depth, two_spaces_per(depth) + "[" + targ.target + "]"
+ return
+ except:
+ pass
+ print targ.depth, depth, two_spaces_per(depth) + targ.target, targ.mtime
+ targ.printed = 1
+ for dep in targ.dep_objs:
+ print_dfa(dep, depth + 1)
+
+def finish(all_defines, targ):
+ if not all_defines.has_key(targ):
+ raise NoSuchTarget(targ)
+
+ children = {}
+ try:
+ return finish_internal(all_defines, targ, children)
+ except KeyboardInterrupt, ki:
+ for obj in children.values():
+ # Note: we use an ANSI escape to reset the color, in case we were halfway through displaying a color
+ sys.stderr.write("\033[0minterrupted call, removing %s\n" % obj.target)
+ try:
+ os.unlink(obj.target)
+ except OSError:
+ pass
+ raise ki
+
+def finish_internal(all_defines, targ, children):
+ com_todo = PrioQueue()
+ non_todo = PrioQueue()
+
+ target.num_todo = 0
+ target_obj = find_todo(targ, all_defines, {}, com_todo, non_todo, 0)
+ target.max_todo = target.num_todo
+ com_todo.fix_items()
+ non_todo.fix_items()
+ done_something = False
+
+ if type(target_obj) == types.StringType:
+ return done_something
+
+ if verbose:
+ print_dfa(target_obj)
+
+ compile_jobs = 0
+
+ while non_todo.size() or com_todo.size() or len(children):
+
+ while non_todo.size() or com_todo.size():
+
+ if verbose:
+ print "todo", non_todo.size(), com_todo.size(), "%d(%d)" % (compile_jobs, max_compile_jobs), "%d(%d)" % (len(children), max_children)
+
+ if not max_compile_jobs:
+ if com_todo.size():
+ if non_todo.size():
+ if com_todo.first_prio() > non_todo.first_prio():
+ obj = com_todo.first()
+ else:
+ obj = non_todo.first()
+ else:
+ obj = com_todo.first()
+ elif non_todo.size():
+ obj = non_todo.first()
+ else:
+ break
+ else:
+ if com_todo.size() and compile_jobs < max_compile_jobs:
+ obj = com_todo.first()
+ elif non_todo.size() and len(children) - compile_jobs < max_children - max_compile_jobs:
+ obj = non_todo.first()
+ elif com_todo.size() and len(children) < max_children:
+ obj = com_todo.first()
+ else:
+ break
+
+ assert(not obj.changed)
+ assert(obj.deps_count == 0)
+
+ if obj.mtime == -2:
+ for targ in obj.needs:
+ targ.mtime = -2
+ obj.changed = True
+ if not target.is_phony:
+ target.num_todo -= 1
+ continue
+
+ if verbose:
+ print "considering", obj.target
+
+ assert(len(children) <= max_children)
+
+ # this is a bit of tri-state logic here: if the obj is in the
+ # tree, then it needs to be updated. But if we go through
+ # the list and find a younger one, we need to update. The
+ # older ones can be ignored
+ all_younger = True
+ one_younger = False
+
+ if not obj.mtime and not len(obj.dep_objs) and not obj.is_phony:
+ # special exception: if it didn't exist and has no dependencies,
+ # we assume it had to be generated. So if it now exists, we can
+ # ignore it (I hope)
+ if verbose:
+ print "Assuming", obj.target, "needs to be generated"
+ all_younger = False
+
+ obj.mtime = getmtime(obj.target, force=1)
+ for dep in obj.dep_objs:
+ if type(dep) == types.StringType:
+ dep_mtime = getmtime(dep)
+ else:
+ dep_mtime = dep.mtime
+ assert(dep_mtime != -1)
+
+ if not dep_mtime:
+ obj.mtime = 0
+
+ if obj.mtime < dep_mtime:
+ one_younger = True
+ else:
+ all_younger = False
+
+ if all_younger:
+ one_younger = True
+
+ obj.changed = True
+ if not obj.is_phony:
+ target.num_todo -= 1
+
+ if obj.mtime and not one_younger or not obj.has_rules():
+ if verbose and not one_younger:
+ print "no need to update", obj.target
+ for targ in obj.needs:
+ assert(targ.deps_count > 0)
+ targ.deps_count -= 1
+ if targ.deps_count == 0 and not targ.is_todo:
+ if targ.compile_target:
+ com_todo.add(targ)
+ else:
+ non_todo.add(targ)
+ continue
+
+ if target.print_progress and target.max_todo > 0:
+ progress = 100 - int(round(100 * target.num_todo / target.max_todo))
+ if progress == 100:
+ # don't break the layout :)
+ progress = 99
+ sys.stdout.write("%s%02d%%%s " % (utilities.progress_color, progress, utilities.normal))
+ sys.stdout.flush()
+ elif target.precise_print_progress and target.max_todo > 0:
+ progress = 100.0 - (100.0 * float(target.num_todo) / float(target.max_todo))
+ if progress >= 99.99:
+ # don't break the layout :)
+ progress = 99.99
+ if progress < 10.0:
+ # Add a space to pad (%02 won't work for floats)
+ sys.stdout.write("%s %.2f%%%s " % (utilities.progress_color, progress, utilities.normal))
+ else:
+ sys.stdout.write("%s%.2f%%%s " % (utilities.progress_color, progress, utilities.normal))
+ sys.stdout.flush()
+
+ if max_children > 1:
+ pid = call(obj, obj.makefile, fork=1)
+ if obj.compile_target:
+ compile_jobs += 1
+
+ children[pid] = obj
+ obj.mtime = -1 # mark as in process
+ if len(children) >= max_children:
+ break
+ else:
+ try:
+ ret = call(obj, obj.makefile, fork=0)
+ except KeyboardInterrupt, ki:
+ sys.stderr.write("interrupted call, removing %s\n" % obj.target)
+ try:
+ os.unlink(obj.target)
+ except OSError:
+ pass
+ raise ki
+ if ret:
+ if ignore_most:
+ obj.mtime = -2
+ utilities.sys_exit_code = ret
+ else:
+ sys.exit(ret)
+ children[0] = obj
+ break
+
+ if verbose:
+ print "finished", len(children)
+
+ if len(children):
+ done_something = True
+ exit_code = 0
+
+ if max_children > 1:
+ (pid, exit_code) = os.wait()
+
+ if exit_code:
+ if not ignore_most:
+ if os.WIFEXITED(exit_code):
+ sys.exit(os.WEXITSTATUS(exit_code))
+ sys.exit(exit_code)
+ else:
+ utilities.sys_exit_code = exit_code
+ else:
+ pid = 0
+
+ obj = children[pid]
+ del children[pid]
+
+ if max_children > 1 and obj.compile_target:
+ compile_jobs -= 1
+
+ if exit_code: # ignore_most
+ obj.mtime = -2
+
+ # the above is not the only way it can get -2 (-j1)
+ if obj.mtime != -2:
+ if obj.is_phony:
+ new_mtime = 0
+ else:
+ new_mtime = getmtime(obj.target, force=1)
+
+ obj.mtime = new_mtime
+
+ for targ in obj.needs:
+ assert(targ.deps_count > 0)
+ targ.deps_count -= 1
+ if targ.deps_count == 0 and not targ.is_todo:
+ if targ.compile_target:
+ com_todo.add(targ)
+ else:
+ non_todo.add(targ)
+
+ else: # .mtime == -2
+ # every target above in the tree is marked as failed too
+ for targ in obj.needs:
+ targ.mtime = -2
+
+ return done_something
+
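[Editor's aside: `finish_internal` above schedules work by giving each target a `deps_count` of unmet prerequisites and releasing a target into a ready queue once the count reaches zero — essentially Kahn's topological sort with job slots layered on top. The core ordering idea, stripped of forking and mtime checks, can be sketched as follows; `build_order` is a hypothetical helper, not the commit's API:]

```python
from collections import deque

def build_order(deps):
    """deps: target -> list of prerequisites. Returns an order in which
    every prerequisite comes before its dependents (Kahn's algorithm)."""
    pending = {t: len(d) for t, d in deps.items()}  # unmet-prerequisite counts
    needs = {}  # prerequisite -> targets waiting on it (the 'needs' back-edges)
    for t, ds in deps.items():
        for d in ds:
            needs.setdefault(d, []).append(t)
    ready = deque(t for t, n in pending.items() if n == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for waiter in needs.get(t, []):
            pending[waiter] -= 1  # mirrors targ.deps_count -= 1 above
            if pending[waiter] == 0:
                ready.append(waiter)
    return order
```

The real scheduler additionally caps concurrency (`max_children`, `max_compile_jobs`) and propagates failure by marking everything in a failed target's `needs` chain with `mtime = -2`.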
+def setup_top_makefile( top_srcdir, subdir ):
+ top_makefile = merged_makefiles(".", top_srcdir + "/" + subdir)
+
+ if top_makefile.subdir == '.':
+ for key in utilities.subst_vars.keys():
+ top_makefile.del_define(key)
+
+ bins = []
+ bins.extend(top_makefile.binaries.values())
+ if utilities.top_builddir == '.':
+ top_makefile.insertTarget('$(top_srcdir)/configure', [utilities.configure_in,
+ '$(top_srcdir)/aclocal.m4'],
+ 'cd $(top_srcdir) && $(AUTOCONF)')
+ if os.path.exists(top_srcdir + "/acinclude.m4"):
+ # TODO: check if it generated and trace dependencies
+ top_makefile.insertTarget('$(top_srcdir)/aclocal.m4', [utilities.configure_in,
+ '$(top_srcdir)/acinclude.m4'],
+ 'cd $(srcdir) && $(ACLOCAL) $(ACLOCAL_AMFLAGS)')
+ top_makefile.insertTarget('$(top_builddir)/config.status', ['$(top_srcdir)/configure'],
+ '$(SHELL) ./config.status --recheck')
+
+ top_makefile.insertTarget("force-reedit", [],
+ ["cd $(top_srcdir) && %s -c %s/Makefile" %
+ (utilities.our_path, top_makefile.subdir)], phony=1)
+ top_makefile.insertTarget("force-install", [],
+ ['@$(MAKE) install UNSERMAKE_FORCE_INSTALL=FORCE'], phony=1)
+ top_makefile.insertTarget("FORCE", [], phony=1)
+
+ return top_makefile
+
+def setup_top_makefile_wrapper( top_makefile, top_srcdir, subdir):
+
+ try:
+ all_defines = read_subdirs(top_makefile, {},
+ os.path.normpath(top_srcdir + "/" + subdir ) + "/",
+ os.path.abspath(".") + "/")
+ except MakefileChanged, m:
+ top_makefile = setup_top_makefile( top_srcdir, subdir )
+ # it's quite possible that the second call tries something again,
+ # but we ignore it the second time
+ all_defines = read_subdirs(top_makefile, {},
+ os.path.normpath(top_srcdir + "/" + subdir ) + "/",
+ os.path.abspath(".") + "/",
+ ignore_makefile_updates=True)
+
+ return top_makefile, all_defines
+
+def usage():
+ print "Usage: " + thisProg + " [OPTION] ... [dir]..."
+ print ""
+ print "Some of the switches correspond to those in \"make\" or \"automake\". See the manpages for details."
+ print ""
+ print "Available options:"
+ print ""
+ print " --add-missing Add missing files to package (not implemented yet)."
+ print " -c, --create Create Makefiles (automake mode)."
+ print " -C, --directory=dir Change to directory dir before doing anything."
+ print " --color Add color to the output (default)."
+ print " --compile-jobs=N Limit compile jobs to N."
+ print " -e Environment variables override makefiles."
+ print " -i, --ignore-errors Ignore errors from commands."
+ print " -j, --jobs=N Allow N parallel jobs."
+ print " -k, --keep-going Keep going when some targets can't be made."
+ print " -l N Don't start multiple jobs unless load is below N (not implemented yet)."
+ print " --missing-deps Find missing dependencies."
+ print " --no-color Do not colorize the output."
+ print " --no-real-compare Use only mtime, not file contents, to determine if a file has changed (default)."
+ print " --real-compare Use not only mtime but also file contents to determine if a file has changed."
+ print " --random=N Make random targets."
+ print " -n, --just-print Only print out the commands to call."
+ print " -v, --verbose Show verbose output."
+ print " --version Show version information and copyright notice."
+ print " -p, --print-progress Show an estimated progress percentage before each output."
+ sys.exit(0)
+
+def unsermake_main():
+ starttime = time.time()
+
+ global thisProg, verbose
+ thisProg = "unsermake"
+ recurse_flags = []
+ recurse_args = []
+ after_dminus = False
+ try:
+ makeflags = os.environ['MAKEFLAGS']
+ if len(makeflags):
+ inquote = False
+ last_arg = ''
+ for arg in string.split(makeflags, ' '):
+ last_arg += arg
+
+ if string.count(arg, '\'') % 2:
+ inquote = not inquote
+
+ if not inquote:
+ if len(last_arg) > 1 and last_arg[0] == '\'' and last_arg[-1] == '\'':
+ last_arg = last_arg[1:-1]
+ # special GNU make hack
+ if not len(recurse_flags) and len(last_arg) and last_arg[0] != '-':
+ if not string.count(last_arg, '='):
+ last_arg = "-" + last_arg
+ if last_arg == '--':
+ after_dminus = True
+ elif after_dminus:
+ recurse_args.append(last_arg)
+ else:
+ recurse_flags.append(last_arg)
+ last_arg = ''
+
+ except KeyError:
+ pass
+
+ utilities.our_path = "unsermake"
+ if len(recurse_args):
+ recurse_args = ["--"] + recurse_args
+ if os.environ.has_key('UNSERMAKE_OPTS'):
+ recurse_flags = recurse_flags + string.split(os.environ['UNSERMAKE_OPTS'])
+ recurse_flags = recurse_flags + sys.argv[1:] + recurse_args
+
+ moduledirs = []
+
+ added_delim = False
+
+ commandline_vars = {}
+ eqre = re.compile('^([^=]*)=(.*)$')
+
+ optlist = []
+ given_files = []
+
+ while len(recurse_flags):
+ try:
+ t_optlist, given_files = getopt.getopt(recurse_flags, 'cvhr:f:C:kij:el:nps', [
+ 'create', 'add-missing', 'version', 'verbose', 'help', 'random=', 'missing-deps',
+ 'compile-jobs=', 'directory=', 'ignore-errors', 'keep-going', 'jobs=', 'just-print',
+ 'print-progress', 'real-compare', 'no-real-compare', 'modules=', 'color', 'no-color'])
+ optlist.extend(t_optlist)
+ recurse_flags = []
+ if len(given_files):
+ match = eqre.match(given_files[0])
+ if match: # variable assignment on command line
+ commandline_vars[match.group(1)] = match.group(2)
+ recurse_flags.extend(given_files[1:])
+
+ except getopt.GetoptError:
+ print "Wrong parameters."
+ usage()
+ sys.exit(2)
+
+ targets = []
+ for arg in given_files:
+ match = eqre.match(arg)
+ if match: # variable assignment on command line
+ commandline_vars[match.group(1)] = match.group(2)
+ else:
+ targets.append(arg)
+
+ # print optlist, targets, commandline_vars.keys()
+
+ # default values
+ random_targets = 0
+ create_mode = False
+ print_progress = False
+ precise_print_progress = False
+ mtime_only = True
+ color_mode = sys and sys.stderr.isatty() and sys.stdout.isatty() and "TERM" in os.environ and not os.environ["TERM"].upper() in ["DUMB"] and not "EMACS" in os.environ
+ file_to_read = "Makefile"
+ makeflags = ''
+
+ global max_children
+
+ for option, param in optlist:
+ if option == '--version':
+ print thisProg + " 0.4"
+ print 'Written by Stephan Kulow using concepts of Michael Matz'
+ print 'and Simon Hausmann. In parts based on concepts of "am_edit"'
+ print 'and "GNU Automake".'
+ print ''
+ print 'This is free software; see the source for copying conditions. There is NO'
+ print 'warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.'
+ sys.exit(0)
+ elif option == '--verbose' or option == '-v':
+ verbose = True
+ elif option == '--help' or option == '-h':
+ usage()
+ elif option == '--random' or option == '-r':
+ utilities.default_force = 0
+ random_targets = string.atoi(param)
+ elif option == '--create' or option == '-c':
+ create_mode = True
+ elif option == '--color':
+ color_mode = True
+ elif option == '--no-color':
+ color_mode = False
+ elif option == '-f':
+ file_to_read = param
+ elif option == '--directory' or option == '-C':
+ os.chdir(param)
+ elif option == '--missing-deps':
+ amfile.find_missing_deps = True
+ elif option == '--ignore-errors' or option == '-i':
+ amfile.ignore_errors = True
+ makeflags += "-i "
+ elif option == '--keep-going' or option == '-k':
+ global ignore_most
+ ignore_most = True
+ makeflags += "-k "
+ elif option == '--jobs' or option == '-j':
+ try:
+ max_children = string.atoi(param)
+ except:
+ utilities.print_error("you have to pass an integer after -j\n")
+ sys.exit(1)
+ makeflags += "-j %d " % max_children
+ elif option == '--just-print' or option == '-n':
+ target.just_print = True
+ makeflags += "-n "
+ elif option == '-e':
+ for var in os.environ.keys():
+ utilities.environment_vars[var] = os.environ[var]
+ makeflags += "-e "
+ elif option == '-p' or option == '--print-progress':
+ if print_progress or precise_print_progress:
+ # This is not the first -p option, so set the "precise" progress
+ precise_print_progress = True
+ print_progress = False
+ else:
+ precise_print_progress = False
+ print_progress = True
+ # Set makeflags only at the first -p
+ makeflags += "-p "
+ elif option == '-l':
+ print 'the option %s is ignored (for now)' % option
+ elif option == '-s':
+ continue # see above - but silently ;/
+ elif option == '--compile-jobs':
+ global max_compile_jobs
+ try:
+ max_compile_jobs = string.atoi(param)
+ except:
+ utilities.print_error("you have to pass an integer after --compile-jobs\n")
+ sys.exit(1)
+ makeflags += "--compile-jobs %s " % max_compile_jobs
+ elif option == '--no-real-compare':
+ mtime_only = True
+ makeflags += "--no-real-compare "
+ elif option == '--real-compare':
+ mtime_only = False
+ makeflags += "--real-compare "
+ elif option == '--modules':
+ moduledirs.append(param)
+
+ max_children = max_children + max_compile_jobs
+
+ # Register list of file handlers, as well as setting our_path to the
+ # absolute path to unsermake for instances like sudo with --secure-path.
+ # Assumes that one of the module dirs has the unsermake script.
+ handlers = []
+ for sourcedir in moduledirs:
+ files = os.listdir(sourcedir)
+ for l in files:
+ if l.endswith('.um'):
+ handlers.append(os.path.join(sourcedir, l))
+ if l == 'unsermake':
+ utilities.our_path = sourcedir + '/' + l
+
+ handlers.sort()
+
+ for file in handlers:
+ execfile(file)
+ if verbose:
+ print "registered handlers:", handlerbase.handlers
+
+ if not color_mode:
+ utilities.clearAllColors()
+
+ if create_mode:
+ utilities.configure_in = "$(top_srcdir)/configure.in"
+ if os.path.exists("configure.ac"):
+ utilities.configure_in = "$(top_srcdir)/configure.ac"
+ else:
+ if not os.path.exists("configure.in"):
+ print 'There must be a configure.in or configure.ac in the current directory for --create.'
+ sys.exit(1)
+
+ utilities._topsrcdir = os.path.abspath(os.curdir) + "/"
+ create_makefiles(targets)
+ return
+
+ if not os.path.exists(file_to_read):
+ print "no file", file_to_read, "present"
+ sys.exit(1)
+
+ added_delim = False
+
+ for var, value in commandline_vars.items():
+ utilities.environment_vars[var] = value
+ os.environ[var] = value
+ if not added_delim:
+ makeflags += "-- "
+ added_delim = True
+ makeflags += "'%s=%s' " % (var,value)
+
+ if makeflags and makeflags[-1] == ' ':
+ makeflags = makeflags[:-1]
+
+ utilities.subst_vars = {}
+ add_subst_var('MAKE', utilities.our_path)
+ export_var('MAKE')
+ add_subst_var('MAKEFLAGS', makeflags)
+ export_var('MAKEFLAGS')
+ add_subst_var('SHELL', '/bin/sh')
+ export_var('SHELL')
+
+ # top_makefile is just of temporary value
+ top_makefile = amfile.AMFile(file_to_read, ignore_comments=False)
+ if not top_makefile.is_defined("top_srcdir"): # TODO: put some UNSERMAKE_was_here in there
+ overwrite_makefile(top_makefile)
+ all_defines = append_all_defines({}, top_makefile)
+ try:
+ if not len(targets):
+ targets = [top_makefile.first_target]
+ for targ in targets:
+ finish(all_defines, targ)
+ except NoSuchTarget, t:
+ print "no rule to create target: %s" % t.target
+ sys.exit(1)
+ sys.exit(0) # if not exited before
+
+ if not len(targets):
+ targets = ["all"]
+
+ top_srcdir = top_makefile.value_of("top_srcdir")
+ utilities._topsrcdir = os.path.abspath(top_srcdir)
+ if os.path.exists(utilities._topsrcdir + "/configure.ac"):
+ utilities.configure_in = "$(top_srcdir)/configure.ac"
+ else:
+ utilities.configure_in = "$(top_srcdir)/configure.in"
+ if not utilities._topsrcdir.endswith("/"):
+ utilities._topsrcdir = utilities._topsrcdir + "/"
+ utilities.top_builddir = top_makefile.value_of("top_builddir")
+ if utilities.top_builddir and utilities.top_builddir != '.':
+ top_makefile = amfile.AMFile(utilities.top_builddir + "/Makefile", ignore_comments=False)
+ else:
+ utilities.top_builddir = '.'
+
+ builddir = os.path.abspath(os.curdir)
+ utilities.top_builddir_abs = os.path.abspath(os.curdir + "/" + utilities.top_builddir)
+ subdir = builddir[len(utilities.top_builddir_abs)+1:]
+
+ for cond in top_makefile.defines():
+ if cond in ['subdir', 'top_builddir', 'srcdir', 'top_srcdir']:
+ continue
+ if cond.endswith("_FALSE"):
+ continue
+ if not cond.endswith("_TRUE"):
+ if not utilities.environment_vars.has_key(cond):
+ utilities.subst_vars[cond]=top_makefile.value_of(cond)
+ top_makefile.del_define(cond)
+ continue
+ cond = cond[:-5]
+ true = top_makefile.value_of(cond + '_TRUE')
+ top_makefile.del_define(cond + '_TRUE')
+ false = top_makefile.value_of(cond + '_FALSE')
+ top_makefile.del_define(cond + '_FALSE')
+ if true == '#' and not len(false):
+ utilities.false_conds.append(cond)
+ continue
+ if false == '#' and not len(true):
+ utilities.true_conds.append(cond)
+ continue
+ print 'unknown cond', cond
+
+ add_subst_var('top_builddir', utilities.top_builddir)
+ # we take the absolute form (without trailing /) so we avoid confusion with
+ # relative build dirs
+ # TODO: if someone is bored enough, he could put in here a @top_srcdir@ that
+ # is then way later expanded back to the relative version. This only affects
+ # user_specified rules as we got a race between variable expansion and build
+ # dir expansion (amfile.replace_builddir)
+ add_subst_var('top_srcdir', utilities._topsrcdir[:-1])
+ add_subst_var('INSTALL_HEADER', '$(INSTALL_DATA)')
+ add_subst_var('CXXLD', '$(CXX)')
+ add_subst_var('CLD', '$(CC)')
+ add_subst_var('AR', 'ar')
+ add_subst_var('UNSERMAKE_FORCE_INSTALL', '')
+ add_subst_var('INSTALL_STRIP_FLAG', '')
+
+ # these variables are expected to be used from the environment
+ # and not from configure
+ for var in ["DESTDIR", "INSTALL_STRIP_FLAG", "INSTALL_PROGRAM_ENV",
+ "transform", "AM_MAKEFLAGS", "VERBOSE"]:
+ add_subst_var(var, '')
+
+ add_subst_var('V_ECHO', 'test -n "$(VERBOSE)" || echo')
+ add_subst_var('DEFAULT_INCLUDES', '-I. -I$(srcdir) -I$(top_builddir)')
+ add_subst_var('V_EXEC', "if test -n \"$(VERBOSE)\"; then "\
+ "echo \"$$@\"; \"$$@\"; "\
+ " else "\
+ " rslt=$$(\"$$@\" 2>&1); stat=$$?; "\
+ " if test $$stat -ne 0; then "\
+ " echo \"$$@\"; echo \"$$rslt\" >&2; "\
+ " else "\
+ " test -n \"$$rslt\" && echo \"$$rslt\" >&2; "\
+ " fi; "\
+ " exit $$stat;"\
+ " fi")
+ add_subst_var('V_COMPILE', "if test -n \"$(VERBOSE)\"; then echo \"$$@\"; \"$$@\"; "\
+ "else rslt=$$(\"$$@\" 2>&1); fi; stat=$$?; "\
+ " if test $$stat -ne 0; then "\
+ " if test -z \"$(VERBOSE)\"; then echo \"$$@\"; echo \"$$rslt\" >&2; fi; "\
+ " else "\
+ " test -n \"$$rslt\" && echo \"$$rslt\" >&2; "\
+ " echo \"$$targetfile : \\\\\" > \"$$depfile.tmp\"; "\
+ " deps=`sed -e 's,.*:,,' $$tmpdepfile | sed -e 's,\\\\\\\\,,g'`; "\
+ " echo 'builddir=$$(top_builddir)/$(subdir)' > $$depfile.tmp ;"\
+ " echo 'top_builddir=$(top_builddir)' >> $$depfile.tmp ;"\
+ " echo \"targetfile=$$targetfile\" >> $$depfile.tmp ;"\
+ " for dep in $$deps ; do echo $$dep >> $$depfile.tmp ; done ;"\
+ " mv $$depfile.tmp $$depfile; "\
+ " fi; "\
+ " rm -f \"$$tmpdepfile\"; exit $$stat")
+
+ # not really sure who would use these
+ package = utilities.subst_vars['PACKAGE']
+ add_subst_var('pkgdatadir', '$(datadir)/' + package)
+ add_subst_var('pkglibdir', '$(libdir)/' + package)
+ add_subst_var('pkgincludedir', '$(includedir)/' + package)
+
+ top_makefile = setup_top_makefile( top_srcdir, subdir )
+
+ try:
+ (top_makefile, all_defines) = setup_top_makefile_wrapper( top_makefile, top_srcdir, subdir)
+ except OldConfigureIn, o:
+ if top_makefile.targets.has_key(utilities._topsrcdir + "configure.in"):
+ print "removing ", utilities._topsrcdir + "configure.in"
+ os.unlink( utilities._topsrcdir + "configure.in")
+ (top_makefile, all_defines) = setup_top_makefile_wrapper( top_makefile, top_srcdir, subdir)
+ else:
+ utilities.print_error("there is no way to generate the Makefile for %s\n" %
+ o.filename)
+ sys.exit(1)
+
+ if amfile.find_missing_deps:
+ if os.path.abspath(utilities.top_builddir) == os.path.abspath(top_srcdir):
+ print "top_srcdir is the same as top_builddir. This doesn't work with --missing-deps"
+ sys.exit(1)
+ keys = amfile.found_missing_deps.keys()
+ keys.sort()
+ gwd = os.getcwd()
+ os.chdir(top_srcdir)
+ utilities.subst_vars, files, headers, auxdir = utilities.parse_autoconf()
+ os.chdir(gwd)
+
+ cheaders = []
+ for header in headers + files:
+ cheaders.append('$(top_builddir)/' + header)
+
+ old_dir = None
+ lines = ''
+
+ print "\nFound Missing Dependencies:\n"
+
+ for key in keys:
+ custom_keys = []
+ dir = os.path.dirname(key)
+ if dir.startswith("$(top_builddir)/"):
+ dir = dir[len("$(top_builddir)/"):]
+ if dir != old_dir:
+ if lines:
+ print "%s/Makefile.am:" % old_dir
+ print lines
+ lines = ''
+ old_dir = dir
+
+ for dep2 in amfile.found_missing_deps[key]:
+ if dep2 in cheaders:
+ continue
+ dep_dirs = string.split(dep2, '/')
+ targetdirs = string.split(key, '/')
+ index=0
+ while index < min(len(targetdirs), len(dep_dirs)) - 1:
+ if targetdirs[index] != dep_dirs[index]:
+ break
+ index += 1
+
+ targetdirs = targetdirs[index:]
+ dep_dirs = dep_dirs[index:]
+
+ dep2 = ''
+ for index in range(0, len(targetdirs) - 1):
+ dep2 += "../"
+ for index in range(0, len(dep_dirs) - 1):
+ dep2 += dep_dirs[index] + "/"
+ dep2 += dep_dirs[-1]
+ if not dep2 in custom_keys:
+ custom_keys.append(dep2)
+ if len(custom_keys):
+ lines += "%s: %s\n" % (os.path.basename(key),
+ string.join(custom_keys))
+ if lines:
+ print "%s/Makefile.am:" % old_dir
+ print lines
+
+ sys.exit(0)
+
+ #for r in xrange(0, random_targets):
+ # b = random.choice(bins)
+ # b.add_random()
+ # bins.remove(b)
+
+ top_makefile.final_reorder()
+
+ if utilities.sys_exit_code:
+ sys.exit(utilities.sys_exit_code)
+
+ if verbose:
+ top_makefile.filename = "Makefile.out"
+ top_makefile.print_out(1)
+
+ target.print_progress = print_progress
+ target.precise_print_progress = precise_print_progress
+
+ sourcehash_file = utilities.top_builddir+'/SourceHash'
+ if not mtime_only:
+ global sourcehash
+ sourcehash = SourceHash(sourcehash_file)
+
+ for targ in targets:
+ try:
+ finish(all_defines, targ)
+ except NoSuchTarget, t:
+ try:
+ btargets = top_makefile.replace_builddir([targ])
+ if len(btargets) == 1:
+ btarget = top_makefile.expand(btargets[0])
+ finish(all_defines, btarget)
+ except NoSuchTarget, t2:
+ filename = t2.filename
+ orig = t2.orig
+ targ = t2.target
+ if not len(t2.filename):
+ filename = t.filename
+ orig = t.orig
+ targ = t.target
+ print "%s: no rule to create target: %s(%s)" % (filename, targ, orig)
+ sys.exit(1)
+
+ if sourcehash:
+ sourcehash.save(sourcehash_file)
+
+def main():
+ if os.environ.has_key('PROFILE'):
+ import profile
+ profile.run("unsermake_main()", 'fooprof')
+
+ import pstats
+ p = pstats.Stats('fooprof')
+ p.strip_dirs()
+ p.sort_stats('cumulative').print_stats(10)
+ p.sort_stats('time').print_stats(10)
+
+ else:
+
+ # If Psyco is installed, use it.
+ # Psyco speeds up Python dramatically, see:
+ # http://psyco.sourceforge.net/
+ try:
+ import psyco
+ psyco.full()
+ except ImportError:
+ # Psyco is optional, silently accept its absence
+ pass
+
+ try:
+ unsermake_main()
+ except KeyboardInterrupt, ki:
+ sys.exit(130)
+
+ sys.exit(utilities.sys_exit_code)
+
+# vim: ts=4
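The `main()` entry point above profiles `unsermake_main()` only when the `PROFILE` environment variable is set, dumping stats to a file and then printing the top ten entries by cumulative and by internal time. A minimal, self-contained Python 3 sketch of that same profile/pstats pattern (the `work()` function is a hypothetical stand-in for `unsermake_main()`; the dump filename `fooprof` matches the one used above):

```python
import cProfile
import io
import os
import pstats

def work():
    # hypothetical stand-in for unsermake_main(): any function worth profiling
    return sum(i * i for i in range(10000))

# collect a profile and dump it to a stats file, as main() does with 'fooprof'
profiler = cProfile.Profile()
profiler.runcall(work)
profiler.dump_stats("fooprof")

# load the dump, strip directory noise, and report the hottest entries,
# first by cumulative time, then by time spent inside each function
stream = io.StringIO()
stats = pstats.Stats("fooprof", stream=stream)
stats.strip_dirs()
stats.sort_stats("cumulative").print_stats(10)
stats.sort_stats("time").print_stats(10)
report = stream.getvalue()

os.remove("fooprof")
print("work" in report)
```

Using `Profile.runcall()` instead of the string form `profile.run("unsermake_main()")` avoids re-resolving the callable by name in `__main__`, but the resulting stats file and `pstats` reporting are the same.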
diff --git a/unsermake/amfile.py b/unsermake/amfile.py
new file mode 100644
index 0000000..4302181
--- /dev/null
+++ b/unsermake/amfile.py
@@ -0,0 +1,1474 @@
+
+import utilities
+import re, os, types, string, sys
+import handlerbase, program
+from target import Target, PhonyConflict, DefValue
+
+__pychecker__ = 'no-shadowbuiltin'
+
+maybe_missing = {}
+find_missing_deps = False
+found_missing_deps = {}
+
+ignore_errors = False
+
+class AMFile:
+ def __init__(self, filename, ignore_comments = True, create_empty = False):
+
+ self.targets = {}
+ self.defs = {}
+ self.printed_defs = {}
+ self.conds = {}
+ self.dep_files = []
+ self.cached_defs = {}
+
+ if utilities._topsrcdir and filename.startswith(utilities._topsrcdir):
+ self.filename = filename[ len(utilities._topsrcdir) : ]
+ elif utilities.top_builddir_abs and filename.startswith(utilities.top_builddir_abs):
+ self.filename = filename[ len(utilities.top_builddir_abs) + 1: ]
+ else:
+ self.filename = filename
+
+ self.configure_files = []
+
+ # this array collects defines that were changed after initial parsing
+ self.overwrites = []
+
+ self.commentre = re.compile('\s*#.*')
+ self.to_handle_targets = []
+ self.am_includes = []
+ self.binaries = {}
+ self.mansectre = re.compile('.*\.([0-9nl])[a-z]*')
+ self.mansections = ['0','1','2','3','4','5','6','7','8','9','n','l']
+ self.first_target = None
+
+ try:
+ self.parse(filename, ignore_comments, create_empty)
+ except IOError, e:
+ print filename, "caused error:", e
+ sys.exit(1)
+
+ self.overwrites = []
+ self.isempty = create_empty
+
+ def __repr__(self):
+ return self.subdir
+
+ def addTarget(self, target):
+ if self.targets.has_key(target.target):
+ # throws if both have rules
+ try:
+ self.targets[target.target].merge(target)
+ except PhonyConflict:
+ utilities.print_error("%s tries to overwrite the PHONY status of %s\n" % (self.filename, target))
+ else:
+ self.targets[target.target] = target
+
+ def insertTarget(self, target, deps, rules = [], user_specified=0,
+ phony=0, compile_target=0):
+ if not len(rules) and self.targets.has_key(target):
+ target = self.targets[target]
+ if type(deps) == types.StringType:
+ deps = string.split(deps)
+ for dep in deps:
+ target.deps.append(DefValue(dep))
+ else:
+ for dep in deps:
+ if type(dep) == types.StringType:
+ # we can not assume it's an expanded value
+ target.deps.append(DefValue(dep))
+ else:
+ target.deps.append(dep)
+
+ return
+
+ # throws error if a line doesn't start with <tab>
+ t = Target(target, deps, rules, user_specified, phony)
+ t.compile_target = compile_target
+ self.addTarget(t)
+
+ def print_out_definition(self, output, definition):
+ if self.printed_defs.has_key(definition):
+ return
+ self.set_def_printed(definition)
+
+ if not self.is_defined(definition):
+ # print '%s: %s is not defined but used' % (self.filename, definition)
+ return
+
+ if self.defs.has_key(definition):
+ for var in self.defs[definition]:
+ match = utilities.variablere.match(var)
+ if match:
+ self.print_out_definition(output, match.group(1))
+ rhs = string.join(self.defs[definition])
+
+ output.append('%s = %s\n' % (definition, rhs))
+ else:
+ cond = self.conds[definition]
+ for var in cond[1] + cond[2]:
+ match = utilities.variablere.match(var)
+ if match:
+ self.print_out_definition(output, match.group(1))
+
+ output.append('if %s\n' % cond[0])
+ output.append('%s = %s\n' % (definition, string.join(cond[1])))
+ output.append('else\n')
+ output.append('%s = %s\n' % (definition, string.join(cond[2])))
+ output.append('endif\n')
+
+ def print_out_target(self, output, target):
+ defs = self.targets[target].used_defines()
+ for definition in defs:
+ self.print_out_definition(output, definition)
+ self.targets[target].print_out(output)
+
+ def target(self, target):
+ if self.targets.has_key(target):
+ return self.targets[target]
+ return None
+
+ def defines(self):
+ return self.defs.keys() + self.conds.keys()
+
+ def is_target(self, target):
+ return self.targets.has_key(target)
+
+ def is_defined(self, variable):
+ return self.defs.has_key(variable) or self.conds.has_key(variable)
+
+ def del_define(self, variable):
+ #return self.defs.remove(variable)
+ if self.defs.has_key(variable):
+ del self.defs[variable]
+
+ def add_define(self, variable, value, cond = "", cond_true=1):
+
+ if type(value) == types.StringType:
+ value = string.split(string.strip(string.replace(value, '\001', ' ')))
+
+ if not len(cond):
+ if not variable in self.overwrites:
+ self.overwrites.append(variable)
+ if self.defs.has_key(variable):
+ self.defs[variable].extend(value)
+ else:
+ self.defs[variable] = value
+ else:
+ if cond_true:
+ if self.conds.has_key(variable):
+ print '%s: %s is redefined in automake conditional' % (self.filename,
+ variable)
+ self.conds[variable] = (cond, value, [])
+ else:
+ if not self.conds.has_key(variable):
+ self.conds[variable] = (cond, [], value)
+ if not self.conds[variable][0] == cond:
+ utilities.print_error('%s: %s defined in two different conditionals\n' % (self.filename,
+ variable))
+ return
+ cond = self.conds[variable]
+ self.conds[variable] = (cond[0], cond[1], value)
+
+ def add_prefixed_variable(self, var, replace_srcdir=0):
+ if not self.defs.has_key(var):
+ # not defined here (may be subst_vars)
+ return
+
+ if var in self.overwrites:
+ # already defined
+ return
+ if self.conds.has_key(var):
+ list1 = self.replace_srcdir(self.conds[var][1])
+ list2 = self.replace_srcdir(self.conds[var][2])
+
+ self.conds[var] = (self.conds[var][0], list1, list2)
+ self.overwrites.append(var)
+ return
+ orig = self.definition(var)
+ rec = self.definition_rec(var)
+ if replace_srcdir:
+ rec = self.replace_srcdir(rec)
+ if orig != rec:
+ self.del_define(var)
+ self.add_define(var, rec)
+
+ def _var_rec_var(self, str):
+
+ if not len(str):
+ return ['']
+
+ val = utilities.variablesre.match(str)
+ if val:
+ if not len(val.group(1)) and not len(val.group(3)):
+ return self._var_rec_priv_(val.group(2))
+ else:
+ ret = self._var_rec_priv_(val.group(2))
+ if len(ret) > 1:
+ utilities.print_error('%s: variable \'%s\' is used in another variable, but contains a list\n' % (self.filename, val.group(2)))
+ return [str]
+ if not len(ret):
+ return [str]
+
+ suffix = self._var_rec_var(val.group(3))
+ prefix = self._var_rec_var(val.group(1))
+ assert(len(prefix) == 1)
+ assert(len(suffix) == 1)
+ retstr = prefix[0] + ret[0] + suffix[0]
+
+ if utilities.subst_vars.has_key(val.group(2)):
+ return [retstr]
+ else:
+ return self._var_rec_var(retstr)
+ else:
+ if str[0] == '@' and str[-1] == '@':
+ str = str[1:-1]
+ str = utilities.subst_vars[str]
+ return [str]
+ return [str]
+
+ def _var_rec_priv_(self, variable):
+ var_list = self.definition(variable)
+ if not len(var_list):
+ if variable in utilities.subst_vars.keys() or self.conds.has_key(variable):
+ return ["$(%s)" % variable]
+ else:
+ return []
+
+ return_list = []
+
+ if self.circle.has_key(variable):
+ sys.stderr.write("Warning: %s produces a circle.\n" % variable)
+ return return_list
+
+ self.circle[variable] = 1
+
+ for entry in var_list:
+ return_list.extend(self._var_rec_var(entry))
+
+ del self.circle[variable]
+ return return_list
+
+ def definition(self, variable):
+ if self.defs.has_key(variable):
+ return self.defs[variable]
+ else:
+ return []
+
+ def value_of(self, variable):
+ if self.defs.has_key(variable):
+ return string.join(self.defs[variable])
+ return None
+
+ def value_list(self, variable):
+ if self.defs.has_key(variable):
+ return self.definition_rec(variable)
+ return []
+
+ def definition_rec(self, variable):
+ self.circle = {}
+ ret = self._var_rec_priv_(variable)
+ del self.circle
+ return ret
+
+ def add_makefile_rules(self):
+ if self.subdir == '.':
+ prefix = ""
+ else:
+ prefix = self.subdir + "/"
+
+ self.insertTarget(self.build + 'Makefile',
+ ['$(top_builddir)/config.status', self.source + "Makefile.in"],
+ ["cd $(top_builddir) && " +
+ "$(SHELL) ./config.status " + prefix + "Makefile"])
+ self.insertTarget("all", self.build + "Makefile", phony=1)
+
+
+ def set_configure_files(self, files):
+ for _file in files:
+ if not _file in self.configure_files:
+ self.configure_files.append(_file)
+ if self.subdir == '.':
+ pfile = _file
+ else:
+ pfile = self.subdir + "/" + _file
+ self.insertTarget(self.build + _file,
+ ['$(top_builddir)/config.status',
+ '$(srcdir)/' + _file + '.in'],
+ ["\tcd $(top_builddir) && " +
+ "$(SHELL) ./config.status " + pfile])
+ self.add_makefile_rules()
+
+ def set_configure_headers(self, files, stamp_counter):
+ cleanrules = []
+ for file in files:
+ stamp = 'stamp-h%d' % stamp_counter
+
+ self.insertTarget(self.build + file, self.build + stamp,
+ ['@if test ! -f %s; then \\' % (self.build + file),
+ ' rm -f %s; \\' % (self.build + stamp),
+ ' cd %s && $(MAKE) %s; \\' % (self.build, stamp),
+ 'else :; fi'])
+
+ config_status_argument = '%s/%s' % (self.subdir, file)
+ if self.subdir == '.':
+ config_status_argument = file
+
+ self.insertTarget(self.build + stamp,
+ [self.source + file + '.in',
+ '$(top_builddir)/config.status'],
+ ['@rm -f %s' % (self.build + stamp),
+ 'cd $(top_builddir) && $(SHELL) ./config.status %s' % config_status_argument])
+
+ if stamp_counter == 1:
+ self.insertTarget(self.source + file + '.in', [ utilities.configure_in ],
+ ['cd $(top_srcdir) && $(AUTOHEADER)',
+ 'rm -f ' + self.build + stamp,
+ 'touch ' + self.source + file + '.in'])
+
+ cleanrules.append('-rm -f ' + self.build + file + ' ' + self.build + stamp)
+ stamp_counter += 1
+ if cleanrules:
+ self.insertTarget('distclean-hdr-%s' % self.canon_subdir, [],
+ cleanrules, phony=1)
+ self.insertTarget('distclean', 'distclean-hdr-%s' % self.canon_subdir, phony=1)
+
+
+ return stamp_counter
+
+ def final_reorder(self):
+ self.insertTarget('.SUFFIXES', '')
+ for target in self.targets.values():
+ if target.is_phony and not target.target == '.PHONY':
+ self.insertTarget('.PHONY', target.target, phony=1)
+
+ def cache_def(self, var, value):
+ value = self.expand(value)
+ self.cached_defs[var] = value
+ return value
+
+ def cache_def_list(self, var, list):
+ ret = ""
+ for item in list:
+ ret += self.expand(item) + " "
+ if ret and ret[-1] == ' ':
+ ret = ret[:-1]
+ self.cached_defs[var] = ret
+ return ret
+
+ def expand(self, file):
+ match = utilities.variablesre.match(file)
+ if not match:
+ return file
+ var = match.group(2)
+ begin = match.group(1)
+ if len(begin) and begin[-1] == '$':
+ # if it ends with a $, then the variable is escaped
+ return file[:match.end(2)] + self.expand(file[match.end(2):])
+ # TODO: while this cache saves us at least 30% recursions to expand
+ # the check also costs considerable time. But we can't put the expanded
+ # values into .defs that easily as INSTALL_DATA should not be collected
+ # as primary and we can't put cached subst_vars back in subst_vars
+ # as the variables _could_ expand depending on the Makefile's .defs
+ if self.cached_defs.has_key(var):
+ return begin + self.cached_defs[var] + self.expand(match.group(3))
+ if self.defs.has_key(var):
+ return begin + self.cache_def_list(var, self.defs[var]) + self.expand(match.group(3))
+ if utilities.subst_vars.has_key(var):
+ return begin + self.cache_def(var, utilities.subst_vars[var]) + self.expand(match.group(3))
+ if os.environ.has_key(var):
+ return begin + self.cache_def(var, os.environ[var]) + self.expand(match.group(3))
+ # print "DEBUG:", self.filename, "variable", var, "not found."
+ return begin + self.expand(match.group(3))
+
+ def read_in(self, filename, ignore_comments):
+ # read the file
+ try:
+ lines = open(filename, 'r').readlines()
+ except IOError, e:
+ print "No such file: ", filename
+ raise e
+ sys.exit(1)
+ ret = []
+ for line in lines:
+ if not len(line):
+ continue
+ if ignore_comments and line[0] != '\t':
+ try:
+ index = string.index(line, '#')
+ if index == 0:
+ continue
+ line = line[:index]
+ except ValueError:
+ pass
+ if line.startswith('include '):
+ line = string.strip(line[8:])
+ if line.startswith('$(top_srcdir)'):
+ file = os.path.join(utilities._topsrcdir, line[14:])
+ else:
+ file = os.path.join(os.path.dirname(filename), line)
+ ret.extend(self.read_in(file, ignore_comments))
+ self.am_includes.append(line)
+ else:
+ ret.append(line)
+ return ret
+
+ def parse(self, filename, ignore_comments, create_empty):
+ if len(filename) and not create_empty:
+ self.lines = self.read_in(filename, ignore_comments)
+ else:
+ self.lines = []
+ self.dirname = os.path.dirname(filename)
+ self.subdir = os.path.dirname(self.filename)
+
+ if utilities.top_builddir:
+ self.asubdir = os.path.abspath(utilities.top_builddir + "/" + self.subdir)
+ else:
+ self.asubdir = "."
+
+ if not len(self.subdir):
+ self.subdir = '.'
+ self.canon_subdir = 'top'
+ else:
+ self.canon_subdir = 'top_' + utilities.canon_name(self.subdir)
+
+ # sharing some variables
+ self.source = '$(top_srcdir)/'
+ self.build = '$(top_builddir)/'
+ if self.subdir != '.':
+ self.source += self.subdir + "/"
+ self.build += self.subdir + "/"
+
+ # add a final line feed to be sure we have it
+ if len(self.lines) == 0:
+ self.lines = ["\n"]
+ if not self.lines[-1][-1] == '\n':
+ self.lines.append('\n')
+
+ # concatenate lines with backslash (replaced with \001 for later
+ # reference). Automake comments are skipped
+ index = 0
+ ebre = re.compile('\\\[ \t]*\n')
+ while index < len(self.lines):
+ if self.lines[index].startswith('##'):
+ del self.lines[index]
+ continue
+ if ebre.search(self.lines[index]):
+ if index + 1 < len(self.lines):
+ self.lines[index:index+2] = [ ebre.sub('\001', self.lines[index], 1) + self.lines[index + 1] ]
+ else:
+ del self.lines[index]
+ else:
+ index = index + 1
+
+ self.find_targets()
+ self.find_defines()
+ self.find_opts()
+
+ for line in self.lines:
+ if len(string.strip(line)) and not self.commentre.match(line):
+ sys.stderr.write("%s: rest %s\n" % (self.filename, line))
+
+ # the targets allowed to be defined not-PHONY
+ for target in ['install-exec', 'install-data', 'all', "uninstall", "check"]:
+ targ = self.target(target + '-hook')
+ if targ:
+ targ.is_phony = 1
+ targ = self.target(target + '-local')
+ if targ:
+ targ.is_phony = 1
+
+ if self.targets.has_key(".FORWARDS"):
+ targets = self.targets[".FORWARDS"].deps
+ del self.targets[".FORWARDS"]
+ for target in targets:
+ # assuming forwards do not contain variables
+ del self.targets[target.value]
+
+ for handler in handlerbase.handlers:
+ handler.parse(self)
+
+ def replace_autoconf(self, line):
+ match = utilities.autoconfre.match(line)
+ if match:
+ if utilities.subst_vars.has_key(match.group(2)):
+ return match.group(1) + utilities.subst_vars[match.group(2)] + self.replace_autoconf(match.group(3))
+ else:
+ return match.group(1) + '@' + match.group(2) + self.replace_autoconf('@' + match.group(3))
+
+ return line
+
+ def find_defines(self):
+
+ current_cond = ""
+ current_cond_true = 1
+ index = 0
+ while index < len(self.lines):
+ line = self.lines[index]
+ if self.commentre.match(line):
+ index = index + 1
+ continue
+ define = utilities.definere.match(line)
+ if define:
+ varname = define.group(1)
+ value = define.group(3)
+ if string.find(value, '@') != -1:
+ value = self.replace_autoconf(value)
+ if define.group(2) == '+':
+ if not self.is_defined(varname):
+ utilities.print_error("%s: %s += used before assigned value to the variable\n" % (self.filename, varname))
+ elif current_cond:
+ if current_cond in utilities.true_conds:
+ if current_cond_true:
+ self.add_define(varname, value)
+ elif current_cond in utilities.false_conds:
+ if not current_cond_true:
+ self.add_define(varname, value)
+ else:
+ self.add_define(varname, value)
+ else:
+ self.add_define(varname, value)
+ else:
+ if current_cond:
+ if current_cond in utilities.true_conds:
+ if current_cond_true:
+ self.add_define(varname, value)
+ else:
+ self.add_define(varname, [])
+ elif current_cond in utilities.false_conds:
+ if not current_cond_true:
+ self.add_define(varname, value)
+ else:
+ self.add_define(varname, [])
+ else:
+ self.add_define(varname, value, current_cond, current_cond_true)
+ else:
+ if self.is_defined(varname):
+ utilities.print_error("%s: %s defined twice\n" % (self.filename, varname))
+ self.add_define(varname, value, current_cond, current_cond_true)
+
+ self.lines[index:index+1] = []
+ elif line.startswith('if '):
+ if len(current_cond):
+ utilities.print_error('%s: nested automake conditionals '
+ '(%s vs. %s)\n' % (self.filename,
+ current_cond,
+ string.strip(line[3:])))
+ return
+ current_cond = string.strip(line[3:])
+ current_cond_true = 1
+ self.lines[index:index+1] = []
+ elif line.startswith('else'):
+ current_cond_true = 0
+ self.lines[index:index+1] = []
+ elif line.startswith('endif'):
+ current_cond = ""
+ self.lines[index:index+1] = []
+ else:
+ index = index + 1
+
+ def find_targets(self):
+ targetre = re.compile(' *([^:]*)\s*:\s*(.*)')
+
+ index = 0
+ while index < len(self.lines):
+ line = self.lines[index]
+ if self.commentre.match(line):
+ index = index + 1
+ continue
+ # it's a croax - defines can contain : chars too
+ if utilities.definere.match(line):
+ index = index + 1
+ continue
+ targmatch = targetre.match(line)
+ if targmatch:
+ targets = string.split(targmatch.group(1))
+ deps = re.sub('#.*', '', targmatch.group(2))
+ deps = string.replace(deps, '\001', '')
+ newindex = index + 1
+ while newindex < len(self.lines):
+ if self.lines[newindex][0] == '\t':
+ newindex = newindex + 1
+ continue
+ # white space only lines count as comments too
+ if not len(string.strip(self.lines[newindex])) or self.commentre.match(self.lines[newindex]):
+ newindex = newindex + 1
+ continue
+ break
+ for target in targets:
+ if not self.first_target:
+ self.first_target = target
+ self.insertTarget(target, deps, self.lines[index+1:newindex], 1)
+ self.lines[index:newindex] = []
+ else:
+ index = index + 1
+ target = self.target('.PHONY')
+ if target:
+ target.is_phony = 1
+ for targ in target.deps:
+ if isinstance(targ, DefValue):
+ targ = self.expand(targ.value)
+ otarg = self.target(targ)
+ if not otarg:
+ utilities.print_error("%s: target '%s' marked PHONY but isn't present\n" % (self.filename,
+ targ))
+ else:
+ otarg.is_phony = 1
+
+ # does some basic checking for things that often break
+ def check(self):
+ for define in self.defines():
+ if define.endswith('_LDFLAGS') or define == 'LDFLAGS':
+ list = self.definition_rec(define)
+ for str in list:
+ if str.endswith('.la') or str.endswith('.lo'):
+ utilities.print_error('%s: LDFLAGS contains libtool\n'
+ '\tfile %s. These belong in LIBADD\n'
+ '\t(for libs) or LDADD (for programs)\n' %
+ (self.filename, str))
+ continue
+ if str.startswith('-L .') or str.startswith('-L.'):
+ utilities.print_error('%s: seen -L. in your LDFLAGS. You shouldn\'t\n'
+ 'use -L relative paths but use relative paths to\n'
+ '.la files in LDADD/LIBADD\n' % (self.filename))
+ continue
+ if define.endswith('_LIBADD') or define.endswith('_LDADD'):
+ list = self.definition_rec(define)
+ for str in list:
+ if str.endswith('.la') and not os.path.basename(str).startswith('lib'):
+ print('%s: %s contains a module. This is not portable!\n'
+ ' You can only link against libraries.' % (self.filename, define))
+ continue
+
+
+ def replace_srcdir(self, list):
+ res = []
+ Ire = re.compile('^-I([^/\$].*)')
+ index = 0
+ while index < len(list):
+ l = list[index]
+ if l == '-I':
+ print('%s: found single -I argument. This is not portable.' % self.filename)
+ list[index+1] = l + list[index+1]
+ del list[index]
+ continue
+ l = re.sub('^-I\$\(srcdir\)', '-I' + self.source[:-1], l)
+ match = Ire.match(l)
+ if match:
+ # strip the latest / - it looks ugly, no other reason
+ if match.group(1) == '.':
+ l = '-I' + self.build[:-1]
+ else:
+ l = '-I' + self.build + match.group(1)
+ else:
+ match = utilities.variablesre.match(l)
+ if match and self.conds.has_key(match.group(2)):
+ var = match.group(2)
+ if not var in self.overwrites:
+ list1 = self.replace_srcdir(self.conds[var][1])
+ list2 = self.replace_srcdir(self.conds[var][2])
+
+ self.conds[var] = (self.conds[var][0], list1, list2)
+ l = '$(%s)%s' % (var, match.group(3))
+ else:
+ l = '$(%s)%s' % (var, match.group(3))
+ self.overwrites.append(var)
+
+ res.append(l)
+ index = index + 1
+ return res
+
+ def create(self):
+
+ self.default_includes = string.join(self.replace_srcdir(['$(DEFS)', '-I.', '-I$(srcdir)', '-I$(top_builddir)']))
+ self.add_define(self.canon_subdir + '_srcdir', '$(srcdir)')
+ self.add_define(self.canon_subdir + '_builddir', '.')
+ utilities.subst_vars[self.canon_subdir + '_srcdir'] = ''
+ utilities.subst_vars[self.canon_subdir + '_builddir'] = ''
+
+ self.find_binaries()
+
+ for handler in handlerbase.handlers:
+ handler.create(self)
+
+ finals = []
+ for prog in self.binaries.values():
+ prog.create_variables()
+
+ sources = self.definition_rec( prog.canon_name + '_SOURCES' )
+ index = 0
+ while index < len(sources):
+ source = sources[index]
+ match = utilities.variablere.match(source)
+ if match and self.conds.has_key(match.group(1)):
+ cond = self.conds[match.group(1)]
+ # if we don't know the value, we take both
+ sources.extend(cond[1])
+ sources.extend(cond[2])
+ del sources[index]
+ else:
+ index += 1
+
+ for source in sources:
+ match = utilities.extre.match(source)
+ if not match:
+ match = utilities.variablere.match(source)
+ print match
+ if match and self.conds.has_key(match.group(1)):
+ print match.group(1)
+ utilities.print_error('%s: "%s" doesnt match extre\n' % (self.filename, source))
+ continue
+ base = match.group(1)
+ ext = match.group(2)
+
+ if ext in utilities.hext:
+ continue
+
+ if handlerbase.ext_dict.has_key(ext):
+ handlerbase.ext_dict[ext].create_source(prog, base, ext)
+ elif prog.is_cpp(ext) or ext == '.c':
+ pass
+ else:
+ utilities.print_error('%s: unknown source extension %s for %s\n' % (self.filename, ext, prog.name))
+ finals.append(prog.name)
+
+ if len(finals):
+ self.insertTarget("final", [],
+ utilities.our_path +
+ " -C $(top_builddir)/%s UNSERMAKE_FORCE_FINAL=1 " % self.subdir +
+ string.join(finals), phony=1)
+ else:
+ self.insertTarget("final", [],
+ ["@echo no programs available to recreate as final. Call in subdirs"],
+ phony=1)
+
+ def set_def_printed(self, definition):
+ self.printed_defs[definition] = 1
+
+ def in_srcdir(self, file):
+ return os.path.exists(os.path.join(self.dirname, file))
+
+ def in_builddir(self, file):
+ if utilities.top_builddir:
+ path = os.path.abspath(utilities.top_builddir + "/" + self.subdir)
+ else:
+ path = "./" + self.subdir
+ return os.path.exists(os.path.join(path, file))
+
+ def userdirs(self):
+ try:
+ return self.userdirs
+ except AttributeError:
+ self.userdirs = []
+ for key in self.defines():
+ if key.endswith('dir'):
+ self.userdirs.append(key[:-3])
+ return self.userdirs
+
+
+ def dirprefixes(self):
+ # These are the directory names, which can be written
+ # as variable prefixes (e.g. xxx_SOURCES, with xxx being one of the
+ # dirs)
+ # it's a subset of automake's recognized directories
+ list = ["bin", "sbin", "libexec", "data", "sysconf", "sharedstate",
+ "localstate", "lib", "info", "include", "man",
+ "pkglib", "pkginclude", "pkgdata"]
+ list.extend( self.userdirs() )
+ return list
+
+ def primaries(self):
+ return ["PROGRAMS", "LIBRARIES", "SCRIPTS", "DATA", "HEADERS", "MANS",
+ "TEXINFOS", "LTLIBRARIES"]
+
+ def instance_binary(self, entry, prefix, type):
+ prog = program.Program(self, entry, prefix, type)
+ self.binaries[prog.name] = prog
+
+ # This finds all xxx_{PROGRAMS,{LT}LIBRARIES} and their sources
+ def find_binaries(self):
+
+ progre = re.compile('(.*)_(PROGRAMS|(LT)?LIBRARIES)')
+
+ for key in self.defines():
+ match = progre.match(key)
+ if not match:
+ continue
+
+ prefix = match.group(1)
+ suffix = match.group(2)
+
+ type = program.program_type(suffix)
+
+ for entry in self.definition_rec(key):
+ match = utilities.variablere.match(entry)
+ if not match:
+ self.instance_binary(entry, prefix, type)
+ else:
+ var = match.group(1)
+ if not self.conds.has_key(var):
+ utilities.print_error('%s: variable %s is not automake conditional, but can\'t be expanded\n' %
+ (self.filename, var))
+ continue
+ for l in self.conds[var][1]:
+ self.instance_binary(l, prefix, type)
+ for l in self.conds[var][2]:
+ self.instance_binary(l, prefix, type)
+
+ def find_opts(self):
+ self.options = {}
+ self.options["foreign"] = 0
+ self.options["qtonly"] = 0
+ self.options["noautodist"] = 0
+ self.options["foreign-libtool"] = 0
+ self.options["nofinal"] = 0
+ self.options["doxygen"] = 0
+ for option in self.definition("KDE_OPTIONS"):
+ self.options[option] = 1
+
+ def get_opt(self, opt):
+ try:
+ return self.options[opt]
+ except:
+ utilities.print_error('No such option named: %s\n' % opt)
+ return 0
+
+ def print_out(self, force):
+
+ self.printed_defs = {}
+
+ output = []
+
+ output.append("# Makefile.in generated by unsermake\n")
+ output.append("####################################\n")
+ output.append("\n")
+
+ if self.subdir != '.':
+ builddir = '..'
+ slashes = string.count(self.subdir, '/')
+ for dummy in range(slashes):
+ builddir = builddir + '/..'
+ output.append('top_builddir = %s\n' % builddir)
+ else:
+ output.append('top_builddir = .\n')
+
+ self.printed_defs["subdir"] = 1
+ self.printed_defs["srcdir"] = 1
+ self.printed_defs["top_builddir"] = 1
+ self.printed_defs["distdir"] = 1
+ self.printed_defs["top_distdir"] = 1
+ self.printed_defs["INSTALL"] = 1
+
+ if self.canon_subdir != "top":
+ output.append("top_srcdir = @top_srcdir@\n")
+ output.append("srcdir = @srcdir@\n")
+ self.printed_defs[self.canon_subdir + "_srcdir"] = 1
+ self.printed_defs[self.canon_subdir + "_builddir"] = 1
+ self.printed_defs["top_srcdir"] = 1
+
+ # these are the variables the toplevel Makefile _has_ to write
+ self.printed_defs[self.canon_subdir + "_srcdir"] = 1
+ self.printed_defs[self.canon_subdir + "_builddir"] = 1
+
+ for var in utilities.subst_vars.keys() + self.defines():
+ if not var in self.overwrites:
+ self.printed_defs[var] = 1
+
+ if self.subdir == '.':
+ keys = utilities.subst_vars.keys()
+ keys.sort()
+ for var in keys:
+ if var != 'top_builddir':
+ if len(utilities.subst_vars[var]):
+ output.append('%s = %s\n' % (var, utilities.subst_vars[var]))
+ else:
+ output.append('%s = @%s@\n' % (var, var))
+ output.append("transform = @program_transform_name@\n")
+ output.append("INSTALL = @INSTALL@\n")
+
+ phonies = []
+ keys = self.targets.keys()
+ keys.sort()
+ forwards = []
+ for key in ['install', 'install-data', 'install-exec', 'clean', 'check', 'force-reedit']:
+ if not key in keys:
+ forwards.append(key)
+
+ targets_output = []
+
+ for key in keys:
+ if key in [".PHONY"]:
+ continue
+ target = self.targets[key]
+ if not target.user_specified:
+ self.print_out_target(targets_output, key)
+ if target.is_phony:
+ phonies.append(key)
+ else:
+ if target.is_phony:
+ forwards.append(key)
+
+ forwards[:0] = ['all']
+
+ output.append(".FORWARDS: " + string.join(forwards) + "\n\n")
+ for key in forwards:
+ output.append("%s:\n\t@echo 'WARNING: use unsermake instead of make or use a wrapper script, e.g. makeobj!!!'\n\t%s %s\n\n" %
+ (key, utilities.our_path, key))
+
+ output += targets_output
+
+ keys = self.defines()
+ keys.sort()
+ for definition in keys:
+ self.print_out_definition(output, definition)
+ #if not self.printed_defs.has_key(definition):
+ # print '%s = %s' % (definition, string.join(self.defs[definition]))
+
+ if len(phonies):
+ output.append(".PHONY: " + string.join(phonies) + "\n")
+ utilities.write_if_changed(re.sub('.am$', '.in', self.filename), output, force)
+
+ def translate_target(self, target):
+ targ = self.target(target)
+
+ if not targ:
+ targ = self.target('$(srcdir)/' + target)
+ lfs = self.source + target
+ else:
+ lfs = self.build + targ.target
+
+ if targ and targ.user_specified:
+ self.to_handle_targets.append(targ)
+
+ for dep in targ.deps:
+ if type(dep) == types.StringType:
+ self.translate_target(dep)
+ else:
+ self.translate_target(dep.value)
+
+ self.insertTarget(lfs, [])
+
+ def replace_builddir(self, deps, leaveout_vars=1):
+ newdeps = deps
+ deps = []
+ for dep in newdeps:
+ if isinstance(dep, DefValue):
+ dep = dep.value
+ if not len(dep):
+ continue
+ elif dep.startswith('./'):
+ dep = self.build + dep[2:]
+ elif dep[0] == '/': # absolute paths are rather seldom
+ pass
+ elif dep.startswith('../'):
+ dep = os.path.normpath('$(top_builddir)/' +
+ self.subdir + '/' + dep)
+ elif dep.startswith('$(srcdir)'):
+ if dep[9] == '/':
+ dep = self.source + dep[10:]
+ else:
+ dep = self.source + dep[9:]
+ elif dep.startswith('$(top_srcdir)'):
+ pass
+ elif dep[0] == '$':
+ if re.match('^\$\(top_builddir\)', dep):
+ deps.append(dep)
+ continue
+ match = utilities.variablesre.match(dep)
+ if not match or len(match.group(1)):
+ utilities.print_error('%s: "%s" appears to be a variable, but is none. Assertion hit!\n' % (self.filename, dep))
+ continue
+ var = match.group(2)
+ pvar = '%s_%s' % (self.canon_subdir, var)
+ if leaveout_vars:
+ pvar = pvar + '_dep'
+ if self.conds.has_key(var):
+ if not self.conds.has_key(pvar):
+ list1 = self.replace_builddir(self.conds[var][1], leaveout_vars)
+ list2 = self.replace_builddir(self.conds[var][2], leaveout_vars)
+ if list1 == list2:
+ deps.extend(list1)
+ else:
+ self.conds[pvar] = (self.conds[var][0], list1, list2)
+ deps.append('$(%s)%s' % (pvar, match.group(3)))
+ else:
+ deps.append('$(%s)%s' % (pvar, match.group(3)))
+
+ continue
+ elif utilities.subst_vars.has_key(var) and len(utilities.subst_vars[var]):
+ deps.extend(self.replace_builddir(string.split(utilities.subst_vars[var] + match.group(3)), leaveout_vars))
+ continue
+ elif leaveout_vars:
+ continue
+ elif dep[0] == '-':
+ if leaveout_vars:
+ continue
+ else:
+ target = self.target(dep)
+ if not target:
+ if self.in_srcdir(dep) and not self.in_builddir(dep):
+ dep = "$(srcdir)/" + dep
+ else:
+ dep = self.build + dep
+ elif not target.is_phony:
+ dep = self.build + dep
+
+ deps.append(dep)
+ return deps
+
+ def rewrite_list(self, list):
+ files = []
+ for file in list:
+ self.translate_target(file)
+ match = utilities.variablere.match(file)
+ if match:
+ self.rewrite_cond(match.group(1))
+ files.append('$(%s_%s)' % (self.canon_subdir, match.group(1)))
+ else:
+ dir = self.source
+ if self.target(file) or self.target(self.build + file):
+ dir = self.build
+ elif file.startswith(self.source):
+ files.append(file)
+ continue
+ files.append(dir + file)
+ return files
+
+ def rewrite_cond(self, var):
+ if not self.conds.has_key(var):
+ print self.filename, var
+ assert(self.conds.has_key(var))
+ cond = self.conds[var]
+ new_tuple = (cond[0], self.rewrite_list(cond[1]),
+ self.rewrite_list(cond[2]))
+ self.conds[self.canon_subdir + '_' + var] = new_tuple
+
+ def add_install_target(self, install_rules, dirprefix, file, primary):
+ my_install_rules = []
+ basename = os.path.basename(file)
+ basenamenoext = basename
+ if basenamenoext.rfind('.') != -1:
+ basenamenoext = basenamenoext[:basenamenoext.rfind('.')]
+ mansect = ""
+
+ dirname = '$(DESTDIR)$(%sdir)' % dirprefix
+ if primary == 'MANS':
+ match = self.mansectre.match(file)
+ if not match:
+ utilities.print_error('%s is an illegal manpage filename.\n' % file)
+ else:
+ mansect = match.group(1)
+ dirname += "/man" + mansect
+ assert(mansect in self.mansections)
+
+ for rule in install_rules:
+ rule = string.replace(rule, '@file@', file)
+ rule = string.replace(rule, '@basename@', basename)
+ rule = string.replace(rule, '@basenamenoext@', basenamenoext)
+ rule = string.replace(rule, '@mansect@', mansect)
+ my_install_rules.append(rule)
+
+ target = dirname + ('/%s' % basename)
+ deps = [file, '$(UNSERMAKE_FORCE_INSTALL)']
+
+ self.insertTarget(target, deps,
+ ['@test -d %s || $(mkinstalldirs) %s' % (dirname, dirname)]
+ + my_install_rules )
+ return target
+
+ def collect_primaries(self):
+ primaries = self.primaries()
+ primaryre = re.compile('(.*)_([^_]*)')
+ for key in self.defines():
+ match = primaryre.match(key)
+ if not match or not match.group(2) in primaries:
+ continue
+ primary = match.group(2)
+ if primary == 'MANS' and match.group(1) == 'KDE':
+ continue
+ if self.conds.has_key(key):
+ utilities.print_error('%s: found primary %s in conditional.\n' % (self.filename, key))
+ continue
+ files = self.rewrite_list(self.definition_rec(key))
+
+ self.add_define("%s_%s" % (self.canon_subdir, key), files)
+ if match.group(1) == 'check':
+ self.insertTarget("check-am", "$(%s_%s)" % (self.canon_subdir, key), phony=1)
+ self.insertTarget("check", "check-am", phony=1)
+ elif match.group(1) != 'EXTRA':
+ self.insertTarget("all-%s" % (self.canon_subdir),
+ "$(%s_%s)" % (self.canon_subdir, key),
+ phony=1)
+ if match.group(1) in ['EXTRA', 'noinst', 'check']:
+ continue
+
+ varname = '%sdir' % match.group(1)
+ if not utilities.subst_vars.has_key(varname):
+ if not self.is_defined('%s_%sdir' % (self.canon_subdir, match.group(1))):
+ if primary == 'MANS':
+ self.add_define('%s_%sdir' % (self.canon_subdir, match.group(1)),
+ '$(mandir)/man' + match.group(1)[-1])
+ else:
+ dirkey = '%sdir' % match.group(1)
+ ndirkey = '%s_%sdir' % (self.canon_subdir, match.group(1))
+ if not self.conds.has_key(dirkey):
+ self.add_define(ndirkey, self.definition_rec(dirkey))
+ else:
+ self.conds[ndirkey] = self.conds[dirkey]
+
+ dirprefix = '%s_%s' % (self.canon_subdir, match.group(1))
+ else:
+ dirprefix = match.group(1)
+
+ install_kind = 'exec'
+
+ install_rules = ['@$(V_ECHO) "%s @file@ %s"; \\' % (utilities.installing_text, utilities.normal)]
+ uninstall_rules = ['$(V_ECHO) "%s $$p %s"; \\' % (utilities.uninstalling_text, utilities.normal)]
+ if primary == 'LTLIBRARIES':
+ install_rules += ['$(LIBTOOL) --quiet --mode=install $(INSTALL) $(INSTALL_STRIP_FLAG) @file@ $(DESTDIR)$(%sdir) > /dev/null' % dirprefix]
+ uninstall_rules += ['f="`echo $$p | sed -e \'s|^.*/||\'`"; \\',
+ '$(LIBTOOL) --quiet --mode=uninstall rm -f $(DESTDIR)$(%sdir)/$$f; \\' % dirprefix]
+ elif primary == 'LIBRARIES':
+ install_rules += ['$(INSTALL_DATA) @file@ $(DESTDIR)$(%sdir)/@basename@' % dirprefix,
+ '$(RANLIB) $(DESTDIR)$(%sdir)/@basename@' % dirprefix]
+ uninstall_rules += ['f="`echo $$p | sed -e \'s|^.*/||\'`"; \\',
+ 'rm -f $(DESTDIR)$(%sdir)/$$f; \\' % dirprefix]
+ elif primary in ['DATA', 'HEADERS']:
+ install_rules += ['$(INSTALL_DATA) @file@ $(DESTDIR)$(%sdir)/@basename@' % dirprefix]
+ uninstall_rules += ['f="`echo $$p | sed -e \'s|^.*/||\'`"; \\',
+ 'rm -f $(DESTDIR)$(%sdir)/$$f; \\' % dirprefix]
+ install_kind = 'data'
+ elif primary == 'PROGRAMS':
+ if sys.platform == 'cygwin':
+ install_rules += ["EXEXT='.exe'; \\"]
+ install_rules += ["p1=`echo @file@ |sed 's/$(EXEEXT)$$//'`; \\",
+ 'if test -f @file@ || test -f $$p1 ; then \\',
+ "f=`echo $$p1|sed '$(transform);s|^.*/||;s/$$/$(EXEEXT)/'`; \\",
+ '$(INSTALL_PROGRAM_ENV) $(LIBTOOL) --quiet --mode=install '
+ '$(INSTALL_PROGRAM) @file@$(EXEEXT) $(DESTDIR)$(%sdir)/$$f; else :;\\' % dirprefix,
+ 'fi']
+ uninstall_rules += ['f=`echo $$p|sed \'s/$(EXEEXT)$$//;$(transform);s|^.*/||;s/$$/$(EXEEXT)/\'`; \\',
+ 'rm -f $(DESTDIR)$(%sdir)/$$f; \\' % dirprefix]
+ elif primary == 'SCRIPTS':
+ install_rules += ['f="`echo @file@ |sed \'$(transform)\'| sed \'s|^.*/||\'`"; \\',
+ '$(INSTALL_SCRIPT) @file@ $(DESTDIR)$(%sdir)/$$f' % dirprefix]
+ uninstall_rules += ['f="`echo $$p|sed \'$(transform)\' | sed \'s|^.*/||\'`"; \\',
+ 'rm -f $(DESTDIR)$(%sdir)/$$f; \\' % dirprefix]
+ elif primary == 'MANS':
+ installdir = '$(DESTDIR)$(%sdir)' % dirprefix
+ manfilename = ''
+ if match.group(1) == 'man':
+ installdir += '/man@mansect@'
+ manfilename = '@basename@'
+ else:
+ assert(match.group(1)[-1] in self.mansections)
+ manfilename = '@basenamenoext@.%s' % match.group(1)[-1]
+ install_rules += ['$(INSTALL_DATA) @file@ %s/%s' % (installdir, manfilename)]
+
+ uninstalldir = '$(DESTDIR)$(%sdir)' % dirprefix
+ if match.group(1) == "man":
+ manfilename = '$$f'
+ uninstall_rules.append('s="`echo $$p | sed -e \'s|^.*\\.\\([0-9a-z]\\)[a-z]*$$|\\1|\'`"; \\')
+ uninstalldir += "/man$$s"
+ else:
+ manfilename = '"`echo $$f | sed -e \'s|[^\\.]*$$|%s|\'`"' % match.group(1)[-1]
+
+ uninstall_rules += ['f="`echo $$p | sed -e \'s|^.*/||\'`"; \\',
+ 'echo " rm -f %s/%s"; \\' % (uninstalldir, manfilename),
+ 'rm -f %s/%s; \\' % (uninstalldir, manfilename)]
+ install_kind = 'data'
+ else:
+ print('%s: install for %s unimplemented\n' % (self.filename, primary))
+ install_rules = []
+ uninstall_rules = []
+
+ for file in files:
+ vmatch = utilities.variablere.match(file)
+ if vmatch:
+ (cond, list_true, list_false) = self.conds[vmatch.group(1)]
+ varname = 'installed-%s%s-%s' % (dirprefix, primary, vmatch.group(1))
+ new_list_true = []
+ new_list_false = []
+ for file in list_true:
+ target = self.add_install_target(install_rules, dirprefix, file, primary)
+ new_list_true.append(target)
+ for file in list_false:
+ target = self.add_install_target(install_rules, dirprefix, file, primary)
+ new_list_false.append(target)
+ self.conds[varname] = (cond, new_list_true, new_list_false)
+ self.insertTarget('install-%s-%s%s' % (self.canon_subdir, dirprefix, primary),
+ '$(%s)' % varname, phony=1)
+ else:
+ target = self.add_install_target(install_rules, dirprefix, file, primary)
+ self.insertTarget('install-%s-%s%s' % (self.canon_subdir, dirprefix, primary),
+ target, phony=1)
+
+ if len(files):
+ self.insertTarget('uninstall-%s-%s%s' % (self.canon_subdir, dirprefix, primary), [],
+ ['@list=\'$(%s_%s)\'; for p in $$list; do \\' % (self.canon_subdir, key)]
+ + uninstall_rules +
+ ['done'], phony=1)
+ self.insertTarget('install-%s-%s' % (install_kind, self.canon_subdir),
+ 'install-%s-%s%s' % (self.canon_subdir, dirprefix, primary), phony=1)
+ self.insertTarget('uninstall', 'uninstall-%s-%s%s' %
+ (self.canon_subdir, dirprefix, primary), phony=1)
+
+ local_deps = []
+ for targ in ["install-exec", "install-data"]:
+ if self.is_target(targ + '-%s' % self.canon_subdir) and self.is_target(targ + "-hook"):
+ self.insertTarget(targ, targ + '-%s' % self.canon_subdir, phony=1)
+ self.insertTarget(targ + '-%s' % self.canon_subdir, [],
+ "cd " + self.build + " && $(MAKE) " + targ + "-hook", phony=1)
+ local_deps.append(targ + '-%s' % (self.canon_subdir))
+ elif self.is_target(targ + '-%s' % self.canon_subdir):
+ self.insertTarget(targ, targ + '-%s' % self.canon_subdir, phony=1)
+ local_deps.append(targ + '-%s' % (self.canon_subdir))
+ elif self.is_target(targ + "-hook"):
+ utilities.print_error("%s: Hook for target %s installed without this target being used\n" % (self.filename, targ))
+ return
+
+ self.insertTarget('install-%s' % (self.canon_subdir), local_deps, phony=1)
+
+ # adding the default install targets
+ self.insertTarget('install', ["install-data", "install-exec"], phony=1)
+ self.insertTarget('install-exec', '', phony=1)
+ self.insertTarget('install-data', '', phony=1)
+
+ def collect(self):
+
+ # adding default targets
+
+ self.default_includes = string.join(self.replace_srcdir(['$(DEFS)', '-I.', '-I$(srcdir)', '-I$(top_builddir)']))
+
+ self.insertTarget("all-%s" % (self.canon_subdir), [], phony=1)
+ self.insertTarget("all-recursive", "all-%s" % (self.canon_subdir), phony=1)
+ self.insertTarget("all", "all-recursive", phony=1)
+ self.insertTarget("install", "install-%s" % (self.canon_subdir), phony=1)
+
+ self.insertTarget("compile", "", phony=1)
+ self.insertTarget("clean", "", phony=1)
+
+ self.find_binaries()
+
+ for handler in handlerbase.handlers:
+ handler.shuffle_binaries(self)
+
+ for prog in self.binaries.values():
+ prog.handle_sources( self.definition_rec( prog.canon_name + '_SOURCES' ) )
+ prog.add_targets()
+
+ for bin in self.binaries.values():
+ list = self.value_list(bin.canon_name + '_COMPILE_FIRST')
+ for imp in list:
+ self.translate_target(imp)
+ for imp in list:
+ for obj in bin.objs:
+ self.insertTarget(self.build + obj, self.replace_builddir([imp]))
+ bin.add_final_target()
+
+ for handler in handlerbase.handlers:
+ handler.collect(self)
+
+ for targ in ['install-exec', 'install-data', 'all', "uninstall", "check", "clean"]:
+ target = self.target("%s-local" % targ)
+ if target:
+ self.insertTarget(targ, target.target, phony=1)
+
+ for target in self.targets.values():
+ if target.user_specified:
+ ndeps = []
+ for dep in target.deps:
+ if isinstance(dep, DefValue):
+ ndeps.append(dep.value)
+ else:
+ ndeps.append(dep)
+ ndeps = self.replace_builddir(ndeps)
+ target.deps = []
+ for dep in ndeps:
+ target.deps.append(DefValue(dep))
+ ntargs = self.replace_builddir([target.target])
+ if not len(ntargs):
+ del self.targets[target.target]
+ continue
+ # TODO: if it builds then remove the loop
+ assert(len(ntargs) == 1)
+ if ntargs[0] != target.target:
+ del self.targets[target.target]
+ target.target = ntargs[0]
+ self.addTarget(target)
+
+ self.collect_primaries()
+ if self.is_defined("CLEANFILES"):
+ self.insertTarget("clean-%s-CLEANFILES" % self.canon_subdir,
+ [], "rm -f " + string.join(self.replace_builddir(self.definition_rec("CLEANFILES"))), phony=1)
+ self.insertTarget("clean", "clean-%s-CLEANFILES" % self.canon_subdir, phony=1)
+
+ if self.is_defined("DISTCLEANFILES"):
+ self.insertTarget("clean-%s-DISTCLEANFILES" % self.canon_subdir,
+ [], "rm -f " + string.join(self.replace_builddir(self.definition_rec("DISTCLEANFILES"))), phony=1)
+ self.insertTarget("clean", "clean-%s-DISTCLEANFILES" % self.canon_subdir, phony=1)
+
+ for targ in self.to_handle_targets:
+ dir = self.build
+ if not self.target(targ.target) and not self.target(dir + targ.target):
+ dir = self.source
+
+ if targ.target.startswith('$(top_builddir)'):
+ continue
+ if targ.target.startswith('$(top_srcdir)'):
+ continue
+
+ deps = []
+ for dep in targ.deps:
+ if isinstance(dep, DefValue):
+ dep = dep.value
+ if len(os.path.dirname(dep)):
+ deps.append(dep)
+ else:
+ if self.is_target(self.build + dep):
+ deps.append(self.build + dep)
+ else:
+ deps.append(self.source + dep)
+
+ lfs = dir + targ.target
+ if targ.target.startswith('$(srcdir)'):
+ lfs = self.source + targ.target[10:]
+
+ if not len(targ._rules): # that's the easy case
+ self.insertTarget(lfs, self.replace_builddir(deps, 0))
+ if self.targets.has_key(targ.target):
+ del self.targets[targ.target]
+ else:
+ if targ.target.startswith('$(srcdir)'):
+ self.insertTarget(lfs,
+ self.replace_builddir(deps, 0),
+ "cd " + self.build
+ + " && $(MAKE) '" + self.source +
+ targ.target[10:] + "'")
+ else:
+ del self.targets[targ.target]
+ targ.target = lfs
+ self.addTarget(targ)
+
+ for bin in self.binaries.values():
+ bin.collect_final_dependencies()
+
+ deps = []
+ for file in ['Makefile.am'] + self.am_includes:
+ if file.startswith('$(top_srcdir)'):
+ deps.append(file)
+ else:
+ deps.append(self.source + file)
+
+ if not self.subdir == '.':
+ rule = "cd $(top_srcdir) && %s -c %s/Makefile" % (utilities.our_path, self.subdir)
+ self.insertTarget(self.source + "Makefile.in", deps, rule)
+
+ if 0: # TODO: check if still necessary
+ for target in self.targets.values():
+ if target.user_specified and target.has_rules() and not target.is_phony:
+ self.insertTarget('$(top_builddir)/%s/%s' %
+ (self.subdir, target.target),
+ target.target)
+
+ # for subdirs we need to generate aliases for ../<subdir>/target
+ targets = self.targets.values()
+ prefixlen = len(self.build)
+ for target in targets:
+ if target.target[:prefixlen] == self.build and not target.is_phony:
+ targ = target.target[prefixlen:]
+ # don't add rules in the opposite direction to the aliases above
+ rhs = '$(top_builddir)/%s/%s' % (self.subdir, targ)
+ if not self.is_target(rhs):
+ # note that we insert an empty command (the " ;" at the end
+ # of the prerequisites) to avoid invoking any default commands
+ self.insertTarget(targ, [rhs, ";"])
+ elif not self.target(rhs).has_rules():
+ self.insertTarget(targ, ";")
+
+ else:
+ deps.append(utilities.configure_in)
+ deps.append('$(top_srcdir)/aclocal.m4')
+ rule = "cd $(top_srcdir) && %s -c" % utilities.our_path
+ self.insertTarget(self.source + "Makefile.in", deps, rule)
+
+ def read_deps(self):
+ if not len(self.dep_files):
+ return
+
+ depdir_value = utilities.subst_vars["DEPDIR"]
+ depdir = utilities.top_builddir + "/" + self.subdir + "/" + depdir_value + "/"
+ if not os.path.exists(depdir):
+ try:
+ os.mkdir(depdir)
+ except OSError, e:
+ print "failed to create %s:" % depdir_value, e
+ return
+
+ for dep in self.dep_files:
+ dep = depdir + dep
+ if os.path.exists(dep):
+ dep_lines = open(dep).readlines()
+ if not len(dep_lines):
+ continue
+ if dep_lines[0].startswith("builddir="):
+ builddir = dep_lines[0][len("builddir="):-1]
+ topbuilddir = dep_lines[1][len("top_builddir="):-1]
+ otargetfile = targetfile = dep_lines[2][len("targetfile="):-1]
+ if targetfile.startswith('$(top_builddir)'):
+ targetfile = utilities.top_builddir + targetfile[len('$(top_builddir)'):]
+ if topbuilddir == ".":
+ tops = []
+ else:
+ tops = string.split(topbuilddir, '/')
+ builds = string.split(builddir, '/')
+ mytopbuilddir = dir = os.path.abspath(utilities.top_builddir) + "/"
+ for index in range(0, len(tops)):
+ dir += "/%s" % builds[index+1]
+ dir += '/'
+
+ target_deps = []
+ self.insertTarget(targetfile, [])
+ target = self.targets[targetfile]
+
+ for dep in dep_lines[3:]:
+
+ dep = dep[:-1]
+ if dep[0] != '/':
+ dep = os.path.normpath(dir + dep)
+
+ if dep.startswith(mytopbuilddir):
+ if find_missing_deps:
+ dep2 = '$(top_builddir)/' + dep[len(mytopbuilddir):]
+ found=False
+ for dep3 in self.targets[otargetfile].deps:
+ if type(dep3) == types.StringType:
+ print dep2, dep3
+ assert(False)
+ else:
+ if dep2 == dep3.value:
+ found=True
+
+ if not found:
+ if found_missing_deps.has_key(otargetfile):
+ found_missing_deps[otargetfile].append(dep2)
+ else:
+ found_missing_deps[otargetfile] = [dep2]
+
+ else:
+ dep = utilities.top_builddir + dep[len(mytopbuilddir) - 1:]
+
+ if find_missing_deps: continue
+
+ target.deps.append(dep)
+ maybe_missing[dep] = True
+
+ self.insertTarget(targetfile, target_deps)
+
+ else:
+ deps_am = AMFile(dep)
+ for targ in deps_am.targets.values():
+ targ.user_specified = 0
+ self.addTarget(targ)
+
+ self.dep_files = []
diff --git a/unsermake/assembler.um b/unsermake/assembler.um
new file mode 100644
index 0000000..7b66f12
--- /dev/null
+++ b/unsermake/assembler.um
@@ -0,0 +1,42 @@
+# -*-python-*-
+# vim: ts=4
+
+import handlerbase, program
+
+class AssemblerHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "assembler"
+
+ def extensions(self):
+ return ['.s', '.S']
+
+ def source(self, prog, base, ext):
+ line = 'set '
+ if prog.type == program.PTYPE_LTLIBRARY:
+ line = line + "$(LIBTOOL) --mode=compile "
+
+ line = line + "$(CCAS) "
+ line += prog.handle_variable("CCASFLAGS")
+ line = line + "-o $@ "
+
+ rulef = prog.mfile
+
+ dir = rulef.source
+ insource = 1
+ if rulef.target(rulef.build + base + ext) \
+ or rulef.target(base + ext):
+ dir = rulef.build
+
+ line = line + "-c %s ;\\" % (dir + base + ext)
+ lines = [ "@$(V_ECHO) 'assembling %s'; \\" % (rulef.source + base + ext), line, '$(V_EXEC)']
+ rulef.insertTarget(rulef.build + base + prog.objext,
+ rulef.source + base + ext,
+ lines)
+
+ prog.mfile.translate_target(base + ext)
+ prog.mfile.translate_target(base + prog.objext)
+
+ rulef.add_define(prog.canon_name + "_OBJECTS",
+ rulef.build + base + prog.objext)
+
+handlerbase.register_handler(AssemblerHandler())
diff --git a/unsermake/autodeps.um b/unsermake/autodeps.um
new file mode 100644
index 0000000..d4d94e6
--- /dev/null
+++ b/unsermake/autodeps.um
@@ -0,0 +1,142 @@
+# -*-python-*-
+# vim: ts=4
+
+import os, re, string
+import handlerbase, amfile, utilities
+
+class AutoDepsHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "autodeps"
+
+ def create_header_list(self, cpp2headers, list):
+ new_list = list[:]
+ changed = False
+ for file in list:
+ if cpp2headers.has_key(file):
+ match = cpp2headers[file]
+ for head in match:
+ if not head in new_list:
+ new_list.append(head)
+ changed = True
+ if not changed:
+ return list
+ return self.create_header_list(cpp2headers, new_list)
+
+ def create(self, makefile):
+
+ import utilities
+ cppfiles, hfiles = utilities.list_source_files(makefile.dirname)
+
+ # first we start collecting all includes
+ includere = re.compile("^[ \t]*#include\s*[<\"](.*?)[>\"]")
+ cpp2headers = {}
+ allheaders = {}
+
+ for (base, ext) in hfiles + cppfiles:
+ file = base + ext
+ cpp2headers[file] = []
+ lines = open(makefile.dirname + "/" + file).readlines()
+ for line in lines:
+ match = includere.match(line)
+ if match:
+ if match.group(1) == file:
+ print "%s: file %s seems to include itself. Ignoring." % \
+ (makefile.filename, file)
+ continue
+ header = match.group(1)
+ if not header in cpp2headers[file]:
+ cpp2headers[file].append(header)
+ allheaders[match.group(1)] = 1
+
+ for file in cpp2headers.keys():
+ if not allheaders.has_key(file): # assume leaf
+ files = self.create_header_list(cpp2headers, cpp2headers[file])
+ files.sort()
+ if file in files:
+ utilities.print_error("%s: file %s seems to include itself\n" % (makefile.filename, file))
+ for header in files:
+ if string.find(header, '/') == -1:
+ makefile.add_define("__um_includes_" + header, file)
+
+ return
+
+ def collect(self, makefile):
+ for targ in makefile.targets.values():
+ if targ.target.startswith(makefile.build):
+ target = targ.target[len(makefile.build):]
+ if makefile.is_defined("__um_includes_" + target):
+ files = makefile.definition("__um_includes_" + target)
+ for file in files:
+ match = utilities.extre.match(file)
+ if match:
+ base = match.group(1)
+ makefile.insertTarget(makefile.build + base + ".o",
+ makefile.build + target)
+ makefile.insertTarget(makefile.build + base + ".lo",
+ makefile.build + target)
+
+ # below is a quite radical idea of mine, but it's pretty complex to get right
+ def nonsense_fornow(self):
+ # first collect the include paths
+ flags = string.split(makefile.default_includes)
+ if makefile.is_defined("AM_CPPFLAGS"):
+ flags.extend(makefile.definition_rec("AM_CPPFLAGS"))
+ if makefile.is_defined("INCLUDES"):
+ flags.extend(makefile.definition_rec("INCLUDES"))
+
+ # in src/build dirs we collect include directories that are either
+ # below srcdir or below builddir
+ src_dirs = []
+ build_dirs = []
+
+ last_is_i = False
+ for flag in flags:
+ if last_is_i:
+ search_dirs.append(flag)
+ last_is_i = False
+ else:
+ if flag.startswith("-I"):
+ if flag == "-I":
+ last_is_i = True
+ else:
+ src = False
+ flag = flag[2:]
+ if flag.startswith("$(top_srcdir)"):
+ flag = flag[len("$(top_srcdir)"):]
+ src = True
+ elif flag.startswith("$(top_builddir)"):
+ flag = flag[len("$(top_builddir)"):]
+ else:
+ continue
+
+ if len(flag):
+ flag = os.path.normpath(flag)
+ if flag[0] == '/':
+ flag = flag[1:] + "/"
+
+ if src:
+ if not flag in src_dirs:
+ src_dirs.append(flag)
+ else:
+ if not flag in build_dirs:
+ build_dirs.append(flag)
+
+ header_paths = {}
+ for header in allheaders.keys():
+ path = None
+ for dir in src_dirs:
+ if os.path.exists(amfile._topsrcdir + dir + header):
+ path = dir + header
+ break
+ if path:
+ header_paths[header] = path
+
+ for file in cpp2headers.keys():
+ for header in cpp2headers[file]:
+ if header_paths.has_key(header):
+ makefile.insertTarget("$(srcdir)/" + file, "$(top_srcdir)/" + header_paths[header])
+ print makefile.dirname + "/" + file, header_paths[header]
+ else:
+ print makefile.subdir + "/" + file, "no path for", header
+
+
+handlerbase.register_handler(AutoDepsHandler())
diff --git a/unsermake/autodirs.um b/unsermake/autodirs.um
new file mode 100644
index 0000000..dca0a7d
--- /dev/null
+++ b/unsermake/autodirs.um
@@ -0,0 +1,29 @@
+# -*-python-*-
+# vim: ts=4
+
+import os
+import handlerbase, utilities
+
+class AutoDIRSHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "autodirs"
+
+ def create(self, amfile):
+ if not amfile.is_defined("SUBDIRS"):
+ return
+
+ subdirs = amfile.definition_rec("SUBDIRS")
+
+ if subdirs == ['$(AUTODIRS)']:
+ subdirs = []
+ for entry in os.listdir(amfile.dirname):
+ if entry == 'CVS' or entry[0] == '.':
+ continue
+ if not os.path.isdir(amfile.dirname + "/" + entry):
+ continue
+ if os.path.exists(amfile.dirname + "/" + entry + "/Makefile.am"):
+ subdirs.append(entry)
+
+ amfile.del_define("SUBDIRS")
+ amfile.add_define("SUBDIRS", subdirs)
+
+handlerbase.register_handler(AutoDIRSHandler())
diff --git a/unsermake/check.um b/unsermake/check.um
new file mode 100644
index 0000000..c5a2776
--- /dev/null
+++ b/unsermake/check.um
@@ -0,0 +1,104 @@
+# -*-python-*-
+# vim: ts=4
+
+import handlerbase
+
+class Test_Handler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "check"
+
+ def create(self, amfile):
+ if not amfile.is_defined("TESTS"):
+ return
+
+ l = ['@cd $(top_builddir)/$(subdir) ;\\',
+ 'failed=0; all=0; xfail=0; xpass=0; skip=0; \\',
+ 'srcdir=$(srcdir); export srcdir; \\',
+ 'list=\'$(TESTS)\'; \\',
+ 'if test -n "$$list"; then \\',
+ ' for tst in $$list; do \\',
+ ' if test -f ./$$tst; then dir=./; \\',
+ ## Note: Solaris 2.7 seems to expand TESTS using VPATH. That is
+ ## why we also try `dir='
+ ' elif test -f $$tst; then dir=; \\',
+ ' else dir="$(srcdir)/"; fi; \\',
+ ' if $(TESTS_ENVIRONMENT) $${dir}$$tst; then \\',
+ ## Success
+ ' all=`expr $$all + 1`; \\',
+ ' case " $(XFAIL_TESTS) " in \\',
+ ' *" $$tst "*) \\',
+ ' xpass=`expr $$xpass + 1`; \\',
+ ' failed=`expr $$failed + 1`; \\',
+ ' echo "XPASS: $$tst"; \\',
+ ' ;; \\',
+ ' *) \\',
+ ' echo "PASS: $$tst"; \\',
+ ' ;; \\',
+ ' esac; \\',
+ ' elif test $$? -ne 77; then \\',
+ ## Failure
+ ' all=`expr $$all + 1`; \\',
+ ' case " $(XFAIL_TESTS) " in \\',
+ ' *" $$tst "*) \\',
+ ' xfail=`expr $$xfail + 1`; \\',
+ ' echo "XFAIL: $$tst"; \\',
+ ' ;; \\',
+ ' *) \\',
+ ' failed=`expr $$failed + 1`; \\',
+ ' echo "FAIL: $$tst"; \\',
+ ' ;; \\',
+ ' esac; \\',
+ ' else \\',
+ ## Skipped
+ ' skip=`expr $$skip + 1`; \\',
+ ' echo "SKIP: $$tst"; \\',
+ ' fi; \\',
+ ' done; \\',
+ ## Prepare the banner
+ ' if test "$$failed" -eq 0; then \\',
+ ' if test "$$xfail" -eq 0; then \\',
+ ' banner="All $$all tests passed"; \\',
+ ' else \\',
+ ' banner="All $$all tests behaved as expected ($$xfail expected failures)"; \\',
+ ' fi; \\',
+ ' else \\',
+ ' if test "$$xpass" -eq 0; then \\',
+ ' banner="$$failed of $$all tests failed"; \\',
+ ' else \\',
+ ' banner="$$failed of $$all tests did not behave as expected ($$xpass unexpected passes)"; \\',
+ ' fi; \\',
+ ' fi; \\',
+ ## DASHES should contain the largest line of the banner.
+ ' dashes="$$banner"; \\',
+ ' skipped=""; \\',
+ ' if test "$$skip" -ne 0; then \\',
+ ' skipped="($$skip tests were not run)"; \\',
+ ' test `echo "$$skipped" | wc -c` -gt `echo "$$banner" | wc -c` && \\',
+ ' dashes="$$skipped"; \\',
+ ' fi; \\',
+ ' report=""; \\',
+ ' if test "$$failed" -ne 0 && test -n "$(PACKAGE_BUGREPORT)"; then \\',
+ ' report="Please report to $(PACKAGE_BUGREPORT)"; \\',
+ ' test `echo "$$report" | wc -c` -gt `echo "$$banner" | wc -c` && \\',
+ ' dashes="$$report"; \\',
+ ' fi; \\',
+ ' dashes=`echo "$$dashes" | sed s/./=/g`; \\',
+ ' echo "$$dashes"; \\',
+ ' echo "$$banner"; \\',
+ ' test -n "$$skipped" && echo "$$skipped"; \\',
+ ' test -n "$$report" && echo "$$report"; \\',
+ ' echo "$$dashes"; \\',
+ ' test "$$failed" -eq 0; \\',
+ 'else :; fi']
+
+ deps = amfile.replace_builddir(amfile.definition_rec("TESTS"))
+ amfile.insertTarget("check-TESTS-" + amfile.canon_subdir, deps, l, phony=1)
+
+ def collect(self, amfile):
+ if amfile.is_defined("TESTS"):
+ subcheck = "check-TESTS-" + amfile.canon_subdir
+ amfile.insertTarget("check", [ subcheck ], phony=1)
+ if amfile.is_target("check-am"):
+ amfile.insertTarget(subcheck, ["check-am", "all"], phony=1)
+
+handlerbase.register_handler(Test_Handler())
diff --git a/unsermake/checkrcs.um b/unsermake/checkrcs.um
new file mode 100644
index 0000000..a4b1389
--- /dev/null
+++ b/unsermake/checkrcs.um
@@ -0,0 +1,40 @@
+# -*-python-*-
+# vim: ts=4
+# Copyright (C) 2003 Simon Hausmann <hausmann@kde.org>
+# Licensed under whatever license unsermake is available
+
+import handlerbase
+
+class RCHandler( handlerbase.HandlerBase ):
+ def __init__( self ):
+ self.name = "checkrcs"
+
+ def collect( self, amfile ):
+
+ rcfiles = ''
+
+ for define in amfile.defines():
+ if define.endswith( '_DATA' ):
+
+ for file in amfile.definition_rec( define ):
+ if file.endswith( '.rc' ):
+ rcfiles = rcfiles + file + ' '
+
+ checks = [ '@if test -n "$(XMLLINT)"; then \\',
+ ' for rcf in %s; do \\' % rcfiles,
+ ' rc=%s$$rcf; \\' % amfile.source,
+ ' if head -n 1 $$rc | egrep -q "DOCTYPE|xml"; then \\',
+ ' echo checking $$rc; \\',
+ ' if ! $(XMLLINT) --noout $$rc; then \\',
+ ' exit 1 ;\\',
+ ' fi \\',
+ ' fi \\',
+ ' done \\',
+ 'fi' ]
+
+ if len( rcfiles ):
+ checkTargetName = 'install-check-rcfiles-' + amfile.canon_subdir
+ amfile.insertTarget( checkTargetName, '', checks, phony=1 )
+ amfile.insertTarget( 'install-data', checkTargetName, phony=1 )
+
+handlerbase.register_handler( RCHandler() )
diff --git a/unsermake/dcopidl.um b/unsermake/dcopidl.um
new file mode 100644
index 0000000..9bc6146
--- /dev/null
+++ b/unsermake/dcopidl.um
@@ -0,0 +1,80 @@
+# -*-python-*-
+# vim: ts=4
+
+import re
+import handlerbase, utilities
+
+class IDLHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "dcopidl"
+
+ def requirements(self):
+ return ([], ["autodeps"])
+
+ def extensions(self):
+ return ['.stub', '.skel']
+
+ def source(self, program, base, ext):
+ sourcename = base + "_" + ext[1:]
+
+ rulef = program.mfile
+
+ sourcedir = ''
+ if program.mfile.in_srcdir( base + ".h" ):
+ sourcedir = rulef.source
+ else:
+ value = program.mfile.value_of(base + "_DIR")
+ if value:
+ sourcedir = re.sub('^\$\(srcdir\)', rulef.source, value)
+ if not sourcedir[-1] == '/':
+ sourcedir = sourcedir + '/'
+ else:
+ sourcedir = rulef.build
+
+ if program.mfile.value_of(base + "_DCOPIDLNG"):
+ dcopidl = "KDECONFIG=\"$(KDECONFIG)\" $(DCOPIDLNG)"
+ else:
+ dcopidl = "$(DCOPIDL)"
+
+ rulef.insertTarget(rulef.build + base + ".kidl",
+ [sourcedir + base + ".h",
+ "$(DCOP_DEPENDENCIES)"],
+ ["@echo '%s %s.kidl%s'" % (utilities.creating_text, rulef.build + base, utilities.normal),
+ "@adir=`cd %s && pwd` ;\\" % sourcedir,
+ dcopidl + " $$adir/" + base + ".h > " + rulef.build + base + ".kidl "
+ "|| ( rm -f " + rulef.build + base + ".kidl ;"
+ " false )"])
+
+ if ext == '.skel':
+ rulef.insertTarget(rulef.build + sourcename + ".cpp", [rulef.build + base + ".kidl",
+ "$(DCOPIDL2CPP)"],
+ ["@echo '%s %s.cpp%s'" % (utilities.creating_text, rulef.build + sourcename, utilities.normal),
+ "@$(DCOPIDL2CPP) --c++-suffix cpp "
+ "--no-stub %s" % rulef.build + base + ".kidl"])
+ else: #elif ext == '.stub':
+ rulef.insertTarget(rulef.build + sourcename + ".cpp",
+ rulef.build + sourcename + ".h")
+ rulef.insertTarget(rulef.build + sourcename + ".h", [rulef.build + base + ".kidl",
+ "$(DCOPIDL2CPP)"],
+ ["@echo '%s %s.h%s'" % (utilities.creating_text, rulef.build + sourcename, utilities.normal),
+ "@$(DCOPIDL2CPP) --c++-suffix cpp "
+ "--no-skel %s" % rulef.build + base + ".kidl"])
+ rulef.insertTarget(rulef.build + sourcename + ".o",
+ rulef.build + sourcename + ".h")
+ rulef.insertTarget(rulef.build + sourcename + ".lo",
+ rulef.build + sourcename + ".h")
+
+ #else:
+ # rulef.insertTarget(rulef.build + sourcename + ".cpp",
+ # rulef.build + sourcename + ".h")
+ # program.mfile.insertTarget(rulef.build + sourcename + ".h", [rulef.build + base + ".kidl",
+ # "$(DCOPIDL2CPP)"],
+ # ["$(DCOPIDL2CPP) --c++-suffix cpp "
+ # "--no-stub --no-skel %s" % rulef.build + base + ".kidl"])
+
+ program.handle_source(sourcename, ".cpp")
+
+ program.cleanfiles.extend([sourcename + ".cpp",
+ sourcename + ".h",
+ base + ".kidl"])
+
+handlerbase.register_handler(IDLHandler())
diff --git a/unsermake/designer.um b/unsermake/designer.um
new file mode 100644
index 0000000..a8f032c
--- /dev/null
+++ b/unsermake/designer.um
@@ -0,0 +1,182 @@
+# -*-python-*-
+# vim: ts=4
+
+import handlerbase,utilities
+
+class DesignerHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "designer"
+
+ def requirements(self):
+ return ([], ["autodeps", "mocs"])
+
+ def extensions(self):
+ return ['.ui','.ui3', '.ui4']
+
+ def source(self, program, base, ext):
+
+ rulef = program.mfile
+
+ sourcedir = ''
+ if program.mfile.in_srcdir(base + ext):
+ sourcedir = rulef.source
+ else:
+ sourcedir = rulef.build
+
+ ui_files = []
+ if rulef.is_defined("UNSERMAKE_UIFILES"):
+ ui_files = rulef.definition("UNSERMAKE_UIFILES")
+
+ uifile_version = 0
+
+ if ext == '.ui':
+ if base in rulef.definition("QT3_UIFILES"):
+ uifile_version = 3
+ elif base in rulef.definition("QT4_UIFILES"):
+ uifile_version = 4
+ else:
+ # If UIC3_PATH is undefined we're using KDE 3, and we can assume a qt3-format .ui file.
+ if not utilities.subst_vars.has_key("UIC3_PATH"):
+ uifile_version = 3
+ else:
+ utilities.print_error( "%s: Unknown UIC version for %s. Rerun make -f Makefile.cvs.\n"
+ % (rulef.filename, rulef.dirname + "/" + base + ext) )
+ else:
+ if ext == '.ui3':
+ uifile_version = 3
+ else:
+ if ext == '.ui4':
+ uifile_version = 4
+ else: # can't happen
+ utilities.print_error("%s: Unknown extension, can't happen\n" % (rulef.filename))
+ return
+
+ if uifile_version == 3:
+ # If UIC3_PATH is undefined we're using KDE 3
+ if utilities.subst_vars.has_key("UIC3_PATH"):
+ uic_version = "$(UIC3_PATH)"
+ else:
+ uic_version = "$(UIC)"
+
+ outputfile = rulef.build + base + ".cpp"
+ else:
+ if not utilities.subst_vars.has_key("UIC3_PATH"):
+ utilities.print_error( "%s: You cannot use Qt 4 .ui-files with Qt/KDE 3! Please use a Qt 4-based build system.\n"
+ % (rulef.filename) )
+ uic_version = "$(UIC)"
+ outputfile = rulef.build + "ui_" + base + ".h"
+
+ rules = ["@echo '%s %s%s'" % (utilities.creating_text, outputfile, utilities.normal)]
+
+ if not base in ui_files:
+ if not program.mfile.get_opt("qtonly"):
+ rules.append("@echo '#include <kdialog.h>' > %s" % (outputfile))
+ rules.append("@echo '#include <klocale.h>' >> %s" % (outputfile))
+ canon_source = utilities.canon_name(base)
+ rules.append("@adir=`cd %s && pwd` ;\\" % sourcedir)
+
+ if uifile_version == 3:
+ rules.append( "%s -tr ${UIC_TR} -i %s.h $$adir/%s%s > %s.temp ; ret=$$?; \\"
+ % (uic_version, base, base, ext, outputfile) )
+ else:
+ rules.append( "%s -tr ${UIC_TR} $$adir/%s%s > %s.temp ; ret=$$?; \\"
+ % (uic_version, base, ext, outputfile) )
+
+ line = 'sed -e "s,${UIC_TR}( \"\" ),QString::null,g" %s.temp ' % (outputfile)
+ line = line + '| sed -e "s,image\([0-9][0-9]*\)_data,img\\1_%s,g" ' % canon_source
+ line = line + '>> %s ;\\' % (outputfile)
+ rules.append( line )
+
+ if uifile_version == 3:
+ rules.append('rm -f %s.temp ;\\' % (outputfile))
+ else:
+ rules.append('rm -f %s.temp' % (outputfile))
+ else:
+ if uifile_version == 3:
+ rules.append( "@%s -i %s.h %s%s > %s ; ret=$$?; \\"
+ % (uic_version, base, sourcedir + base, ext, outputfile) )
+ else:
+ rules.append( "@%s %s%s > %s ; ret=$$?; \\"
+ % (uic_version, base, ext, outputfile) )
+
+ if uifile_version == 3:
+ rules.append( "if test \"$$ret\" = 0; then echo '#include \"%s.moc\"' >> %s; else rm -f %s; exit $$ret; fi"
+ % (base, outputfile, outputfile) )
+
+ rulef.insertTarget(outputfile,
+ [sourcedir + base + ext, rulef.build + base + ".h"],
+ rules)
+
+ # If UIC3_PATH is undefined we're using KDE 3
+ if utilities.subst_vars.has_key("UIC3_PATH") and not program.mfile.get_opt("qtonly"):
+ # In KDE4, it is the .h file which has the translatable strings
+ rulef.insertTarget(rulef.build + base + ".h", [sourcedir + base + ext],
+ ["@echo '%s %s.h%s'" % (utilities.creating_text, rulef.build + base, utilities.normal),
+ "@%s -tr ${UIC_TR} -o %s.temp $?" % (uic_version, base),
+ "@echo '#include <klocale.h>' > $@",
+ "@sed -e 's/${UIC_TR}(\"\")/QString()/g;s/${UIC_TR}(\"\", 0)/QString()/g' %s.temp >> $@" % (base),
+ "@rm -f %s.temp" % (base) ])
+ else:
+ # In KDE3, the .h does not have any translatable strings
+ # And for pure Qt3/Qt4, we do not need any modification
+ rulef.insertTarget(rulef.build + base + ".h", [sourcedir + base + ext],
+ ["@echo '%s %s.h%s'" % (utilities.creating_text, rulef.build + base, utilities.normal),
+ "@%s -o $@ $?" % (uic_version)])
+
+
+ rulef.insertTarget(rulef.build + base + program.objext,
+ [rulef.build + base + ".h", rulef.build + base + ".moc"])
+ else:
+ rulef.insertTarget(outputfile, [sourcedir + base + ".ui"],
+ rules)
+
+ rulef.add_define("UNSERMAKE_UIFILES", base)
+ if uifile_version == 3:
+ program.cleanfiles.extend([base + ".h", base + ".cpp"])
+ else:
+ program.cleanfiles.append( "ui_" + base + ".h")
+
+ if uifile_version == 3:
+ program.handle_source(base, ".cpp")
+
+ def create_source(self, program, base, ext):
+ # Detect the file type if the extension is plain '.ui'
+ rulef = program.mfile
+ ui_path = rulef.dirname + "/" + base + ext
+ if ext == '.ui':
+ try:
+ fileToSearch = open( ui_path, 'r' )
+ except IOError:
+ utilities.print_error("%s: Unable to open %s\n" % (rulef.filename, ui_path))
+ return
+
+ searchterm = "<(?:UI|ui) version=\"(\\d+)\\.(\\d+)"
+ patternprog = re.compile( searchterm )
+
+ major_version = '3'
+ minor_version = '0'
+ for line in fileToSearch.readlines():
+ a_match = patternprog.search( line )
+ if ( a_match ):
+ major_version = a_match.group( 1 )
+ minor_version = a_match.group( 2 )
+ break
+
+ if major_version == '3':
+ rulef.add_define("QT3_UIFILES", base)
+ ext = '.ui3'
+ elif major_version == '4':
+ rulef.add_define("QT4_UIFILES", base)
+ ext = '.ui4'
+ else:
+ utilities.print_error("%s: Unable to detect the required UIC version for %s\n" % (rulef.filename, ui_path))
+ return
+
+
+ # Add rules for Qt 3 .ui files
+ if ext == '.ui3':
+ value = base + "=" + base + ".h=" + base + ".cpp"
+ if not value in rulef.definition("METASOURCES"):
+ rulef.add_define("METASOURCES", value)
+
+handlerbase.register_handler(DesignerHandler())
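The version probe in create_source above boils down to a single regex over the .ui file. A standalone sketch of that logic (the helper name and sample inputs are ours, not part of the commit):

```python
import re

# Same pattern the handler compiles to find the <ui version="X.Y"> tag.
UI_VERSION_RX = re.compile(r'<(?:UI|ui) version="(\d+)\.(\d+)')

def detect_ui_version(lines, default=('3', '0')):
    """Return (major, minor) from the first matching line, else the default
    (Qt 3 Designer files usually carry no version attribute)."""
    for line in lines:
        m = UI_VERSION_RX.search(line)
        if m:
            return m.group(1), m.group(2)
    return default

print(detect_ui_version(['<ui version="4.0" >']))   # ('4', '0')
print(detect_ui_version(['<!DOCTYPE UI><UI>']))     # ('3', '0')
```

Falling back to version 3 mirrors the handler's defaults of major_version = '3', minor_version = '0'.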
diff --git a/unsermake/destdir.um b/unsermake/destdir.um
new file mode 100644
index 0000000..585b511
--- /dev/null
+++ b/unsermake/destdir.um
@@ -0,0 +1,35 @@
+# -*-python-*-
+# vim: ts=4
+
+import handlerbase
+
+class DESTDIRHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "destdir"
+
+ def handle_DESTDIR(self, filename, target):
+ rules = []
+ changed=0
+ for rule in target._rules:
+ if re.search('^\s*(@.*@)*\s*\$\(mkinstalldirs\)', rule) != None \
+ or re.search('^\s*(@.*@)*\s*\$\(INSTALL\S*\)', rule) != None \
+ or re.search('^\s*(@.*@)*\s*-?rm.* \S*$', rule) != None:
+ if not re.search('\$\(DESTDIR\)', rule):
+ if not changed:
+ sys.stderr.write('%s: install/rm commands without DESTDIR in target %s\n' % (filename, target.target))
+
+ rule = re.sub(r"^(.*) ([^\s]+)\s*$", r"\1 $(DESTDIR)\2", rule)
+ changed=1
+ rules.append(rule)
+
+ if changed:
+ target._rules = rules
+
+ def parse(self, amfile):
+
+ for t in ['install-exec-local' , 'install-data-local', 'uninstall-local']:
+ tar = amfile.target(t)
+ if tar:
+ self.handle_DESTDIR(amfile.filename, tar)
+
+handlerbase.register_handler(DESTDIRHandler())
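The substitution handle_DESTDIR applies can be tried in isolation; this sketch reuses the same regexes (the wrapper function name is ours):

```python
import re

def add_destdir(rule):
    """Prefix the last word of an install/rm command with $(DESTDIR),
    mirroring the re.sub in handle_DESTDIR above."""
    if re.search(r'\$\(DESTDIR\)', rule):
        return rule  # already safe for staged installs
    return re.sub(r"^(.*) ([^\s]+)\s*$", r"\1 $(DESTDIR)\2", rule)

print(add_destdir("$(INSTALL_DATA) foo.desktop $(kde_appsdir)/foo.desktop"))
# $(INSTALL_DATA) foo.desktop $(DESTDIR)$(kde_appsdir)/foo.desktop
```

The greedy `(.*)` ensures only the final argument (the install destination) is rewritten.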
diff --git a/unsermake/dist.um b/unsermake/dist.um
new file mode 100644
index 0000000..ba96dd0
--- /dev/null
+++ b/unsermake/dist.um
@@ -0,0 +1,50 @@
+# -*-python-*-
+# vim: ts=4
+
+class DISTHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "dist"
+
+ def tag_DIST(self):
+ foundfiles = {}
+ for file in os.listdir(self.dirname):
+ if file == 'CVS' or file[0] == '.' or file == 'Makefile' or file[-1] == '~' or (file[0] == '#' and file[-1] == '#'):
+ continue
+ if not os.path.isfile(self.dirname + "/" + file):
+ continue
+ if file.endswith('.moc') or file.endswith('.moc.cpp') or file.endswith('.lo') or file.endswith('.la') or file.endswith('.o'):
+ continue
+ if file.endswith('.all_cpp.cpp'):
+ continue
+ foundfiles[file] = 1
+
+ # doing this for MAINTAINERCLEANFILES would be wrong
+ for mark in ["EXTRA_DIST", "DIST_COMMON", '_SOURCES', '_HEADERS', 'CLEANFILES', 'DISTCLEANFILES', '_OBJECTS']:
+ res = []
+ if mark[0] == '_':
+ for define in self.defines():
+ if define.endswith(mark):
+ res.extend(self.definition_rec(define))
+ else:
+ res = self.value_list(mark)
+ for file in res:
+ # no need to put 0s where no 1s were
+ if foundfiles.has_key(file):
+ foundfiles[file] = 0
+
+ dead_files = ["Makefile", "config.cache", "config.log", "stamp-h",
+ "stamp-h1", "config.h", "Makefile",
+ "config.status", "config.h", "libtool", "core"]
+ for file in dead_files:
+ if foundfiles.has_key(file):
+ foundfiles[file] = 0
+
+ KDE_DIST = []
+ for file in foundfiles.keys():
+ if foundfiles[file] == 1:
+ KDE_DIST.append(file)
+
+ if len(KDE_DIST):
+ print "%s: KDE_DIST" % self.filename, KDE_DIST
+
+# TODO register_handler(DISTHandler())
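The bookkeeping in tag_DIST is a mark-and-sweep over the directory listing; a minimal sketch with made-up file names:

```python
# Mark every file found on disk with 1, reset to 0 anything a Makefile.am
# variable already accounts for; whatever stays 1 belongs in KDE_DIST.
found = {"foo.cpp": 1, "foo.h": 1, "README.extra": 1, "Makefile": 1}

accounted = ["foo.cpp", "foo.h"]          # from _SOURCES, EXTRA_DIST, ...
dead_files = ["Makefile", "config.log"]   # build products, never distributed

for name in accounted + dead_files:
    if name in found:
        found[name] = 0

kde_dist = sorted(f for f, flag in found.items() if flag == 1)
print(kde_dist)  # ['README.extra']
```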
diff --git a/unsermake/doc/auug97.pdf b/unsermake/doc/auug97.pdf
new file mode 100644
index 0000000..6ea2331
Binary files /dev/null and b/unsermake/doc/auug97.pdf differ
diff --git a/unsermake/doc/example.obj b/unsermake/doc/example.obj
new file mode 100644
index 0000000..9e5e2c4
--- /dev/null
+++ b/unsermake/doc/example.obj
@@ -0,0 +1,239 @@
+%TGIF 4.1.42-QPL
+state(0,37,100.000,0,54,0,16,0,9,1,0,0,0,1,0,1,0,'Courier',0,80640,0,0,0,10,0,0,1,1,0,16,0,0,1,1,1,1,1088,1408,1,0,2880,0).
+%
+% @(#)$Header$
+% %W%
+%
+unit("1 pixel/pixel").
+color_info(11,65535,0,[
+ "magenta", 65535, 0, 65535, 65535, 0, 65535, 1,
+ "red", 65535, 0, 0, 65535, 0, 0, 1,
+ "green", 0, 65535, 0, 0, 65535, 0, 1,
+ "blue", 0, 0, 65535, 0, 0, 65535, 1,
+ "yellow", 65535, 65535, 0, 65535, 65535, 0, 1,
+ "pink", 65535, 49931, 50971, 65535, 49344, 52171, 1,
+ "cyan", 0, 65535, 65535, 0, 65535, 65535, 1,
+ "CadetBlue", 22885, 40569, 40569, 24415, 40606, 41120, 1,
+ "white", 65535, 65535, 65535, 65535, 65535, 65535, 1,
+ "black", 0, 0, 0, 0, 0, 0, 1,
+ "DarkSlateGray", 10402, 19764, 18724, 12079, 20303, 20303, 1
+]).
+script_frac("0.6").
+fg_bg_colors('black','white').
+dont_reencode("FFDingbests:ZapfDingbats").
+page(1,"",1,'').
+text('black',64,51,12,0,1,261,192,0,13,3,0,0,0,0,2,261,192,0,0,"",0,0,0,0,64,'',[
+minilines(261,192,0,0,0,0,0,[
+mini_line(144,13,3,0,0,0,[
+str_block(0,144,13,3,0,-4,0,0,0,[
+str_seg('black','Courier',0,80640,144,13,3,0,-4,0,0,0,0,0,
+ "lib/Makefile.am:")])
+]),
+mini_line(225,13,3,0,0,0,[
+str_block(0,225,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,225,13,3,0,-1,0,0,0,0,0,
+ " lib_LIBRARY = libfoo.a")])
+]),
+mini_line(261,13,3,0,0,0,[
+str_block(0,261,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,261,13,3,0,-1,0,0,0,0,0,
+ " libfoo_a_SOURCES = foo.cpp")])
+]),
+mini_line(216,13,3,0,0,0,[
+str_block(0,216,13,3,0,-2,0,0,0,[
+str_seg('black','Courier',0,80640,216,13,3,0,-2,0,0,0,0,0,
+ " foo.cpp foo.h: foo.ui")])
+]),
+mini_line(216,13,3,0,0,0,[
+str_block(0,216,13,3,0,-4,0,0,0,[
+str_seg('black','Courier',0,80640,216,13,3,0,-4,0,0,0,0,0,
+ " $(UIC) foo.ui ...")])
+]),
+mini_line(18,13,3,0,0,0,[
+str_block(0,18,13,3,0,-8,0,0,0,[
+str_seg('black','Courier',0,80640,18,13,3,0,-8,0,0,0,0,0,
+ " ")])
+]),
+mini_line(108,13,3,0,0,0,[
+str_block(0,108,13,3,0,-4,0,0,0,[
+str_seg('black','Courier',0,80640,108,13,3,0,-4,0,0,0,0,0,
+ "Makefile.am:")])
+]),
+mini_line(144,13,3,0,0,0,[
+str_block(0,144,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,144,13,3,0,-1,0,0,0,0,0,
+ " SUBDIRS = lib")])
+]),
+mini_line(189,13,3,0,0,0,[
+str_block(0,189,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,189,13,3,0,-1,0,0,0,0,0,
+ " bin_PROGRAMS = bar")])
+]),
+mini_line(216,13,3,0,0,0,[
+str_block(0,216,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,216,13,3,0,-1,0,0,0,0,0,
+ " bar_SOURCES = bar.cpp")])
+]),
+mini_line(243,13,3,0,0,0,[
+str_block(0,243,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,243,13,3,0,-1,0,0,0,0,0,
+ " bar_LDADD = lib/libfoo.a")])
+]),
+mini_line(0,13,3,0,0,0,[
+str_block(0,0,13,3,0,0,0,0,0,[
+str_seg('black','Courier',0,80640,0,13,3,0,0,0,0,0,0,0,
+ "")])
+])
+])]).
+box('black','',48,48,336,240,0,1,1,4,0,0,0,0,0,'1',0,[
+]).
+oval('black','',52,431,148,463,0,1,1,20,0,0,0,0,0,'1',0,[
+]).
+oval('black','',95,370,191,402,0,1,1,21,0,0,0,0,0,'1',0,[
+]).
+text('black',67,439,1,0,1,63,16,22,13,3,0,0,0,0,2,63,16,0,0,"",0,0,0,0,452,'',[
+minilines(63,16,0,0,0,0,0,[
+mini_line(63,13,3,0,0,0,[
+str_block(0,63,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,63,13,3,0,-1,0,0,0,0,0,
+ "foo.cpp")])
+])
+])]).
+text('black',121,378,1,0,1,45,16,27,13,3,0,0,0,0,2,45,16,0,0,"",0,0,0,0,391,'',[
+minilines(45,16,0,0,0,0,0,[
+mini_line(45,13,3,0,0,0,[
+str_block(0,45,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,45,13,3,0,-1,0,0,0,0,0,
+ "foo.o")])
+])
+])]).
+oval('black','',180,431,276,463,0,1,1,37,0,0,0,0,0,'1',0,[
+]).
+text('black',195,439,1,0,1,45,16,38,13,3,0,0,0,0,2,45,16,0,0,"",0,0,0,0,452,'',[
+minilines(45,16,0,0,0,0,0,[
+mini_line(45,13,3,0,0,0,[
+str_block(0,45,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,45,13,3,0,-1,0,0,0,0,0,
+ "foo.h")])
+])
+])]).
+oval('black','',120,490,216,522,0,1,1,39,0,0,0,0,0,'1',0,[
+]).
+text('black',135,498,1,0,1,54,16,40,13,3,0,0,0,0,2,54,16,0,0,"",0,0,0,0,511,'',[
+minilines(54,16,0,0,0,0,0,[
+mini_line(54,13,3,0,0,0,[
+str_block(0,54,13,3,0,-2,0,0,0,[
+str_seg('black','Courier',0,80640,54,13,3,0,-2,0,0,0,0,0,
+ "foo.ui")])
+])
+])]).
+oval('black','',93,312,189,344,0,1,1,44,0,0,0,0,0,'1',0,[
+]).
+text('black',106,320,1,0,1,72,16,45,13,3,0,0,0,0,2,72,16,0,0,"",0,0,0,0,333,'',[
+minilines(72,16,0,0,0,0,0,[
+mini_line(72,13,3,0,0,0,[
+str_block(0,72,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,72,13,3,0,-1,0,0,0,0,0,
+ "libfoo.a")])
+])
+])]).
+poly('black','',2,[
+ 142,370,142,344],1,1,1,59,0,0,0,0,0,0,0,'1',0,0,
+ "0","",[
+ 0,8,3,0,'8','3','0'],[0,8,3,0,'8','3','0'],[
+]).
+poly('black','',2,[
+ 101,431,128,401],1,1,1,60,0,0,0,0,0,0,0,'1',0,0,
+ "0","",[
+ 0,8,3,0,'8','3','0'],[0,8,3,0,'8','3','0'],[
+]).
+poly('black','',2,[
+ 216,431,173,399],1,1,1,61,0,0,0,0,0,0,0,'1',0,0,
+ "0","",[
+ 0,8,3,0,'8','3','0'],[0,8,3,0,'8','3','0'],[
+]).
+poly('black','',2,[
+ 175,490,219,463],1,1,1,62,0,0,0,0,0,0,0,'1',0,0,
+ "0","",[
+ 0,8,3,0,'8','3','0'],[0,8,3,0,'8','3','0'],[
+]).
+poly('black','',2,[
+ 146,491,112,463],1,1,1,63,0,0,0,0,0,0,0,'1',0,0,
+ "0","",[
+ 0,8,3,0,'8','3','0'],[0,8,3,0,'8','3','0'],[
+]).
+text('black',46,279,1,0,1,72,16,64,13,3,0,0,0,0,2,72,16,0,0,"",0,0,0,0,292,'',[
+minilines(72,16,0,0,0,0,0,[
+mini_line(72,13,3,0,0,0,[
+str_block(0,72,13,3,0,-4,0,0,0,[
+str_seg('black','Courier',0,80640,72,13,3,0,-4,0,0,0,0,0,
+ "lib DAG:")])
+])
+])]).
+box('black','',20,257,294,545,0,1,1,66,0,0,0,0,0,'1',0,[
+]).
+oval('black','',424,318,520,350,0,1,1,67,0,0,0,0,0,'1',0,[
+]).
+text('black',450,326,1,0,1,27,16,68,13,3,0,0,0,0,2,27,16,0,0,"",0,0,0,0,339,'',[
+minilines(27,16,0,0,0,0,0,[
+mini_line(27,13,3,0,0,0,[
+str_block(0,27,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,27,13,3,0,-1,0,0,0,0,0,
+ "bar")])
+])
+])]).
+oval('black','',357,387,453,419,0,1,1,69,0,0,0,0,0,'1',0,[
+]).
+text('black',371,395,1,0,1,72,16,70,13,3,0,0,0,0,2,72,16,0,0,"",0,0,0,0,408,'',[
+minilines(72,16,0,0,0,0,0,[
+mini_line(72,13,3,0,0,0,[
+str_block(0,72,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,72,13,3,0,-1,0,0,0,0,0,
+ "libfoo.a")])
+])
+])]).
+oval('black','',484,391,580,423,0,1,1,73,0,0,0,0,0,'1',0,[
+]).
+text('black',510,399,1,0,1,45,16,74,13,3,0,0,0,0,2,45,16,0,0,"",0,0,0,0,412,'',[
+minilines(45,16,0,0,0,0,0,[
+mini_line(45,13,3,0,0,0,[
+str_block(0,45,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,45,13,3,0,-1,0,0,0,0,0,
+ "bar.o")])
+])
+])]).
+oval('black','',427,452,523,484,0,1,1,75,0,0,0,0,0,'1',0,[
+]).
+text('black',447,461,1,0,1,63,16,76,13,3,0,0,0,0,2,63,16,0,0,"",0,0,0,0,474,'',[
+minilines(63,16,0,0,0,0,0,[
+mini_line(63,13,3,0,0,0,[
+str_block(0,63,13,3,0,-1,0,0,0,[
+str_seg('black','Courier',0,80640,63,13,3,0,-1,0,0,0,0,0,
+ "bar.cpp")])
+])
+])]).
+poly('black','',2,[
+ 483,452,522,423],1,1,1,90,0,0,0,0,0,0,0,'1',0,0,
+ "0","",[
+ 0,8,3,0,'8','3','0'],[0,8,3,0,'8','3','0'],[
+]).
+poly('black','',2,[
+ 531,391,487,349],1,1,1,91,0,0,0,0,0,0,0,'1',0,0,
+ "0","",[
+ 0,8,3,0,'8','3','0'],[0,8,3,0,'8','3','0'],[
+]).
+poly('black','',2,[
+ 414,386,455,349],1,1,1,92,0,0,0,0,0,0,0,'1',0,0,
+ "0","",[
+ 0,8,3,0,'8','3','0'],[0,8,3,0,'8','3','0'],[
+]).
+text('black',372,285,1,0,1,72,16,93,13,3,0,0,0,0,2,72,16,0,0,"",0,0,0,0,298,'',[
+minilines(72,16,0,0,0,0,0,[
+mini_line(72,13,3,0,0,0,[
+str_block(0,72,13,3,0,-4,0,0,0,[
+str_seg('black','Courier',0,80640,72,13,3,0,-4,0,0,0,0,0,
+ "bar DAG:")])
+])
+])]).
+box('black','',336,278,599,508,0,1,1,95,0,0,0,0,0,'1',0,[
+]).
diff --git a/unsermake/doc/unsermake-talk.sxi b/unsermake/doc/unsermake-talk.sxi
new file mode 100644
index 0000000..77e54e9
Binary files /dev/null and b/unsermake/doc/unsermake-talk.sxi differ
diff --git a/unsermake/doc/unsermake-talk.txt b/unsermake/doc/unsermake-talk.txt
new file mode 100644
index 0000000..6f54962
--- /dev/null
+++ b/unsermake/doc/unsermake-talk.txt
@@ -0,0 +1,144 @@
+# NOTE: I extended it in the sxi
+make:
+ - rules and dependencies
+ - purely text and file (time stamp) based
+ - commands are shell scripts
+ - macro language built in (mostly unused within KDE)
+
+projects:
+ nested directory structure
+ - some directories have icons, some have docs,
+ some only have some READMEs and are there
+ to give more structure
+ - most contain code to build binaries, libraries, usually only very few
+ targets per directory (one lib, one exe, very seldom more than that)
+
+automake:
+ - has strictly nothing to do with autoconf, but relies
+ on it. Because of this, many confuse them
+ - every directory contains a Makefile.am file that describes
+ in a syntax related to make's what should be built, installed,
+ distributed (only used wrongly in KDE :) for this directory only.
+ This all by using only simple variable assignments.
+ - based on purely recursive make (SUBDIRS = ), even though later
+ versions give basic support to build targets (binaries, libraries)
+ from subdirs' object files
+ - to new users it became attractive as it made deploying GNU standards
+ more comfortable and people love ./configure && make && sudo make install
+ - it even became popular to convert older projects to automake because it
+ deployed gcc's dependency tracking transparently. Most developers
+ rely on this feature without even being aware it's there.
+
+recursion:
+ - automake makes recursive make a natural thing even though it's not.
+ But as more and more free software developers start their make
+ "career" with automake, they often have no idea what they're doing
+ and get their Makefile.am written by copy & paste or through kdevelop
+ (which I still consider a good thing - if better code gets written because
+ of this)
+ - because you describe the target and its dependencies on a per directory
+ base and most often a directory only has a limited set of targets, the
+ Makefile.am files are very easy to understand and write. This is a definite
+ advantage as pure make is just too limited.
+ - recursion in itself is bad though as it adds a really huge overhead to a
+ rather basic problem: give structure to your project
+ - even though only two icons changed, a make install call can take forever
+ as it evaluates tons of Makefiles that describe targets and dependencies
+ for completely unrelated files.
+ - inter-target dependencies can only be expressed through the order the subdirs
+ are listed in SUBDIRS
+ - parallel make can only be taken advantage of in a single directory. So if you
+ want to have the power of parallel make, you need to put more sources in one
+ directory - which is counterproductive to the "give structure" need.
+ - the earned structure can be used very effectively if you only want to have
+ a part built or installed. Then you can change into these subdirs and call
+ make there - you basically start with a subtree. But even there the other
+ subdirectories can harm you. For this, automake added some workarounds, e.g.
+ make all-am doesn't start the recursion.
+
+
+some theory:
+ - make creates a directed acyclic graph (DAG) of the target it reads in. The
+ vertices of this graph are the files in the file system, the edges of this
+ graph are the inter-file dependencies. Each vertex has a set of commands to
+ update it when one dependency's time stamp is newer.
+ - some targets aren't files but phony/pseudo targets. E.g. 'all', 'install',
+ these are usually labeled in the Makefile
+ - a call to 'make' will read in Makefile (actually it has a list of file names
+ it goes through; Makefile is just the one that about everyone on UNIX uses) and
+ update the first target it finds. This is per tradition 'all' or 'default'.
+ ... big fat example ...
+
+recursive make considered harmful:
+ - recursive make has quite some disadvantages, but these are the ones harming a
+ (more or less) average KDE developer most:
+ - make needs to recurse through every directory, starting the process
+ again and again, even though nothing needs to be done - taking tons
+ of time
+ - all of kdecore needs to be built before kdeui can start to build,
+ even those parts not depending on stuff in kdecore that needs to be
+ compiled (e.g. only relying on headers already there)
+ - one can't build parts of a module without having quite some knowledge of
+ the whole thing (or do trial & error).
+ Example: what needs to be built in koffice to get a working kpresenter?
+
+so is automake considered harmful?:
+ - using automake is still better than about every other solution I've seen
+ so far. There might be ways to build huge projects, that solve the one or
+ the other aspect better as automake does, but automake marks the best
+ compromise to what we need.
+ - the biggest problem of a build system within a project is: does any
+ affected(!) developer understand it? KDE is developing a desktop environment,
+ not a build system, still we've seen enough KDE specific problems that
+ made KDE specific features in the build system necessary.
+ - it always was the aim to make it easy to write Makefiles that build normal
+ KDE applications. Beginners should be able to finish standard tasks copying
+ and pasting with some little thinking (still one would wish they would pick
+ better origins :)
+ - syntactic sugar within our Makefiles has been part of our history for quite some
+ time and that can't be taken back. Writing Makefile syntax is hard enough
+ without repeating tons of MOC rules.
+
+automake -> automoc -> am_edit:
+ - a central piece of our build system nowadays is a perl script named am_edit,
+ that meanwhile has most likely more lines than automake itself.
+ - it "patches" the automake generated files and expands things automake didn't
+ understand to the things we want it to do. E.g. METASOURCES = AUTO is just
+ a variable assignment to automake, but am_edit will start grepping through
+ all header files for Q_OBJECT lines and add MOC rules.
+ - technically we no longer use automake, but something like automake++
+ - which doesn't make it easier to maintain and understand. As with graphical
+ user interfaces, developers make their own image of how things work when they
+ put something into their Makefile.
+
+am2:
+ - As the perl code of am_edit is quite a mess and perl isn't really the language
+ of choice for me, I was always toying around with the idea of replacing it with
+ something people understand.
+ - In Spring 2000 I tried to replace the current code base with a python replacement,
+ but it became clear quite quickly that "fixing" the language problem wasn't the
+ solution to the mess - with patching automake output you always have to understand
+ what automake saw when it output things and create your own output. It's basically
+ impossible to code that without looking hacky.
+ - but with teambuilder (and later distcc) becoming popular, the problems of KDE's
+ recursive build system became obvious - you could compile Qt in about no time with
+ 10 clients while KDE seemed to take forever and the teambuilder monitor showed that
+ building KDE didn't come near filling the compile farm
+ - the rest of the story: I took a week off and it rained
+
+unsermake:
+ - am2 wasn't really a good name to start with, so we decided that the new automake
+ replacement should do what we want: so it's short for unser (replacement for auto)make
+ - it reads Makefile.am syntax just as automake does, but creates very different output
+ taking advantage of the rather general way of specifying targets.
+ - it creates all rules and dependencies relative to the toplevel so that make doesn't
+ need to recurse. The advantage of this is that all dependencies are known to a single
+ instance, so that parallel make can be taken advantage of - but also all dependencies
+ have to be known for all files
+ - the key problem after starting wasn't getting compiling and linking done, but not
+ losing the advantage of being able to compile from a subdir and only build the stuff
+ inside that subdir. For bigger non-recursive projects you can _only_ build from the
+ toplevel - which would have been too high a price.
+ - for this unsermake creates three files per subdirectory that are included by each other,
+ so that variables in there can be overwritten to specify where e.g. the top_builddir
+ is (e.g. top_builddir = . for kdelibs/, top_builddir = .. for kdelibs/kdecore)
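The timestamp-driven DAG walk the notes describe can be sketched in a few lines of Python, using the lib/libfoo.a example from the slides (a toy model with invented integer stamps, not unsermake code):

```python
# Each target is rebuilt when any (recursively updated) dependency carries a
# newer timestamp; integer stamps stand in for file mtimes.
deps = {
    "foo.cpp": ["foo.ui"], "foo.h": ["foo.ui"],
    "foo.o": ["foo.cpp", "foo.h"], "libfoo.a": ["foo.o"],
}
stamp = {"foo.ui": 10, "foo.cpp": 2, "foo.h": 2, "foo.o": 3, "libfoo.a": 4}
clock = [10]  # monotonically increasing fake wall clock

def update(target):
    # depth-first: bring all dependencies up to date first
    for dep in deps.get(target, []):
        update(dep)
    if any(stamp[d] > stamp[target] for d in deps.get(target, [])):
        clock[0] += 1
        stamp[target] = clock[0]
        print("rebuild", target)

update("libfoo.a")  # touching foo.ui cascades all the way to libfoo.a
```

A non-recursive build hands one such graph for the whole tree to a single make instance, which is exactly what lets parallel make fill a compile farm.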
diff --git a/unsermake/docs.um b/unsermake/docs.um
new file mode 100644
index 0000000..dbf87d2
--- /dev/null
+++ b/unsermake/docs.um
@@ -0,0 +1,157 @@
+# -*-python-*-
+# vim: ts=4
+
+import handlerbase
+
+class DocsHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "docs"
+ # sync with manpages.um
+ self.manDbFileRx = re.compile( "^man-([^\.]+)\.([0-9nl])\.docbook$" )
+
+ def create(self, amfile):
+ if not amfile.is_defined("KDE_DOCS"):
+ return
+
+ docs = amfile.definition_rec("KDE_DOCS")
+
+ if amfile.is_defined("KDE_LANG"):
+ kdelang = amfile.definition_rec('KDE_LANG')
+ if not len(kdelang) == 1:
+ print_error("%s: KDE_LANG is not a value but a(n empty) list\n" % amfile.filename)
+ kdelang = kdelang[0]
+ else:
+ kdelang = ''
+
+ # Either find the files in the directory (AUTO) or use
+ # only the specified po files.
+ if docs[0] == 'AUTO':
+ appname = os.path.basename(amfile.subdir)
+ if appname == 'en':
+ print_error("%s: KDE_DOCS = AUTO relies on the directory name. Yours is 'en' - you most likely want something else, e.g. KDE_DOCS = myapp\n" % amfile.filename)
+ else:
+ appname = docs[0]
+
+ if len(docs) > 2 and docs[1] == '-':
+ files = docs[2:]
+ else:
+ files = []
+ for file in os.listdir(amfile.subdir):
+ if (file in ['CVS', 'core', 'index.cache.bz2']) or file.startswith('Makefile') or file[0] == '.' or file[-1] == '~' or (file[0] == '#' and file[-1] == '#'):
+ continue
+ if self.manDbFileRx.match(file):
+ continue
+ if os.path.isfile(amfile.subdir + "/" + file):
+ files.append(file)
+
+ if not len(files):
+ return
+
+ dep_files = []
+ for file in files:
+ dep_files.append(amfile.source + file)
+
+ if 'index.docbook' in files:
+
+ deps = amfile.value_list("KDE_DOCS_DEPS")
+ # amfile.add_define('MEINPROC', '$(kde_bindir)/meinproc')
+ # amfile.add_define("KDE_XSL_STYLESHEET", "$(kde_datadir)/ksgmltools2/customization/kde-chunk.xsl")
+
+ amfile.insertTarget(amfile.build + "index.cache.bz2",
+ ["$(KDE_XSL_STYLESHEET)"] + dep_files + deps,
+ '@if test -n "$(MEINPROC)"; then echo $(MEINPROC) --check --cache %sindex.cache.bz2 '
+ '%sindex.docbook; $(MEINPROC) --check --cache %sindex.cache.bz2 %sindex.docbook; fi'
+ % (amfile.build, amfile.source, amfile.build, amfile.source))
+
+ amfile.insertTarget("docs-%s" % amfile.canon_subdir, amfile.build + "index.cache.bz2", phony=1)
+
+ install_docs = "install-docs-%s" % amfile.canon_subdir
+ targetdir = "$(DESTDIR)$(kde_htmldir)/%s/%s" % (kdelang, appname)
+
+ amfile.insertTarget(install_docs, targetdir + "/common", phony=1)
+ amfile.insertTarget(targetdir + "/common", ['$(UNSERMAKE_FORCE_INSTALL)'],
+ ["-rm -f %s/common" % targetdir,
+ "$(LN_S) $(kde_libs_htmldir)/%s/common %s/common" % (kdelang, targetdir)])
+
+ amfile.insertTarget(install_docs, targetdir + "/index.cache.bz2", phony=1)
+ amfile.insertTarget(targetdir + "/index.cache.bz2",
+ [amfile.build + "index.cache.bz2", '$(UNSERMAKE_FORCE_INSTALL)'],
+ ["$(mkinstalldirs) " + targetdir,
+ "@if test -f %sindex.cache.bz2; then \\" % amfile.build,
+ " echo $(INSTALL_DATA) %sindex.cache.bz2 %s/ ;\\" %
+ (amfile.build, targetdir),
+ " $(INSTALL_DATA) %sindex.cache.bz2 %s/; \\" %
+ (amfile.build, targetdir),
+ "fi"])
+
+ amfile.insertTarget("uninstall-docs-%s" % amfile.canon_subdir, "",
+ "-rm -rf $(kde_htmldir)/%s/%s" % (kdelang, appname), phony=1)
+ amfile.insertTarget("clean-docs-%s" % amfile.canon_subdir, "",
+ "-rm -f " + amfile.build + "index.cache.bz2", phony=1)
+
+ else:
+ amfile.insertTarget("docs-%s" % amfile.canon_subdir, dep_files, phony=1)
+
+ install_nls = "install-nls-%s" % amfile.canon_subdir
+ for file in files:
+ targetdir = "$(DESTDIR)$(kde_htmldir)/%s/%s" % (kdelang, appname)
+ lines = ["$(mkinstalldirs) " + targetdir,
+ "@echo $(INSTALL_DATA) %s %s/%s ;\\" % (amfile.source + file,
+ targetdir, file),
+ "$(INSTALL_DATA) %s %s/%s" % (amfile.source + file,
+ targetdir, file)]
+
+ amfile.insertTarget(install_nls,
+ "%s/%s" % (targetdir, file))
+ amfile.insertTarget("%s/%s" % (targetdir, file),
+ [amfile.source + file, '$(UNSERMAKE_FORCE_INSTALL)'], lines)
+
+ if appname == 'common':
+ lines = ["@echo \"merging common and language specific dir\" ;\\",
+ "if test ! -f $(kde_htmldir)/en/common/kde-common.css; then echo 'no english docs found in $(kde_htmldir)/en/common/'; exit 1; fi",
+ "@com_files=`cd $(kde_htmldir)/en/common && echo *` ;\\",
+ "cd $(DESTDIR)$(kde_htmldir)/%s/common ;\\" % kdelang,
+ "if test -n \"$$com_files\"; then for p in $$com_files ; do \\",
+ " case \" " + string.join(files) + " \" in \\",
+ " *\" $$p \"*) ;; \\",
+ " *) test ! -f $$p && echo $(LN_S) ../../en/common/$$p $(DESTDIR)$(kde_htmldir)/%s/common/$$p && $(LN_S) ../../en/common/$$p $$p ;; \\" % kdelang,
+ " esac ; \\",
+ "done ; fi ; true"]
+
+ amfile.insertTarget(install_nls, install_nls + "-common", phony=1)
+ amfile.insertTarget(install_nls + "-common", [], lines, phony=1)
+
+ amfile.insertTarget("uninstall-nls-%s" % amfile.canon_subdir, "",
+ ["for base in %s; do \\" % string.join(files),
+ " rm -f $(DESTDIR)$(kde_htmldir)/%s/%s/$$base ;\\" % (kdelang, appname),
+ "done"], phony=1)
+
+ #amfile.insertTarget("distdir", "distdir-nls-%s" % amfile.canon_subdir)
+ #amfile.insertTarget("distdir-nls-%s" % amfile.canon_subdir, "",
+ # ["for file in $files; do \\",
+ # " cp %s$$file $(distdir); \\" % amfile.source,
+ # "done"])
+
+ def collect(self, amfile):
+ # adding default targets
+ amfile.insertTarget("all", "docs", phony=1)
+
+ install = "install-nls-%s" % amfile.canon_subdir
+ if amfile.is_target(install):
+ amfile.insertTarget("install-data", install, phony=1)
+ amfile.insertTarget("uninstall", "un" + install, phony=1)
+ amfile.insertTarget("install", "install-data", phony=1)
+
+ install = "install-docs-%s" % amfile.canon_subdir
+ if amfile.is_target(install):
+ amfile.insertTarget("install-data", install, phony=1)
+ amfile.insertTarget("uninstall", "un" + install, phony=1)
+ amfile.insertTarget("clean", "clean-docs-%s" % amfile.canon_subdir, phony=1)
+
+ install = "docs-%s" % amfile.canon_subdir
+ if amfile.is_target(install):
+ amfile.insertTarget("docs", install, phony=1)
+ else:
+ amfile.insertTarget("docs", "", phony=1)
+
+handlerbase.register_handler(DocsHandler())
diff --git a/unsermake/flex.um b/unsermake/flex.um
new file mode 100644
index 0000000..ffe97c1
--- /dev/null
+++ b/unsermake/flex.um
@@ -0,0 +1,44 @@
+# -*-python-*-
+# vim: bs=4
+
+import handlerbase
+
+class FlexHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "flex"
+
+ def extensions(self):
+ return ['.ll', '.l']
+
+ def source(self, program, base, ext):
+
+ rulef = program.mfile
+
+ sourcedir = ''
+ if program.mfile.in_srcdir(base + ext):
+ sourcedir = rulef.source
+ else:
+ sourcedir = rulef.build
+
+ rext = '.cc'
+ if ext == '.l':
+ rext = '.c'
+
+ variable = 'LEX_OUTPUT_ROOT'
+ if program.mfile.is_defined('LEX_OUTPUT_ROOT'):
+ variable = rulef.canon_subdir + '_LEX_OUTPUT_ROOT'
+ rulef.add_define(variable, program.mfile.definition_rec('LEX_OUTPUT_ROOT'))
+
+ compile = '$(LEX) ' + program.handle_variable('LFLAGS')
+ rules = ["absdir=`cd %s && pwd` ;\\" % sourcedir,
+ "cd %s && %s $$absdir/%s" % (rulef.build, compile, base + ext)]
+ rules.append("sed '/^#/ s|$(%s)\.c|%s|' %s$(%s).c > %s" % (variable, base + rext,
+ rulef.build, variable,
+ rulef.build + base + rext))
+ rules.append("rm -f %s$(%s).c" % (rulef.build, variable))
+ rulef.insertTarget(rulef.build + base + rext, [sourcedir + base + ext], rules)
+
+ program.handle_source(base, rext)
+ program.cleanfiles.extend([base + rext])
+
+handlerbase.register_handler(FlexHandler())
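The sed command the flex handler emits keeps #line references pointing at the file that is actually kept. A rough Python equivalent of that fixup (function and file names like scanner.cc are ours, for illustration):

```python
def fix_line_refs(text, output_root="lex.yy", final_name="scanner.cc"):
    """On '#'-prefixed lines only (like sed's /^#/ address), rename the lex
    output root to the file we actually keep, so #line directives stay valid."""
    fixed = []
    for line in text.splitlines(True):
        if line.startswith("#"):
            line = line.replace(output_root + ".c", final_name)
        fixed.append(line)
    return "".join(fixed)

print(fix_line_refs('#line 3 "lex.yy.c"\nint yylex(void);\n'))
```

Without this, debuggers and compiler diagnostics would point at the temporary lex output that the rule deletes afterwards.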
diff --git a/unsermake/handlerbase.py b/unsermake/handlerbase.py
new file mode 100644
index 0000000..e126b52
--- /dev/null
+++ b/unsermake/handlerbase.py
@@ -0,0 +1,115 @@
+class HandlerBase:
+
+ def __init__(self):
+ self.name = "Handler"
+
+ def __repr__(self):
+ return self.name
+
+ # returns a couple of lists.
+ # the first list is the list of handlers that have to run before this one
+ # and the second list is the list of handlers that have to run after this one
+ def requirements(self):
+ return ([], [])
+
+ # this is called to find out the file types the Handler works
+ # for. So return here an array of extensions in the form
+ # ['.s', '.S'] (for assembler)
+ def extensions(self):
+ return []
+
+ # this is called for every source file in _SOURCES that fits
+ # your given extensions.
+ # The parameters are:
+ # * program: an instance of a Program target (can be a library
+ # too; everything that can have a _SOURCES)
+ # * base: the basename of the source file (e.g. "hello")
+ # * ext: the extension of the source file (e.g. ".s")
+ def source(self, program, base, ext):
+ pass
+
+ # this is called for every source file in _SOURCES that fits
+ # your given extensions in create mode
+ # The parameters are:
+ # * program: an instance of a Program target (can be a library
+ # too; everything that can have a _SOURCES)
+ # * base: the basename of the source file (e.g. "hello")
+ # * ext: the extension of the source file (e.g. ".s")
+ def create_source(self, program, base, ext):
+ pass
+
+ # this is called after the initial parsing is done to give the Handler
+ # a chance to either check something in the parsed Instance or to change
+ # something. Handle with care
+ # The parameter is an instance of an AmFile
+ # The function is called in both create and run mode
+ def parse(self, amfile):
+ pass
+
+ # this is called for every parsed makefile to do long term actions that
+ # should only run in create mode
+ # The parameter is an instance of an AmFile
+ # The function is called only in create mode
+ def create(self, amfile):
+ pass
+
+ # this is called for every parsed makefile to collect whatever the handler
+ # has to collect. It's not really possible to create new source files in here
+ # so do this in create (see above)
+ def collect(self, amfile):
+ pass
+
+ # this is called for every parsed makefile _after_ parsing is done but _before_
+ # the compile lines for the programs are created. This allows to write handlers
+ # that move sources around or create additional binaries, etc.
+ def shuffle_binaries(self, amfile):
+ pass
+
+ext_dict = {}
+handlers = []
+
+def add_handler(handler):
+ self_before, self_after = handler.requirements()
+
+ last_index = -1
+
+ for index in range(0, len(handlers)):
+ before, after = handlers[index].requirements()
+ if last_index < 0 and handler.name in before:
+ last_index = index
+ if last_index < 0 and handlers[index].name in self_after:
+ last_index = index
+
+ if last_index >= 0:
+ handlers[last_index:last_index + 1] = [handler, handlers[last_index]]
+ else:
+ handlers.append(handler)
+
+ handler_index = {}
+
+ # topological sort
+ for index in range(0, len(handlers)):
+ hand = handlers[index]
+ handler_index[hand.name] = index
+ before, after = hand.requirements()
+ for handler in after:
+ if handler_index.has_key(handler):
+ if handler_index[handler] < index:
+ handler = handlers[index]
+ del handlers[index]
+ add_handler(handler)
+ break
+ for handler in before:
+ if handler_index.has_key(handler):
+ if handler_index[handler] > index:
+ handler = handlers[index]
+ del handlers[index]
+ add_handler(handler)
+ break
+
+def register_handler(handler):
+ for ext in handler.extensions():
+ ext_dict[ext] = handler
+
+ add_handler(handler)
+
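The registration and ordering machinery in handlerbase.py can be condensed into a runnable sketch (simplified names and logic are assumptions; the real `add_handler` also re-sorts already-registered handlers): each handler declares extensions it owns and ordering requirements, and registration inserts it before any handler that must run after it.

```python
# Standalone sketch of unsermake's handler registry: extension dispatch
# plus requirement-based ordering, mirroring handlerbase.py above.
class Handler:
    def __init__(self, name, exts=(), before=(), after=()):
        self.name = name
        self._exts = list(exts)
        self._reqs = (list(before), list(after))

    def extensions(self):
        return self._exts

    def requirements(self):
        # (handlers that run before me, handlers that run after me),
        # matching the HandlerBase.requirements() contract
        return self._reqs

ext_dict = {}
handlers = []

def register_handler(handler):
    for ext in handler.extensions():
        ext_dict[ext] = handler
    # insert before the first handler that must run after this one
    _, self_after = handler.requirements()
    for i, h in enumerate(handlers):
        before, _ = h.requirements()
        if handler.name in before or h.name in self_after:
            handlers.insert(i, handler)
            break
    else:
        handlers.append(handler)

# "mocs" declares it must run before "autodeps" (see mocs.um below):
register_handler(Handler("autodeps"))
register_handler(Handler("mocs", exts=[".h"], after=["autodeps"]))
print([h.name for h in handlers])  # → ['mocs', 'autodeps']
```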
diff --git a/unsermake/icons.um b/unsermake/icons.um
new file mode 100644
index 0000000..f4bba4a
--- /dev/null
+++ b/unsermake/icons.um
@@ -0,0 +1,163 @@
+# -*-python-*-
+# vim: ts=4
+
+import handlerbase
+
+class KDE_ICON_Handler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "icons"
+
+ def create(self, amfile):
+
+ uninstall = []
+ icons = []
+
+ for define in amfile.defines():
+ if define.endswith('_ICON'):
+ if define[:-5] == 'KDE':
+ destdir = '$(kde_icondir)'
+ else:
+ dirkey = '%sdir' % define[:-5]
+ ndirkey = '%s_%sdir' % (amfile.canon_subdir, define[:-5])
+ amfile.add_define(ndirkey, amfile.definition_rec(dirkey))
+ destdir = '$(%s)' % ndirkey
+
+ iconauto = 0
+ appnames = []
+ rhs = amfile.definition_rec(define)
+ if rhs != ['AUTO']:
+ appnames = rhs
+ else:
+ iconauto = 1
+
+ files = os.listdir(amfile.dirname)
+ iconfiles = []
+ for entry in files:
+ if not os.path.exists(amfile.dirname + "/" + entry):
+ continue
+
+ ext = utilities.extre.match(entry)
+ if not ext:
+ continue
+
+ if ext.group(2) in ['.xpm', '.png', '.mng', '.svg', '.svgz']:
+ if iconauto:
+ iconfiles.append((ext.group(1), ext.group(2)))
+ else:
+ for app in appnames:
+ if ext.group(1).endswith('-' + app):
+ iconfiles.append((ext.group(1), ext.group(2)))
+
+ for base, ext in iconfiles:
+ index = string.find(base, '-') + 1
+ if not index:
+ sys.stderr.write('%s: unknown icon format (%s)\n' %
+ (amfile.filename, base + ext))
+ continue
+ if index != 0:
+ nindex = string.find(base, '-', index) + 1
+ if nindex != 0:
+ index = nindex
+ appname = base[index:]
+ prefix = base[:index]
+ if prefix[-1] == '-':
+ prefix = prefix[:-1]
+
+ if prefix == 'mini':
+ prefix = 'lo16-app'
+ elif prefix == 'lo':
+ prefix = 'lo16-app'
+ elif prefix == 'large':
+ prefix = 'hi48-app'
+ elif len(prefix) == 3:
+ prefix = prefix + '-app'
+
+ if not string.count(prefix, '-') == 1:
+ sys.stderr.write('%s: unknown icon format (%s)\n' %
+ (amfile.filename, base + ext))
+ continue
+
+ type = prefix[string.find(prefix, '-')+1:]
+ prefix = prefix[:string.find(prefix, '-')]
+
+ type_hash = {
+ 'action': 'actions',
+ 'app': 'apps',
+ 'device': 'devices',
+ 'filesys': 'filesystems',
+ 'mime': 'mimetypes'}
+
+ if not type_hash.has_key(type):
+ sys.stderr.write('%s: unknown icon type %s (%s)\n' %
+ (amfile.filename, type, base + ext))
+ continue
+
+ dir_hash = {
+ 'los' : 'locolor/16x16',
+ 'lom' : 'locolor/32x32',
+ 'him' : 'hicolor/32x32',
+ 'hil' : 'hicolor/48x48',
+ 'lo16' : 'locolor/16x16',
+ 'lo22' : 'locolor/22x22',
+ 'lo32' : 'locolor/32x32',
+ 'hi16' : 'hicolor/16x16',
+ 'hi22' : 'hicolor/22x22',
+ 'hi32' : 'hicolor/32x32',
+ 'hi48' : 'hicolor/48x48',
+ 'hi64' : 'hicolor/64x64',
+ 'hi128' : 'hicolor/128x128',
+ 'hisc' : 'hicolor/scalable',
+ 'cr16' : 'crystalsvg/16x16',
+ 'cr22' : 'crystalsvg/22x22',
+ 'cr32' : 'crystalsvg/32x32',
+ 'cr48' : 'crystalsvg/48x48',
+ 'cr64' : 'crystalsvg/64x64',
+ 'cr128' : 'crystalsvg/128x128',
+ 'crsc' : 'crystalsvg/scalable',
+ }
+
+
+ if not dir_hash.has_key(prefix):
+ sys.stderr.write('%s: unknown icon prefix %s\n' %
+ (amfile.filename, prefix))
+ continue
+
+ dir = dir_hash[prefix] + "/" + type_hash[type]
+ newfile = appname + ext
+
+ install = ['@test -d $(DESTDIR)%s/%s || $(mkinstalldirs) $(DESTDIR)%s/%s' %
+ (destdir, dir, destdir, dir)]
+ destfile = '$(DESTDIR)%s/%s/%s' % (destdir, dir, newfile)
+ install.append('$(INSTALL_DATA) %s %s' %
+ ("$(srcdir)/" + base + ext, destfile))
+ icons.append("$(srcdir)/" + base + ext)
+ amfile.insertTarget('install-kde-icons-%s' % amfile.canon_subdir,
+ destfile, phony=1)
+ amfile.insertTarget(destfile,
+ amfile.source + base + ext,
+ install)
+ uninstall.append('-rm -f %s' % destfile)
+
+ if len(uninstall):
+ amfile.insertTarget('uninstall-kde-icons-%s' % amfile.canon_subdir, [],
+ uninstall, phony=1)
+
+ defined = amfile.is_defined("KDE_ICON")
+ amfile.del_define("KDE_ICON")
+ if len(icons):
+ amfile.add_define("KDE_ICON", icons)
+ elif defined:
+ print "%s: KDE_ICON used but no icons present" % amfile.filename
+ amfile.add_define("KDE_ICON", "")
+
+ def collect(self, amfile):
+ install_key = 'install-kde-icons-%s' % amfile.canon_subdir
+ if amfile.targets.has_key(install_key):
+ amfile.insertTarget('install-data', install_key, phony=1)
+ amfile.insertTarget('uninstall', 'un' + install_key, phony=1)
+ if amfile.is_defined("KDE_ICON"):
+ amfile.insertTarget('all-%s' % amfile.canon_subdir,
+ "$(KDE_ICON)")
+
+
+handlerbase.register_handler(KDE_ICON_Handler())
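The filename convention the icon handler decodes above can be shown with a minimal sketch (hypothetical helper; it covers only the plain `<prefix>-<type>-<name>.<ext>` case, not the `mini`/`lo`/`large` aliases the real handler also accepts): a name like `hi22-app-konqueror.png` maps to an install subdirectory plus the installed icon name.

```python
# Sketch of KDE_ICON_Handler's name parsing; the tables are trimmed
# copies of dir_hash/type_hash from icons.um above.
DIR_MAP = {'hi16': 'hicolor/16x16', 'hi22': 'hicolor/22x22',
           'hi48': 'hicolor/48x48', 'lo16': 'locolor/16x16',
           'cr22': 'crystalsvg/22x22', 'crsc': 'crystalsvg/scalable'}
TYPE_MAP = {'action': 'actions', 'app': 'apps', 'device': 'devices',
            'filesys': 'filesystems', 'mime': 'mimetypes'}

def icon_dest(filename):
    """Return (install_subdir, installed_name) for an icon source file."""
    base, ext = filename.rsplit('.', 1)
    prefix, type_, name = base.split('-', 2)
    return ("%s/%s" % (DIR_MAP[prefix], TYPE_MAP[type_]),
            "%s.%s" % (name, ext))

print(icon_dest("hi22-app-konqueror.png"))
# → ('hicolor/22x22/apps', 'konqueror.png')
```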
diff --git a/unsermake/kcfg.um b/unsermake/kcfg.um
new file mode 100644
index 0000000..6890f96
--- /dev/null
+++ b/unsermake/kcfg.um
@@ -0,0 +1,57 @@
+# -*-python-*-
+# vim: ts=4
+
+import handlerbase, utilities
+
+class KCFGHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "kcfg"
+
+ def requirements(self):
+ return ([], ["autodeps"])
+
+ def extensions(self):
+ return ['.kcfgc']
+
+ def source(self, program, base, ext):
+ sourcename = base
+
+ rulef = program.mfile
+
+ sourcedir = ''
+ if program.mfile.in_srcdir( base + ext ):
+ sourcedir = rulef.source
+ else:
+ sourcedir = rulef.build
+
+ kcfg = ""
+ kcfgre = re.compile("File=(.*\.kcfg)")
+ lines = open(program.mfile.dirname + "/" + base + ext).readlines()
+ for line in lines:
+ match = kcfgre.match(line)
+ if match:
+ kcfg = match.group(1)
+
+ if not len(kcfg):
+ utilities.print_error("%s: Could not find 'File=.*kcfg' line\n" % program.mfile.dirname + "/" + base + ext)
+ kcfg = "unknown.kcfg"
+
+ rulef.insertTarget(rulef.build + sourcename + ".cpp",
+ rulef.build + sourcename + ".h")
+ rulef.insertTarget(rulef.build + sourcename + ".h",
+ [sourcedir + base + ".kcfgc",
+ sourcedir + kcfg,
+ "$(KCFG_DEPENDENCIES)"],
+ ["@echo '%s %s.h%s'" % (utilities.creating_text, rulef.build + sourcename, utilities.normal),
+ "@echo '%s %s.cpp%s'" % (utilities.creating_text, rulef.build + sourcename, utilities.normal),
+ "@$(KCONFIG_COMPILER) -d " + rulef.build + " " + sourcedir + kcfg + " " + sourcedir + base + ".kcfgc ||" + "( rm -f " + rulef.build + sourcename + ".h; false )"])
+ rulef.insertTarget(rulef.build + sourcename + program.objext,
+ rulef.build + sourcename + ".h")
+
+ program.handle_source(sourcename, ".cpp")
+
+ program.cleanfiles.extend([sourcename + ".cpp",
+ sourcename + ".h"])
+
+handlerbase.register_handler(KCFGHandler())
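The `File=` lookup the kcfg handler performs can be demonstrated in isolation (sample lines are hypothetical; the real handler reads them from the `.kcfgc` file on disk):

```python
import re

# Sketch of KCFGHandler's scan for the File= line naming the .kcfg
# schema that $(KCONFIG_COMPILER) needs alongside the .kcfgc file.
kcfgre = re.compile(r"File=(.*\.kcfg)")

sample = ["ClassName=Settings\n", "File=myapp.kcfg\n", "Mutators=true\n"]
kcfg = ""
for line in sample:
    match = kcfgre.match(line)
    if match:
        kcfg = match.group(1)
print(kcfg)  # → myapp.kcfg
```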
diff --git a/unsermake/kdeinit.um b/unsermake/kdeinit.um
new file mode 100644
index 0000000..325b022
--- /dev/null
+++ b/unsermake/kdeinit.um
@@ -0,0 +1,97 @@
+# -*-python-*-
+# vim: ts=4
+
+import handlerbase
+import program
+
+class KDEINIT_Handler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "kdeinit"
+
+ def shuffle_binaries(self, amfile):
+ newbins = []
+
+ for bin in amfile.binaries.values():
+ if bin.prefix == 'kdeinit':
+ if not bin.type == program.PTYPE_LTLIBRARY:
+ print '%s: warning, %s is in kdeinit, but is not a LTLIBRARY!' % (amfile.filename,
+ bin.name)
+ return
+
+ assert(bin.canon_name.endswith('_la') and bin.name.endswith('.la'))
+ canon_name = bin.canon_name[:-3]
+ name = bin.name[:-3]
+
+ ldflags = amfile.value_of(bin.canon_name + '_LDFLAGS')
+ # don't retain -module or $(KDE_PLUGIN) arguments.
+ # It's a lib we build.
+ # care for them also at string end
+ ldflags = string.strip (string.replace (ldflags + ' ', '-module ', ' '))
+ ldflags = string.strip (string.replace (ldflags + ' ', '$(KDE_PLUGIN) ', ' '))
+ # first, we add the definitions for libkdeinit_<bin>.la
+ amfile.add_define('libkdeinit_' + canon_name + '_la_LDFLAGS',
+ ldflags + ' -avoid-version -no-undefined')
+ amfile.add_define('libkdeinit_' + canon_name + '_la_SOURCES',
+ amfile.value_of(bin.canon_name + '_SOURCES') )
+ amfile.add_define('libkdeinit_' + canon_name + '_la_LIBADD',
+ amfile.value_of(bin.canon_name + '_LIBADD') )
+
+ lines=["@echo 'extern \"C\" int kdemain(int, char* []);' > $@",
+ "@echo 'int main( int argc, char* argv[] ) { return kdemain(argc, argv); }' >> $@"]
+ amfile.insertTarget(amfile.build + 'kdeinit_' + canon_name + '.cpp', [], lines)
+ lines=["@echo '#include <kdemacros.h>' > $@",
+ "@echo 'extern \"C\" int kdemain(int, char* []);' >> $@",
+ "@echo 'extern \"C\" KDE_EXPORT int kdeinitmain( int argc, char* argv[] ) { return kdemain(argc, argv); }' >> $@"]
+ amfile.insertTarget(amfile.build + 'kdeinit_' + canon_name + '.la.cpp', [], lines)
+
+ amfile.insertTarget("clean-kdeinit_%s" % canon_name, "",
+ "rm -f $(top_builddir)/%s/kdeinit_%s.cpp $(top_builddir)/%s/kdeinit_%s.la.cpp" % (amfile.subdir, canon_name, amfile.subdir, canon_name), phony=1)
+ amfile.insertTarget("clean", "clean-kdeinit_%s" % canon_name )
+
+ import copy
+
+ # then we clone bin, rename the important bits,
+ # and re-parse the sources (easier than trying to
+ # reconstruct the objects and such from scratch)
+
+ kdeinitlib = copy.copy(bin)
+ kdeinitlib.set_type(program.PTYPE_LTLIBRARY)
+ kdeinitlib.name = 'libkdeinit_' + bin.name
+ kdeinitlib.canon_name = 'libkdeinit_' + bin.canon_name
+ kdeinitlib.final_sources = {}
+ kdeinitlib.prefix = 'lib'
+
+ pchvar = bin.canon_name + "_PCH"
+ if amfile.is_defined(pchvar):
+ amfile.add_define(kdeinitlib.canon_name + '_PCH', amfile.value_list(pchvar))
+ amfile.del_define(pchvar)
+
+ # now do the same thing for the <bin> definition
+
+ amfile.add_define(canon_name + '_LDFLAGS', '$(KDE_RPATH) $(all_libraries)')
+ amfile.add_define(canon_name + '_SOURCES', 'kdeinit_' + canon_name + '.cpp' )
+ amfile.add_define(canon_name + '_LDADD', kdeinitlib.name)
+
+ kdeinit = copy.copy(bin)
+ kdeinit.set_type(program.PTYPE_PROGRAM)
+ kdeinit.final_sources = {}
+ kdeinit.prefix = 'bin'
+ kdeinit.name = name
+ kdeinit.canon_name = canon_name
+
+ # now it's time to redefine the stuff for <bin.la>
+ amfile.del_define(bin.canon_name + '_SOURCES')
+ amfile.add_define(bin.canon_name + '_SOURCES', 'kdeinit_' + canon_name + '.la.cpp')
+ amfile.del_define(bin.canon_name + '_LIBADD')
+ amfile.add_define(bin.canon_name + '_LIBADD', kdeinitlib.name)
+
+ amfile.add_define('lib_LTLIBRARIES', kdeinitlib.name)
+ amfile.add_define('bin_PROGRAMS', kdeinit.name)
+
+ newbins.append(kdeinit)
+ newbins.append(kdeinitlib)
+
+ for prog in newbins:
+ amfile.binaries[prog.name] = prog
+
+handlerbase.register_handler(KDEINIT_Handler())
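The two tiny stub translation units that the echo rules above write out can be reconstructed as strings (a sketch of the generated text only; the function names here are hypothetical): the standalone binary gets a real `main()` forwarding to the library's `kdemain()`, while the kdeinit loadable module exports `kdeinitmain()` doing the same.

```python
# Sketch of the generated C++ stubs from KDEINIT_Handler's echo rules.
def standalone_stub():
    # source for <bin>: a program whose main() forwards to kdemain()
    return ('extern "C" int kdemain(int, char* []);\n'
            'int main( int argc, char* argv[] ) { return kdemain(argc, argv); }\n')

def kdeinit_module_stub():
    # source for <bin>.la: the kdeinit-loadable entry point
    return ('#include <kdemacros.h>\n'
            'extern "C" int kdemain(int, char* []);\n'
            'extern "C" KDE_EXPORT int kdeinitmain( int argc, char* argv[] ) '
            '{ return kdemain(argc, argv); }\n')

print(standalone_stub())
```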
diff --git a/unsermake/manpages.um b/unsermake/manpages.um
new file mode 100644
index 0000000..5b79a7a
--- /dev/null
+++ b/unsermake/manpages.um
@@ -0,0 +1,110 @@
+# vim:syntax=python:ts=4:noet
+#
+# manpages.um - Copyright (C) Frerich Raabe <raabe@kde.org>
+# Licensed under whatever license unsermake is available.
+
+import handlerbase
+
+class ManpageHandler( handlerbase.HandlerBase ):
+ def __init__( self ):
+ self.name = "manpages"
+ self.manDbFileRx = re.compile( "^man-([^\.]+)\.([0-9nl])\.docbook$" )
+
+ def create( self, amfile ):
+ if not amfile.is_defined( "KDE_MANS" ):
+ return
+
+ if amfile.is_defined( "KDE_LANG" ):
+ lang = amfile.definition_rec( "KDE_LANG" )[ 0 ]
+ else:
+ lang = "en"
+
+ manDbFiles = amfile.definition_rec( "KDE_MANS" )
+ if manDbFiles[ 0 ] == 'AUTO':
+ manDbFiles = []
+ for file in os.listdir( amfile.subdir ):
+ if self.manDbFileRx.match( file ):
+ manDbFiles.append( file )
+
+ generatedManfiles = []
+ installedManfiles = []
+
+ manTarget = "mans-%s" % amfile.canon_subdir
+
+ createdManpageTarget = 0
+
+ for file in manDbFiles:
+ if not os.path.exists( amfile.subdir + "/" + file ):
+ utilities.print_error( "%s: Manpage source file '%s' does not exist!\n" % ( amfile.filename, file ) )
+ continue
+
+ match = self.manDbFileRx.match( file )
+ if not match:
+ utilities.print_error( "%s: File '%s' does not comply to expected format ('man-<appname>.<sect>.docbook') for manpage sources!\n" % ( amfile.filename, file ) )
+ continue
+
+ appname = match.group( 1 )
+ mansect = match.group( 2 )
+
+ manfile = appname + "." + mansect
+
+ generatedManfiles.append( amfile.build + manfile )
+
+ deps = amfile.value_list("KDE_DOCS_DEPS")
+ amfile.insertTarget( amfile.build + manfile,
+ [ amfile.source + file, "$(KDE_XSL_MAN_STYLESHEET)" ] + deps,
+ [ '@if test -n "$(MEINPROC)" && test -n "$(KDE_XSL_MAN_STYLESHEET)"; then \\',
+ ' echo generating %s; \\' % ( amfile.build + manfile ),
+ ' $(MEINPROC) --check --stylesheet $(KDE_XSL_MAN_STYLESHEET) %s; \\' % ( amfile.source + file ),
+ ' mv manpage.troff %s; \\' % ( amfile.build + manfile ),
+ 'fi' ] )
+ amfile.insertTarget( manTarget, amfile.build + manfile, phony = 1 )
+
+ if lang == "en":
+ targetdir = "$(DESTDIR)$(mandir)/man%s/" % mansect
+ else:
+ targetdir = "$(DESTDIR)$(mandir)/%s/man%s/" % ( lang, mansect )
+
+ installedManfiles.append( targetdir + manfile )
+
+ amfile.insertTarget( targetdir + manfile,
+ [ amfile.build + manfile, "$(UNSERMAKE_FORCE_INSTALL)" ],
+ [ "@test -d %s || $(mkinstalldirs) %s" % ( targetdir, targetdir ),
+ "@if test -f %s; then \\" % ( amfile.build + manfile ),
+ " echo '%s %s%s'; \\" % ( utilities.installing_text, manfile, utilities.normal ),
+ " $(INSTALL_DATA) %s %s; \\" % ( amfile.build + manfile, targetdir),
+ "fi" ] )
+ amfile.insertTarget( "install-%s" % manTarget, targetdir + manfile, phony = 1 )
+
+ createdManpageTarget = 1
+
+ if createdManpageTarget:
+ if not amfile.is_defined("KDE_XSL_MAN_STYLESHEET"):
+ amfile.add_define( "KDE_XSL_MAN_STYLESHEET", "$(kde_datadir)/ksgmltools2/customization/kde-man.xsl")
+
+ cmds = []
+ for f in generatedManfiles:
+ cmds.append( "-rm -f %s" % f )
+ amfile.insertTarget( "clean-%s" % manTarget, "", cmds, phony = 1 )
+
+ cmds = []
+ for f in installedManfiles:
+ cmds.append( "-rm -f %s" % f )
+ amfile.insertTarget( "uninstall-%s" % manTarget, "", cmds, phony = 1 )
+
+ def collect( self, amfile ):
+ manTarget = "mans-%s" % amfile.canon_subdir
+ if amfile.is_target( manTarget ):
+ amfile.insertTarget( "clean", "clean-%s" % manTarget, phony = 1 )
+ amfile.insertTarget( "install", "install-%s" % manTarget, phony = 1 )
+ amfile.insertTarget( "uninstall", "uninstall-%s" % manTarget, phony = 1 )
+ amfile.insertTarget( "mans", manTarget, phony = 1 )
+ amfile.insertTarget( "all", "mans", phony = 1 )
+
+ # hook to check if compiling against the correct version
+ man = amfile.expand("$(KDE_XSL_MAN_STYLESHEET)")
+ if not os.path.exists(man):
+ amfile.del_define("KDE_XSL_MAN_STYLESHEET")
+ amfile.add_define("KDE_XSL_MAN_STYLESHEET", "")
+
+handlerbase.register_handler( ManpageHandler() )
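The source-name convention the manpage handler enforces is easy to exercise on its own (hypothetical helper name; the regex is copied from `manDbFileRx` above): `man-<appname>.<sect>.docbook` yields the installed manpage name `<appname>.<sect>`.

```python
import re

# Sketch of ManpageHandler's filename check and manfile derivation.
manDbFileRx = re.compile(r"^man-([^.]+)\.([0-9nl])\.docbook$")

def manfile_for(source):
    """Return '<appname>.<sect>' for a valid source name, else None."""
    m = manDbFileRx.match(source)
    return m and (m.group(1) + "." + m.group(2))

print(manfile_for("man-kioslave.8.docbook"))  # → kioslave.8
print(manfile_for("README.docbook"))          # → None
```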
diff --git a/unsermake/mocs.um b/unsermake/mocs.um
new file mode 100644
index 0000000..07950a5
--- /dev/null
+++ b/unsermake/mocs.um
@@ -0,0 +1,302 @@
+# -*-python-*-
+# vim: ts=4
+
+import handlerbase, utilities
+
+class MOCHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "mocs"
+
+ def requirements(self):
+ return (["designer"], ["autodeps"])
+
+ def create(self, amfile):
+
+ for program in amfile.binaries.values():
+ program.has_auto_moc = 0
+
+ amfile.has_auto_moc = 0
+ got_metasources = 0
+
+ value = amfile.value_list('METASOURCES')
+ if value:
+ got_metasources = 1
+
+ moc_list = value
+ if 'USE_AUTOMOC' in value:
+ utilities.print_error("%s: USE_AUTOMOC is unsupported\n" % amfile.filename)
+ if 'AUTO' in value:
+ amfile.has_auto_moc = 1
+ amfile.del_define("METASOURCES")
+
+ for program in amfile.binaries.values():
+ value = amfile.value_list(program.canon_name + '_METASOURCES')
+ if value:
+ got_metasources = 1
+ moc_list.extend(value)
+ if 'USE_AUTOMOC' in value:
+ utilities.print_error("%s: USE_AUTOMOC is unsupported\n" % amfile.filename)
+ if 'AUTO' in value:
+ if amfile.has_auto_moc:
+ sys.stderr.write("%s: defines METASOURCES = AUTO twice\n" %
+ amfile.filename)
+ amfile.binaries[program.name].has_auto_moc = 1
+ if amfile.has_auto_moc:
+ utilities.print_error("%s: METASOURCES = AUTO used twice\n" % amfile.filename)
+ amfile.has_auto_moc = 1
+ amfile.del_define(program.canon_name + '_METASOURCES')
+
+ if not got_metasources:
+ return
+
+ moc_files = self.find_moc_candidates(amfile, moc_list)
+ moc_files = self.prune_moc_candidates(amfile, moc_files)
+ self.check_moc_candidates(amfile, moc_files)
+ for mocfile, list in moc_files.items():
+ rhs = list[1]
+ if len(list) == 3:
+ rhs = "%s=%s" % (list[1], list[2])
+ amfile.add_define("METASOURCES", "%s=%s/%s" % (mocfile, list[0], rhs))
+
+ amfile.add_define("METASOURCES", [])
+
+ def tag_META_INCLUDES(self, amfile):
+ # Gets the list of user defined directories -
+ # relative to $srcdir - where header files could be located.
+
+ headerdirs = ['.']
+ if not amfile.is_defined("META_INCLUDES"):
+ return headerdirs
+
+ for dir in amfile.definition("META_INCLUDES"):
+ dir = string.replace(dir, "$(srcdir)", ".")
+
+ if not amfile.in_srcdir(dir):
+ sys.stderr.write("%s: Warning: %s [META_INCLUDES] can't "
+ "be found. Must be a relative path to "
+ "$(srcdir)!\n" % (amfile.filename, dir))
+ else:
+ headerdirs.append(dir)
+
+ return headerdirs
+
+ def find_moc_candidates(self, amfile, moc_list):
+
+ headerdirs = self.tag_META_INCLUDES(amfile)
+ moc_files = {}
+ commentre = re.compile('^\s*//\s*Q_OBJECT')
+ hfilere = re.compile("([^.].+)\.(h|H|hh|hxx|hpp|h\+\+)$")
+
+ for dir in headerdirs:
+ hfiles = []
+ for file in os.listdir(amfile.dirname + "/" + dir):
+ match = hfilere.match(file)
+ if match:
+ hfiles.append((match.group(1), match.group(2)))
+
+ moc_headers = []
+ for hf, ext in hfiles:
+ # ignore generated headers
+ if os.path.exists(amfile.dirname + "/" + hf + ".ui"):
+ continue
+ if os.path.exists(amfile.dirname + "/" + hf + ".ui3"):
+ continue
+ if os.path.exists(amfile.dirname + "/" + hf + ".ui4"):
+ continue
+
+ hfData = open( os.path.join(amfile.dirname,
+ dir, hf + "." + ext)).readlines()
+ for line in hfData:
+ if not string.find(line, "Q_OBJECT") == -1:
+ if not commentre.match(line):
+ moc_headers.append((hf, ext))
+ break
+
+ for hf, ext in moc_headers:
+ if not amfile.has_auto_moc and not (hf + ".moc") in moc_list:
+ continue
+ if moc_files.has_key(hf):
+ sys.stderr.write("%s: Multiple header files found for %s\n"
+ % (amfile.filename, hf) )
+ continue
+ moc_files[hf] = [dir, "%s.%s" % ( hf, ext) ]
+
+ return moc_files
+
+ def prune_moc_candidates(self, amfile, moc_files):
+
+ if amfile.has_auto_moc or not len(moc_files):
+ return moc_files
+
+ # The programmer has specified a moc list. Prune out the moc candidates
+ # list that we found based on looking at the header files. This generates
+ # a warning if the programmer gets the list wrong, but this doesn't have
+ # to be fatal here.
+
+ pruned_mocs = {}
+
+ moc_list = amfile.value_list("METASOURCES")
+
+ for prog in amfile.binaries.values():
+ if prog.has_auto_moc:
+ continue
+
+ moc_list.extend(amfile.value_list(prog.canon_name + "_METASOURCES"))
+
+ if len(moc_list):
+ for mocname in moc_list:
+ match = utilities.extre.match(mocname)
+ if not match:
+ utilities.print_error('%s: no moc file (nor AUTO): %s\n'
+ % (amfile.filename, mocname))
+ continue
+ mocname = match.group(1) # without extension
+ assert(match.group(2) == '.moc')
+
+ if moc_files.has_key(mocname):
+ pruned_mocs[mocname] = moc_files[mocname]
+ else:
+ sys.stderr.write("%s: Warning: no moc-able header file for %s\n"
+ % (amfile.filename, mocname))
+
+ return pruned_mocs
+
+ return moc_files
+
+ def check_moc_candidates(self, amfile, moc_files):
+ # Finds the cpp files (If they exist).
+ # The cpp files get appended to the header file
+
+ if not len(moc_files):
+ return
+
+ import utilities
+ cppfiles, hfiles = utilities.list_source_files(amfile.dirname)
+
+ if not len(cppfiles):
+ return
+
+ moc2cpp = {}
+ mocre = re.compile("[ \t]*#include\s*[<\"](.*\.moc)[>\"]")
+ for (base, ext) in cppfiles:
+ file = base + ext
+ lines = open(amfile.dirname + "/" + file).readlines()
+ for line in lines:
+ match = mocre.match(line)
+ if match:
+ try:
+ moc2cpp[match.group(1)].append(file)
+ except:
+ moc2cpp[match.group(1)] = [file]
+
+ deped = []
+ new_objs = []
+ for mocfile in moc_files.keys():
+ try:
+ cppfiles = moc2cpp[mocfile + ".moc"]
+ except:
+ cppfiles = []
+ if len(cppfiles) == 1:
+ moc_files[mocfile].append(cppfiles[0])
+ deped.append(mocfile)
+ elif len(cppfiles) == 0:
+ new_objs.append(mocfile)
+ if amfile.has_auto_moc:
+ continue
+ sys.stderr.write("%s: Warning: No c++ file includes %s.moc\n" % (amfile.filename, mocfile))
+ else:
+ # We can't decide which file to use, so it's fatal. Although as a
+ # guess we could use the mocFile.cpp file if it's in the list???
+ sys.stderr.write("%s: Error: Multiple c++ files include %s.moc\n" % (amfile.filename, mocfile))
+ utilities.print_error("%s: \t%s\n" % (amfile.filename, string.join(cppfiles)))
+ return
+
+ if amfile.has_auto_moc and len(new_objs):
+ hits = []
+ for prog in amfile.binaries.values():
+ if prog.has_auto_moc:
+ hits.append(prog)
+ if not len(hits) and len(amfile.binaries) == 1:
+ hits.extend(amfile.binaries.values())
+ if not len(hits) == 1:
+ utilities.print_error("%s: not all moc files are included in cpp files\n"
+ "\tand there are more than one targets that could link\n"
+ "\tto them: %s\n" % (amfile.filename, string.join(new_objs)))
+
+ else:
+ for obj in new_objs:
+ amfile.add_define(hits[0].canon_name + "_SOURCES", obj + ".moc.cpp")
+ list = moc_files[obj]
+ assert(len(list) == 2)
+ dir = list[0]
+ if len(dir) and dir[0] == '.':
+ subdir_dir = amfile.source + dir[1:]
+ if not subdir_dir[-1] == '/':
+ subdir_dir = subdir_dir + '/'
+ else:
+ subdir_dir = amfile.build
+ amfile.insertTarget(amfile.build + obj + ".moc.cpp", subdir_dir + list[1],
+ ["@echo '%s %s.moc.cpp%s'" % (utilities.creating_text, amfile.build + obj, utilities.normal),
+ "@$(MOC) $? -o $@"])
+
+ def collect(self, amfile):
+
+ # adding default targets
+ amfile.insertTarget("mocs", "", phony=1)
+
+ # Add the rules for generating moc source from header files
+ # For Automoc output *.moc.cpp but normally we'll output *.moc
+ # (We must compile *.moc.cpp separately. *.moc files are included
+ # in the appropriate *.cpp file by the programmer)
+
+ if not amfile.is_defined("METASOURCES"):
+ return
+
+ metasources = amfile.definition("METASOURCES")
+
+ mocs = []
+ mocre = re.compile('^([^=]*)=(.*)$')
+
+ for mocfile in metasources:
+
+ match = mocre.match(mocfile)
+ if not match: # possibly an out-of-date Makefile
+ continue
+ mocfile = match.group(1)
+ hfile = match.group(2)
+ dir = ''
+ cppfile = ''
+
+ index = string.rfind(hfile, '/')
+ if index > 0:
+ dir = hfile[:index]
+ hfile = hfile[index+1:]
+
+ index = string.find(hfile, '=')
+ if index > 0:
+ cppfile = hfile[index+1:]
+ hfile = hfile[:index]
+
+ if len(dir) and dir[0] == '.':
+ subdir_dir = amfile.source + dir[1:]
+ if not subdir_dir[-1] == '/':
+ subdir_dir = subdir_dir + '/'
+ else:
+ subdir_dir = amfile.build
+
+ if cppfile:
+ amfile.insertTarget(amfile.build + mocfile + ".moc",
+ subdir_dir + hfile,
+ ["@echo '%s %s.moc%s'" % (utilities.creating_text, amfile.build + mocfile, utilities.normal),
+ "@$(MOC) $? -o $@"])
+ mocs.append(amfile.build + mocfile + ".moc")
+ else:
+ mocs.append(amfile.build + mocfile + ".moc.cpp")
+
+ if len(mocs):
+ amfile.add_define(amfile.canon_subdir + "_MOCS", mocs)
+ amfile.insertTarget("mocs", "$(%s_MOCS)" % amfile.canon_subdir, phony=1)
+ amfile.insertTarget("clean-metasources-%s" % amfile.canon_subdir, "", "-rm -f $(%s_MOCS)" % amfile.canon_subdir, phony=1)
+ amfile.insertTarget("clean", "clean-metasources-%s" % amfile.canon_subdir, phony=1)
+
+handlerbase.register_handler(MOCHandler())
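The Q_OBJECT detection inside `find_moc_candidates` can be isolated into a runnable sketch (hypothetical function name; the comment regex is copied from the handler above): a header is a moc candidate only when Q_OBJECT appears on a line that is not a `//` comment.

```python
import re

# Sketch of MOCHandler's header scan for uncommented Q_OBJECT macros.
commentre = re.compile(r'^\s*//\s*Q_OBJECT')

def needs_moc(lines):
    for line in lines:
        if line.find("Q_OBJECT") != -1 and not commentre.match(line):
            return True
    return False

print(needs_moc(["class A : public QObject {", "    Q_OBJECT", "};"]))  # → True
print(needs_moc(["    // Q_OBJECT only mentioned in a comment"]))       # → False
```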
diff --git a/unsermake/pch.um b/unsermake/pch.um
new file mode 100644
index 0000000..3eadf73
--- /dev/null
+++ b/unsermake/pch.um
@@ -0,0 +1,116 @@
+# -*-python-*-
+# vim: ts=4 noet
+# Copyright (C) 2004 Simon Hausmann <hausmann@kde.org>
+# Licensed under whatever license unsermake is available
+
+import handlerbase
+import program
+import utilities
+
+class PCHHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "pch"
+
+ def requirements(self):
+ return (["kdeinit"], [])
+
+ def inclusion_candidates(self, amfile, bin):
+ candidates = []
+ for src in amfile.definition_rec(bin.canon_name + '_SOURCES'):
+ match = utilities.extre.match(src)
+ base = match.group(1)
+ ext = match.group(2)
+ if ext not in utilities.cppext:
+ continue
+ for h in utilities.hext:
+ candidate = amfile.dirname + "/" + base + h
+ if os.path.exists(candidate):
+ candidates.extend([base + h])
+ return candidates
+
+ def make_auto_pch_rule(self, amfile, bin):
+ header = amfile.build + bin.canon_name + "_pch.h"
+
+ candidates = self.inclusion_candidates(amfile, bin)
+ if not len(candidates):
+ return []
+
+ lines = [ '@echo ''%s %s%s''' % (utilities.creating_text, header, utilities.normal),
+ '@echo "#if defined(__cplusplus)"> %s' % header,
+ '@for header in %s; do \\' % string.join(candidates),
+ ' echo "#include \\"$$header\\"" >> %s; \\' % header,
+ 'done',
+ '@echo "#endif" >> %s' % header ]
+
+ amfile.insertTarget(header, '', lines)
+ bin.cleanfiles.extend([header])
+
+ return [amfile.build, bin.canon_name + "_pch.h"]
+
+ def shuffle_binaries(self, amfile):
+
+ for bin in amfile.binaries.values():
+
+ pchvarname = bin.canon_name + "_PCH"
+ header = amfile.value_list(pchvarname)
+
+ if not header:
+ continue
+
+ if len(header) > 1:
+ print "%s: there can be only one header file to precompile, skipping" % pchvarname
+ continue
+
+ header = header[0]
+
+ # TODO: move to a create function
+ if header == "AUTO":
+ header = self.make_auto_pch_rule(amfile, bin)
+ if not len(header):
+ print "%s: AUTO specified but can't find candidates for inclusion" % pchvarname
+ continue
+ dir = header[0]
+ header = header[1]
+ else:
+ dir = amfile.source
+
+ pchdir = amfile.build + bin.canon_name + ".gch"
+
+# print '%s: PCH = %s / pchdir %s' % (amfile.filename, header, pchdir)
+
+ # first create rules for precompiling the header
+ cxxlines = ["@test -d %s || mkdir -p %s" % (pchdir, pchdir)]
+ clines = ["@test -d %s || mkdir -p %s" % (pchdir, pchdir)]
+
+ splittedheader = string.split(header, ".")
+ base = splittedheader[0]
+ ext = "." + splittedheader[1]
+
+ pic = '';
+ if bin.type == program.PTYPE_LTLIBRARY:
+ pic = '-fPIC -DPIC'
+
+ cxxtarget = pchdir + '/c++'
+ ctarget = pchdir + '/c'
+
+ cxxlines.extend(bin.compile_lines(dir, base, ext, "-x c++-header " + pic, 1, 1, "_cxx", cxxtarget))
+ clines.extend(bin.compile_lines(dir, base, ext, "-x c-header " + pic, 0, 1, "_c", ctarget))
+
+ amfile.insertTarget(cxxtarget, dir + header, cxxlines)
+ amfile.insertTarget(ctarget, dir + header, clines)
+
+ # now create target rules for forcing the inclusion of the pch with -include
+ amfile.add_define(amfile.canon_subdir + '_' + bin.canon_name + "_CPPFLAGS", "-include " + bin.canon_name, 'unsermake_enable_pch')
+
+ # the *_target_PCHDEP variable is what the actual objects depend on, to make sure the
+ # precompiled header gets created before the first inclusion
+ amfile.add_define(amfile.canon_subdir + "_" + bin.canon_name + "_PCHDEP", pchdir + "/c++" + ' ' + pchdir + '/c', 'unsermake_enable_pch')
+
+ amfile.dep_files.append('%s.U%s' % (base + "_cxx", bin.objext[1:]))
+ amfile.dep_files.append('%s.U%s' % (base + "_c", bin.objext[1:]))
+
+ bin.cleanfiles.extend([bin.canon_name + ".gch/c++", bin.canon_name + ".gch/c"])
+
+
+# register_handler(PCHHandler())
+
diff --git a/unsermake/pofiles.um b/unsermake/pofiles.um
new file mode 100644
index 0000000..9e6033e
--- /dev/null
+++ b/unsermake/pofiles.um
@@ -0,0 +1,125 @@
+# -*-python-*-
+# vim: set ts=4
+
+import handlerbase
+
+class POFILES_Handler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "pofiles"
+
+ def __toplevel(self, amfile, kdelang):
+
+ files = os.listdir(amfile.dirname)
+ restfiles = []
+ for entry in files:
+ if not os.path.exists(amfile.dirname + "/" + entry):
+ continue
+
+ if os.path.isdir(amfile.dirname + "/" + entry):
+ continue
+
+ if entry == "CVS" or entry[0] == '.' or entry.startswith( "Makefile") or \
+ entry.endswith( "~") or (entry.startswith("#") and entry.endswith("#")) or \
+ entry.endswith( ".gmo") or entry.endswith( ".po"):
+ continue
+
+ restfiles.append(entry)
+
+ if len(restfiles):
+ amfile.insertTarget("install-data", "install-nls-files", phony=1)
+ ideps = []
+ urules = []
+ mkdir = ['$(mkinstalldirs) $(DESTDIR)$(kde_locale)/' + kdelang]
+ for file in restfiles:
+ amfile.insertTarget('$(DESTDIR)$(kde_locale)/%s/%s' % (kdelang, file),
+ '$(srcdir)/%s' % file,
+ mkdir + ['$(INSTALL_DATA) $(srcdir)/%s $(DESTDIR)$(kde_locale)/%s/%s' % (file, kdelang, file)])
+ amfile.insertTarget('install-nls-files', '$(DESTDIR)$(kde_locale)/%s/%s' % (kdelang, file), phony=1)
+ urules.append('-rm -f $(DESTDIR)$(kde_locale)/%s/%s' % (kdelang, file));
+ amfile.insertTarget("uninstall", "uninstall-nls-files", [], phony=1)
+ amfile.insertTarget("uninstall-nls-files", [], urules, phony=1)
+
+ def collect(self, amfile):
+
+ if not amfile.is_defined('POFILES'):
+ return
+
+ if amfile.is_defined("KDE_LANG"):
+ kdelang = amfile.definition_rec('KDE_LANG')
+ if not len(kdelang) == 1:
+ print_error("%s: KDE_LANG is not a value but a(n empty) list\n" % amfile.filename)
+ kdelang = kdelang[0]
+ else:
+ kdelang = ''
+
+ pofiles = amfile.definition_rec('POFILES')
+
+ # Either find the pofiles in the directory (AUTO) or use
+ # only the specified po files.
+ if pofiles == ['AUTO']:
+ pofiles = []
+ for file in os.listdir(amfile.dirname):
+ if file.endswith(".po"):
+ pofiles.append(file)
+ if os.path.exists(amfile.dirname + "/charset" ) and os.path.exists(amfile.dirname + "/kdelibs/kdelibs.po" ):
+ self.__toplevel(amfile, kdelang)
+
+ if not len(pofiles):
+ return
+
+ self.__POFILES(amfile, pofiles, kdelang)
+
+ def __POFILES(self, amfile, pofiles, kdelang):
+ gmofiles = []
+
+ allrule = "all-nls-" + amfile.canon_subdir
+
+ for pofile in pofiles:
+ assert(pofile.endswith(".po"))
+ pofile = pofile[:-3]
+ amfile.insertTarget(amfile.build + pofile + ".gmo",
+ amfile.source + pofile + ".po",
+ ["rm -f " + amfile.build + pofile + ".gmo",
+ "$(GMSGFMT) -o %s.gmo %s.po" % (amfile.build + pofile, amfile.source + pofile)])
+ gmofiles.append(pofile + ".gmo")
+ amfile.insertTarget(allrule, amfile.build + pofile + ".gmo", phony=1)
+
+ if len(gmofiles):
+ amfile.insertTarget("clean-nls", [],
+ ["-rm -f " + string.join(gmofiles)], phony=1)
+ amfile.insertTarget("clean", "clean-nls", phony=1)
+ amfile.insertTarget("all-" + amfile.canon_subdir, allrule, phony=1)
+
+ urules = []
+
+ for pofile in pofiles:
+ pofile = pofile[:-3]
+ if kdelang:
+ installdir = "$(DESTDIR)$(kde_locale)/%s/LC_MESSAGES/" % kdelang
+ amfile.insertTarget(installdir + pofile + ".mo",
+ amfile.build + pofile + ".gmo",
+ ["@$(mkinstalldirs) " + installdir,
+ "@echo $(INSTALL_DATA) %s.gmo %s.mo" % (amfile.build + pofile, installdir + pofile),
+ "@$(INSTALL_DATA) %s.gmo %s.mo" % (amfile.build + pofile, installdir + pofile)])
+ urules.append("-rm -f " + installdir + pofile + ".mo")
+ amfile.insertTarget("install-data", installdir + pofile + ".mo", phony=1)
+ else:
+ # FILENAME == LANG
+ installdir = "$(DESTDIR)$(kde_locale)/%s/LC_MESSAGES" % pofile
+ if amfile.is_defined("PACKAGE"):
+ installfile = installdir + "/" + amfile.definition_rec("PACKAGE")[0] + ".mo"
+ else:
+ installfile = installdir + "/" + "$(PACKAGE).mo"
+
+ amfile.insertTarget(installfile,
+ amfile.build + pofile + ".gmo",
+ ["@$(mkinstalldirs) " + installdir,
+ "@echo $(INSTALL_DATA) %s.gmo %s" % (amfile.build + pofile, installfile),
+ "@$(INSTALL_DATA) %s.gmo %s" % (amfile.build + pofile, installfile)])
+ urules.append("-rm -f " + installfile)
+ amfile.insertTarget("install-data", installfile, phony=1)
+
+ amfile.insertTarget("uninstall-nls-" + amfile.canon_subdir, [], urules, phony=1)
+ amfile.insertTarget("uninstall", "uninstall-nls-" + amfile.canon_subdir, phony=1)
+
+handlerbase.register_handler(POFILES_Handler())
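For context outside the patch: the `POFILES = AUTO` branch in `collect()` above simply globs the source directory for `*.po` files. A minimal sketch of that discovery step in current Python (the name `find_pofiles` is mine, not unsermake's; the real code iterates `os.listdir` unsorted):

```python
import os

def find_pofiles(dirname):
    # Mirror of the POFILES = AUTO branch above: pick up every *.po
    # file in the source directory (sorted here for deterministic output).
    return sorted(f for f in os.listdir(dirname) if f.endswith('.po'))
```

Each discovered `de.po` then gets a `de.gmo` target built via `$(GMSGFMT)`, as `__POFILES()` shows.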
diff --git a/unsermake/program.py b/unsermake/program.py
new file mode 100644
index 0000000..e00a877
--- /dev/null
+++ b/unsermake/program.py
@@ -0,0 +1,559 @@
+import string, os
+import utilities, handlerbase
+from target import DefValue
+
+PTYPE_PROGRAM = 1
+PTYPE_LIBRARY = 2
+PTYPE_LTLIBRARY = 3
+
+__pychecker__ = 'no-shadowbuiltin'
+
+def program_type(suffix):
+ type = 0
+ if suffix == 'PROGRAMS':
+ type = PTYPE_PROGRAM
+ elif suffix == 'LIBRARIES':
+ type = PTYPE_LIBRARY
+ else:
+ type = PTYPE_LTLIBRARY
+ return type
+
+class Program:
+ def __init__(self, mfile, name, prefix, type):
+ self.name = name
+ self.prefix = prefix
+ self.canon_name = utilities.canon_name(name)
+ self.mfile = mfile
+ self.objs = []
+ self.cleanfiles = []
+ self.final_sources = {}
+ self.set_type(type)
+
+ def __repr__(self):
+ return self.name
+
+ def set_type(self, type):
+ self.type = type
+ if self.type == PTYPE_LTLIBRARY:
+ self.objext = '.lo'
+ else:
+ self.objext = '.o'
+
+ def is_cpp(self, ext):
+ return ext in utilities.cppext
+
+ def check_target(self, targ):
+ if not targ:
+ return 0
+ if targ.has_rules() or not targ.user_specified:
+ return 1
+ else:
+ print '%s: there are dependencies for the source file %s without' % (self.mfile.filename, targ.target)
+ print '\trules to generate it. This should most probably read as'
+ print '\tdependencies for the object file.'
+ return 0
+
+ def compile_lines(self, dir, base, ext, extraflags = "", forcecxx = 0, forcelibtoolOff = 0, depfilesuffix = '', deptargetoverride = ''):
+
+ libtool=0
+ compile = ""
+
+ if self.type == PTYPE_LTLIBRARY and forcelibtoolOff == 0:
+ libtool=1
+ #compile = '$(LIBTOOL) --mode=compile '
+ #if self.is_cpp(ext):
+ # compile = compile + '--tag=CXX '
+
+ if self.is_cpp(ext) or forcecxx != 0:
+ compile += '$(CXX) '
+ else:
+ compile += '$(CC) '
+
+ compile += self.mfile.default_includes + ' $(INCLUDES) '
+ compile += self.handle_variable("CPPFLAGS")
+
+ if len(extraflags):
+ compile += extraflags + ' '
+
+ if self.is_cpp(ext) or forcecxx:
+ compile += self.handle_variable("CXXFLAGS")
+ else:
+ compile += self.handle_variable("CFLAGS")
+
+ if libtool:
+ compile = compile + "-fPIC -DPIC "
+
+ rulef = self.mfile
+ file = dir + base + ext
+ lines = ['@$(V_ECHO) "%s %s%s"' % (utilities.compile_text, file, utilities.normal)]
+
+ # Output in local .libs if a subdir source file is passed on SOURCES line
+ base = os.path.basename(base)
+ output='$@'
+ if libtool:
+ output=rulef.build + '.libs/' + base + '.o'
+ lines.extend(['@if test ! -d "' + rulef.build + '.libs"; then mkdir "' + rulef.build + '.libs"; \\',
+ 'status=$$?; if test "$$status" -ne 0 && test ! -d "' + rulef.build + '.libs"; then \\',
+ 'exit $$status; fi; fi'])
+
+ target = rulef.build + base + self.objext
+ if deptargetoverride:
+ target = deptargetoverride
+
+ lines.extend(["@depfile='%s$(DEPDIR)/%s.U%s' tmpdepfile='%s$(DEPDIR)/%s.TU%s' targetfile='$%s';\\"
+ % (rulef.build, base + depfilesuffix, self.objext[1:],
+ rulef.build, base + depfilesuffix, self.objext[1:],
+ target),
+ "set %s-c %s -o %s -Wp,-MD,$$tmpdepfile; \\" % (compile, file, output),
+#gcc3 "set %s-c %s -o %s -MT %s -MD -MP -MF $$tmpdepfile; \\" % (compile, file, output, output),
+ "$(V_COMPILE)"])
+
+ if libtool:
+ outname = rulef.build + base + '.lo'
+ lines.extend(["@echo '# Generated by libtool replacement' > " + outname,
+ "@echo \"pic_object='.libs/" + base + ".o'\" >> " + outname,
+ "@echo \"non_pic_object='.libs/" + base + ".o'\" >> " + outname])
+ return lines
+
+ def handle_source(self, base, ext, forcegenerated=0, final=0):
+
+ rulef = self.mfile
+
+ dir = rulef.source
+ insource = 1
+ if self.check_target(rulef.target(rulef.build + base + ext)) \
+ or self.check_target(self.mfile.target(base + ext)) \
+ or forcegenerated:
+ dir = rulef.build
+ insource = 0
+
+ lines = self.compile_lines(dir, base, ext)
+
+ dep = dir + base + ext
+ pchdep = self.canon_name + '_PCHDEP'
+ if self.mfile.is_defined(pchdep):
+ dep += ' $(' + pchdep + ')'
+
+ # Output in local .libs if a subdir source file is passed on SOURCES line
+ base = os.path.basename(base)
+ rulef.insertTarget(rulef.build + base + self.objext,
+ dep,
+ lines, compile_target=1)
+
+ rulef.dep_files.append('%s.U%s' % (base, self.objext[1:]))
+ self.mfile.translate_target(base + ext)
+ self.mfile.translate_target(base + self.objext)
+
+ if self.is_cpp(ext):
+ if final:
+ objpref = '_final'
+ else:
+ objpref = '_nofinal'
+ else:
+ objpref = ""
+
+ rulef.add_define(self.canon_name + "%s_OBJECTS" % objpref,
+ rulef.build + base + self.objext)
+
+
+ self.objs.append(base + self.objext)
+ # no final for no cpp
+ if not self.is_cpp(ext):
+ return
+
+ if not self.final_sources.has_key(ext):
+ # first
+ rulef.insertTarget(rulef.build + self.canon_name + '_all_' + ext[1:] + ext,
+ rulef.source + "Makefile.in")
+ if insource:
+ self.final_sources[ext] = ([ base + ext ], [] )
+ else:
+ self.final_sources[ext] = ([], [ base + ext ] )
+ else:
+ if insource:
+ self.final_sources[ext][0].append( base + ext )
+ else:
+ self.final_sources[ext][1].append( base + ext )
+
+ if not final:
+ final_target = rulef.build + self.canon_name + '_all_' + ext[1:]
+ rulef.insertTarget(final_target + ext, dir + base + ext)
+ for oext in ['.o', '.lo']:
+ targ = self.mfile.target(base + oext)
+ if targ:
+ ndeps = self.mfile.replace_builddir(targ.deps, 0)
+ deps = []
+ for dep in ndeps:
+ deps.append(DefValue(dep))
+ rulef.insertTarget(final_target + oext, deps)
+
+ def handle_variable(self, var): # Now handles the Paths
+ svar = "AM_%s" % var
+ if self.mfile.is_defined(svar):
+ added = string.join(self.mfile.definition(svar))
+ else:
+ added = ""
+ if self.mfile.is_defined(var) or utilities.subst_vars.has_key(var):
+ added += " $(%s) " % var
+ else:
+ added += " "
+ svar = "KDE_%s" % var
+ if self.mfile.is_defined(svar):
+ added += string.join(self.mfile.definition(svar))
+ svar = "%s_%s" % (self.canon_name, var)
+ if self.mfile.is_defined(svar) or utilities.subst_vars.has_key(svar):
+ added += " $(%s) " % svar
+ else:
+ added += " "
+
+ return added
+
+ def handle_add_variable(self, var):
+ cvar = self.canon_name + "_" + var
+
+ if not self.mfile.is_defined(cvar) and self.mfile.is_defined(var):
+ adds = self.mfile.definition_rec(var)
+ elif self.mfile.is_defined(cvar):
+ adds = self.mfile.definition_rec(cvar)
+ else:
+ adds = []
+
+ if len(adds):
+ for add in adds:
+ self.mfile.translate_target(add)
+
+ self.mfile.del_define(cvar)
+ self.mfile.add_define(cvar, self.mfile.replace_builddir(adds, 0))
+ return "$(%s) " % cvar
+ else:
+ return ""
+
+ def create_handle_variable(self, var, replace_srcdir=0):
+ self.mfile.add_prefixed_variable("AM_%s" % var, replace_srcdir)
+ self.mfile.add_prefixed_variable(var, replace_srcdir)
+ self.mfile.add_prefixed_variable("KDE_%s" % var, replace_srcdir)
+
+ targetvar = self.canon_name + "_" + var
+ self.mfile.add_prefixed_variable(targetvar, replace_srcdir)
+
+ def create_variables(self):
+ self.create_handle_variable("LDFLAGS", 0)
+
+ self.mfile.add_prefixed_variable("INCLUDES", replace_srcdir=1)
+ self.create_handle_variable("CPPFLAGS", replace_srcdir=1)
+ self.create_handle_variable("CXXFLAGS", replace_srcdir=1)
+
+ def add_targets(self):
+ prefix = self.canon_name
+
+ if self.type == PTYPE_LIBRARY:
+ link = '@set $(AR) rcu $@ $(%s_OBJECTS) ' % prefix
+ link += self.handle_add_variable("LIBADD")
+ link = [ link + ' ;\\', '$(V_EXEC)',
+ '@set $(RANLIB) $@ ;\\', '$(V_EXEC)' ]
+ else:
+ if utilities.subst_vars.has_key("LIBTOOL"):
+ link = '@set $(LIBTOOL) --mode=link '
+ if self.use_c_linker:
+ link += '--tag=CC '
+ else:
+ link += '--tag=CXX '
+ else:
+ link = '@set '
+
+ if self.use_c_linker:
+ link += '$(CLD) '
+ link += self.handle_variable("CFLAGS")
+ link += self.handle_variable("CPPFLAGS")
+ else:
+ link += '$(CXXLD) '
+ link += self.handle_variable("CXXFLAGS")
+ link += self.handle_variable("CPPFLAGS")
+
+ link += self.handle_variable("LDFLAGS")
+
+ link += "-o $@ "
+
+ is_installed = 1
+ if self.prefix in ['noinst', 'check', 'EXTRA']:
+ is_installed = 0
+
+ if self.type == PTYPE_LTLIBRARY and is_installed:
+ link = link + "-rpath $(%sdir) " % self.prefix
+
+ if self.mfile.is_defined(self.canon_name + "_LDFLAGS"):
+ ldflags = self.mfile.definition_rec(self.canon_name + "_LDFLAGS")
+ if '-no-undefined' in ldflags and not '$(KDE_PLUGIN)' in ldflags:
+ if utilities.subst_vars.has_key("KDE_NO_UNDEFINED"):
+ link += '$(KDE_NO_UNDEFINED) '
+
+ if self.type == PTYPE_LTLIBRARY:
+ var = self.handle_add_variable("LIBADD")
+ else:
+ var = self.handle_add_variable("LDADD")
+
+ create_deps = []
+ if len(var) and is_installed:
+ match = utilities.variablere.match(string.strip(var))
+ if match:
+ list = self.mfile.definition(match.group(1))
+ for l in list:
+ if l.endswith('.la') and (l.startswith(self.mfile.build) or
+ l.startswith('$(top_builddir)/')):
+ create_deps.append(l)
+ deps_lines = []
+ if len(create_deps):
+ depfile= self.mfile.build + '$(DEPDIR)/%s.Ula' % self.canon_name
+ deps_lines = [ "@echo \"# DESTDIR deps\" > %s.tmp" % depfile,
+ "@for file in " + string.join(create_deps) + " ; do \\",
+ " ( . $$file ;\\",
+ " if test -n \"$$libdir\"; then \\",
+ " base=`basename $$file` ;\\",
+ " echo \'$(DESTDIR)$(%sdir)/%s: " % (self.prefix, self.name)
+ + "$(DESTDIR)\'\"$$libdir/$$base\" >> %s.tmp; fi ) ;\\" % depfile,
+ "done; mv %s.tmp %s" % (depfile,depfile)]
+ self.mfile.dep_files.append('%s.Ula' % self.canon_name)
+
+ link = [ link + var + "$(%s_OBJECTS) $(LIBS) ;\\" % prefix, '$(V_EXEC)' ] + deps_lines
+
+ if self.type == PTYPE_LTLIBRARY or self.type == PTYPE_LIBRARY:
+ add_prefix = "LIB"
+ else:
+ add_prefix = "LD"
+
+ deps = [DefValue('$(%s_OBJECTS)' % prefix)]
+ for var in ['%s_DEPENDENCIES' % self.canon_name,
+ 'DEPENDENCIES']:
+ if self.mfile.is_defined(var):
+ ndeps = self.mfile.replace_builddir(self.mfile.definition_rec(var), 0)
+ for dep in ndeps:
+ deps.append(DefValue(dep))
+
+ for var in [ '%s_%sADD' % (self.canon_name, add_prefix),
+ '%sADD' % add_prefix]:
+ if self.mfile.is_defined(var):
+ ndeps = self.mfile.definition_rec(var)
+ index = 0
+ while index < len(ndeps):
+ dep = ndeps[index]
+ if len(dep) > 1:
+ if dep[0] == '-':
+ if dep[1] == 'l':
+ index += 1
+ continue
+ elif dep[1] == 'L' or dep[1] == 'R':
+ if len(dep) == 2:
+ index += 2
+ else:
+ index += 1
+ continue
+ elif dep[1:4] == 'Wl,' or dep == '-Xlinker':
+ index += 2
+ continue
+ elif dep == '-framework': # Darwin
+ index += 2
+ continue
+ elif dep == '-pthread':
+ index += 1
+ continue
+ utilities.print_error("%s: variable '%sADD' contains unknown flag %s. These should be in LDFLAGS!\n" % (self.mfile.filename, add_prefix, dep))
+ break
+ index += 1
+ list = self.mfile.replace_builddir([dep], 1)
+ if len(list) == 1:
+ deps.append(DefValue(list[0]))
+ elif len(list) != 0:
+ print list
+ assert(False)
+
+ rulef = self.mfile
+ rulef.insertTarget(rulef.build + self.name,
+ deps,
+ ["@rm -f " + rulef.build + self.name,
+ '@$(V_ECHO) "%s %s%s"' % (utilities.linking_text, rulef.build + self.name, utilities.normal)] + link)
+ if not self.prefix in ['check', 'EXTRA']:
+ rulef.insertTarget('compile', '$(%s_OBJECTS)' % prefix, phony=1)
+ lines = ["rm -f $(%s_OBJECTS)" % prefix,
+ "rm -f " + rulef.build + self.name]
+ if len(self.cleanfiles):
+ line = "rm -f"
+ for file in self.cleanfiles:
+ line = line + " " + rulef.build + file
+ lines.append(line)
+ if self.type == PTYPE_LTLIBRARY:
+ lines.append("rm -rf " + rulef.build + ".libs")
+ rulef.insertTarget("clean-%s" % prefix, "", lines, phony=1)
+ rulef.insertTarget("clean", "clean-%s" % prefix, phony=1)
+ self.add_closure_target(deps)
+
+ def add_closure_target(self, deps):
+
+ if 'KDE_USE_CLOSURE' in utilities.false_conds or not utilities.subst_vars.has_key("LIBTOOL"):
+ return
+
+ # adding closure
+ rulef = self.mfile
+ prefix = self.canon_name
+
+ if self.type == PTYPE_LTLIBRARY and self.mfile.is_defined(self.canon_name + "_LDFLAGS"):
+ flags = self.mfile.definition(self.canon_name + "_LDFLAGS")
+ if not '-no-undefined' in flags and not '$(KDE_PLUGIN)' in flags:
+ if self.name.startswith('lib'):
+ pass
+ # print self.mfile.filename, 'lib', self.name, 'with undefined'
+ return
+
+ closure = rulef.build + self.name + ".closure"
+ linkline = '@$(LIBTOOL) --mode=link '
+ if self.use_c_linker:
+ linkline += '--tag=CC $(CLD) '
+ else:
+ linkline += '--tag=CXX $(CXXLD) '
+ linkline += self.handle_variable("LDFLAGS")
+ if self.mfile.is_defined(self.canon_name + "_LDFLAGS"):
+ linkline = linkline + "$(%s_LDFLAGS) " % prefix
+
+ linkline = linkline + '-o %s ' % closure
+ linkline = linkline + '%s_closure.lo ' % (rulef.build + self.canon_name)
+ linkline = linkline + '$(%s_OBJECTS) ' % prefix
+ linkline = linkline + self.handle_add_variable("LIBADD")
+ linkline = linkline + '$(LIBS) ;\\'
+
+ lines = ['@echo "int main() {return 0;}" > ' + rulef.build + self.canon_name + '_closure.cpp']
+ lines.extend(self.compile_lines(rulef.build, self.canon_name + '_closure', '.cpp'))
+ lines.extend([
+ '@echo "%s %s%s"' % (utilities.creating_text,closure, utilities.normal),
+ linkline,
+ 'stat=$$? ;\\',
+ 'rm -f %s_closure.* %s ;\\' % (rulef.build + self.canon_name, closure),
+ 'if test "$$stat" = 0; then echo "timestamp" > %s; fi' % closure])
+
+ rulef.insertTarget(closure,
+ ['$(%s_OBJECTS)' % prefix] + deps,
+ lines, compile_target=1)
+
+ rulef.add_define('%s_CLOSURE' % prefix, [closure])
+ rulef.insertTarget(rulef.build + self.name, '$(%s_CLOSURE)' % prefix)
+
+ rulef.insertTarget('clean-closure-' + prefix,
+ [],
+ 'rm -f ' + closure, phony=1)
+ rulef.insertTarget('clean', 'clean-closure-' + prefix, phony=1)
+
+
+ def add_final_target(self):
+ count = 0
+ for ext in self.final_sources.keys():
+ count = count + len(self.final_sources[ext][0]) + len(self.final_sources[ext][1])
+
+ rulef = self.mfile
+ prefix = self.canon_name
+
+ if count < 2 or self.mfile.get_opt("nofinal"):
+ defs = rulef.definition(prefix + '_OBJECTS')
+ defs.extend( rulef.definition(prefix + '_nofinal_OBJECTS') )
+
+ if rulef.defs.has_key(prefix + '_OBJECTS'):
+ del rulef.defs[prefix + '_OBJECTS']
+ if rulef.defs.has_key(prefix + '_nofinal_OBJECTS'):
+ del rulef.defs[prefix + '_nofinal_OBJECTS']
+ rulef.add_define(prefix + '_OBJECTS', defs)
+ return
+
+ # adding final target
+ for ext in self.final_sources.keys():
+ finaltarget = rulef.build + self.canon_name + '_all_' + ext[1:] + ext
+ lines = ["@echo '%s %s%s'; \\" % (utilities.creating_text,finaltarget,utilities.normal),
+ "rm -f %s.final; \\" % finaltarget,
+ "echo \"#define KDE_USE_FINAL 1\" >> %s.final ;\\" % finaltarget,
+ "echo \"#include <config.h>\" >> %s.final;\\" % finaltarget]
+ if len(self.final_sources[ext][0]):
+ lines.extend(["for file in " + string.join(self.final_sources[ext][0]) + "; do \\",
+ " echo \"#include \\\"$$file\\\"\" >> %s.files; \\" % finaltarget,
+ " grep '^#pragma +implementation' %s$$file >> %s.final;\\" % (rulef.source, finaltarget),
+ "done; \\"])
+ if len(self.final_sources[ext][1]):
+ lines.extend(["for file in " + string.join(self.final_sources[ext][1]) + "; do \\",
+ " echo \"#include \\\"$$file\\\"\" >> %s.files; \\" % finaltarget,
+ "done; \\"])
+ lines.extend(["cat %s.final %s.files > %s; \\" % (finaltarget, finaltarget, finaltarget),
+ "rm -f %s.final %s.files" % (finaltarget, finaltarget)])
+ for file in self.final_sources[ext][0] + self.final_sources[ext][1]:
+ match = utilities.extre.match(file)
+ assert(match)
+ targ = rulef.target(rulef.build + match.group(1) + self.objext)
+ if targ:
+ rulef.insertTarget(finaltarget, targ.deps)
+ else:
+ print 'no target for', rulef.source + match.group(1) + self.objext
+ rulef.insertTarget(finaltarget,
+ [],
+ lines)
+ self.handle_source( self.canon_name + '_all_' + ext[1:], ext, 1, 1)
+ added_finals = 1
+
+ assert(added_finals) # I kind of doubt it can not be set (TODO)
+
+ if added_finals:
+ extern_objs = rulef.definition(prefix + '_OBJECTS')
+
+ rulef.del_define(prefix + "_OBJECTS")
+ if 'KDE_USE_FINAL' in utilities.true_conds or \
+ utilities.environment_vars.has_key("UNSERMAKE_FORCE_FINAL"):
+ rulef.add_define(prefix + "_OBJECTS", ['$(%s_final_OBJECTS)' % prefix] + extern_objs)
+ else:
+ rulef.add_define(prefix + "_OBJECTS", ['$(%s_nofinal_OBJECTS)' % prefix] + extern_objs)
+
+ def collect_final_dependencies(self):
+ # who cares otherwise?
+ if not 'KDE_USE_FINAL' in utilities.true_conds and \
+ not utilities.environment_vars.has_key("UNSERMAKE_FORCE_FINAL"):
+ return
+
+ objs = self.mfile.definition(self.canon_name + "_nofinal_OBJECTS")
+ fobjs = self.mfile.definition(self.canon_name + "_final_OBJECTS")
+ for obj in objs:
+ objtarg = self.mfile.target(obj)
+ assert(objtarg)
+ for fobj in fobjs:
+ self.mfile.insertTarget(fobj, objtarg.deps)
+
+ def add_random(self):
+ print 'random:', self.mfile.subdir + "/" + self.name
+ self.mfile.insertTarget('random',
+ self.mfile.build + self.name, phony=1)
+
+ # For a binary B (an executable or lib) this takes B->sources
+ # and produces B->objects, B->deps and B->fin_sources.
+ # An example should illustrate the behaviour best: Say, we have
+ # bla_SOURCES = a.c b.cpp c.skel d.ui
+ # then this produces
+ # bla->objects = a.o b.o c.o d.o
+ # bla->deps = c.cpp d.cpp
+ # and bla->fin_sources = a.c b.cpp c.cpp d.cpp
+ # i.e. The ->deps only contain the files, which are not already in ->objects
+ def handle_sources(self, sources):
+ self.objects = []
+ self.use_c_linker = 1
+ for source in sources:
+ match = utilities.extre.match(source)
+ if not match:
+ utilities.print_error('%s: "%s" doesnt match extre\n' % (self.mfile.filename, source))
+ continue
+ base = match.group(1)
+ ext = match.group(2)
+
+ if ext in utilities.hext: # ignore headers for _everything_
+ continue
+ if handlerbase.ext_dict.has_key(ext):
+ handlerbase.ext_dict[ext].source(self, base, ext)
+ elif self.is_cpp(ext) or ext == '.c':
+ self.handle_source(base, ext)
+ else:
+ utilities.print_error('%s: unknown source extension %s for %s\n' % (self.mfile.filename, ext, self.name))
+ continue
+
+ if ext != '.c':
+ self.use_c_linker = 0
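The comment before `handle_sources()` gives the intended mapping for `bla_SOURCES = a.c b.cpp c.skel d.ui`. As a self-contained illustration in current Python: the `GENERATED` table below stands in for unsermake's real per-extension handler registry (`handlerbase.ext_dict`) and is an assumption for this sketch, not the actual dispatch code.

```python
import os

# Hypothetical handler table: which generated-source extensions
# expand to which compiled extension (unsermake consults registered
# handlers for this; .skel/.ui -> .cpp matches the comment's example).
GENERATED = {'.skel': '.cpp', '.ui': '.cpp'}
COMPILED = ('.c', '.cpp')

def split_sources(sources):
    """Return (objects, deps) as described in the handle_sources comment:
    deps holds only the generated sources not already covered by objects."""
    objects, deps = [], []
    for src in sources:
        base, ext = os.path.splitext(src)
        if ext in GENERATED:
            deps.append(base + GENERATED[ext])
            objects.append(base + '.o')
        elif ext in COMPILED:
            objects.append(base + '.o')
    return objects, deps

# split_sources(['a.c', 'b.cpp', 'c.skel', 'd.ui'])
# -> (['a.o', 'b.o', 'c.o', 'd.o'], ['c.cpp', 'd.cpp'])
```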
diff --git a/unsermake/rcc.um b/unsermake/rcc.um
new file mode 100644
index 0000000..051492d
--- /dev/null
+++ b/unsermake/rcc.um
@@ -0,0 +1,45 @@
+# -*-python-*-
+# vim: ts=4 noet
+# Copyright (C) 2005 Simon Hausmann <hausmann@kde.org>
+# Licensed under whatever license unsermake is available
+
+import handlerbase
+import program
+import utilities
+
+class RCCHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "rcc"
+
+ def extensions(self):
+ return ['.qrc']
+
+ def source(self, program, base, ext):
+ outputBase = "qrc_" + base
+ outputFileName = outputBase + ".cpp"
+ outputFile = program.mfile.build + outputFileName;
+
+ depFile = program.mfile.build + '$(DEPDIR)/%s.Ucpp' % outputBase
+ program.mfile.dep_files.append("%s.Ucpp" % outputBase)
+
+ rules = ["@echo '%s %s%s'" % (utilities.creating_text, outputFile, utilities.normal),
+ "$(RCC) -name %s -o %s $(srcdir)/%s.qrc" % (base, outputFile, base),
+ "@echo 'builddir=$$(top_builddir)/$(subdir)' > %s.tmp" % depFile,
+ "@echo 'top_builddir=$(top_builddir)' >> %s.tmp" % depFile,
+ "@echo 'targetfile=$$(top_builddir)/$(subdir)/%s' >> %s.tmp" % (outputFileName, depFile),
+ "@for file in `$(RCC) -list $(srcdir)/%s.qrc` ; do \\" % base,
+ " if test -r $$file; then \\",
+ " echo $$file >> %s.tmp;\\" % depFile,
+ " fi ;\\",
+ "done; mv %s.tmp %s" % (depFile, depFile)
+ ]
+
+ program.mfile.insertTarget(outputFile, [program.mfile.source + base + ".qrc"],
+ rules)
+
+ program.handle_source(outputBase, ".cpp")
+
+ program.cleanfiles.append(outputFileName)
+
+handlerbase.register_handler(RCCHandler())
+
diff --git a/unsermake/sourcehash.py b/unsermake/sourcehash.py
new file mode 100644
index 0000000..0ab4bf2
--- /dev/null
+++ b/unsermake/sourcehash.py
@@ -0,0 +1,141 @@
+import zlib
+import cPickle as pickle
+
+class Source:
+ def __init__(self, hash, mtime):
+ self.hash = hash
+ self.ctime = mtime # hash changed time
+ self.mtime = mtime
+
+class SourceHash:
+ def __init__(self, filename = ''):
+ try:
+ self.__src = pickle.load(file(filename,'r'))
+ except IOError:
+ self.__src = {}
+ except EOFError:
+ self.__src = {}
+ self.__dirty = False
+
+ def save(self, filename):
+ if self.__dirty:
+ pickle.dump(self.__src, file(filename,'w'))
+
+ def strip(self, filename):
+ data = file(filename).read()
+ white = " \t\n"
+ punct = r'!%^&*()-+=[]{};:~<>,./?#'
+
+ inWhite = [False]
+ inComment1 = False
+ inComment2 = False
+ inQuote = None
+ escaped = False
+
+ result = []
+
+ def setInWhite(white = True):
+ if inWhite[0] != white:
+ inWhite[0] = white
+ if white:
+ result.append(' ')
+
+ # assume starts with whitespace
+ prev = ' '
+ setInWhite()
+ for char in data:
+ # ending comments
+ if inComment1 and char == '\n' and prev != '\\':
+ inComment1 = False
+ elif inComment2 and char == '/' and prev == '*':
+ inComment2 = False
+ char = ' ' # /**// is not start of //
+ # in comments - ignore
+ elif inComment1 or inComment2:
+ pass
+ # ending quote
+ elif char == inQuote and not escaped:
+ inQuote = False
+ result.append(char)
+ # in quote - copy
+ elif inQuote:
+ if char == '\\':
+ escaped = not escaped
+ else:
+ escaped = False
+ result.append(char)
+ # starting comments - strip previous /
+ elif char == '/' and prev == '/':
+ # remove /
+ result.pop()
+ # find out if we were in whitespace
+ if result[-1] == ' ':
+ inWhite[0] = True
+ inComment1 = True
+ setInWhite()
+ elif char == '*' and prev == '/':
+ result.pop()
+ if result[-1] == ' ':
+ inWhite[0] = True
+ inComment2 = True
+ setInWhite()
+ # starting quote
+ elif char == '"' or char == "'":
+ escaped = False
+ inQuote = char
+ # assume whitespace before quote
+ setInWhite()
+ setInWhite(False)
+ result.append(char)
+ # whitespace - collapse to single space
+ elif char in white:
+ setInWhite()
+ # other character: record
+ else:
+ punc = char in punct
+ ppunc = prev in punct
+ # assume space between punctuation and identifiers
+ if punc != ppunc:
+ setInWhite()
+ result.append(char)
+ setInWhite(False)
+ prev = char
+ # assume ends with whitespace
+ setInWhite()
+
+ if inQuote:
+ print 'Warning: %s ends in quote' % filename
+ if inComment1 or inComment2:
+ print 'Warning: %s ends in comment' % filename
+
+ return ''.join(result)
+
+ def hash(self, filename):
+ return zlib.adler32(self.strip(filename))
+
+ def mtime(self, filename, real_mtime):
+ try:
+ try:
+ src = self.__src[filename]
+ if src.mtime == real_mtime:
+ # cache up-to-date
+ return src.ctime
+ new_hash = self.hash(filename)
+ src.mtime = real_mtime
+ self.__dirty = True
+ if src.hash != new_hash:
+ # file changed
+ src.hash = new_hash
+ src.ctime = real_mtime
+ return real_mtime
+ else:
+ # file appears unchanged
+ print 'skipping %s' % filename
+ return src.ctime
+ except KeyError:
+ myhash = self.hash(filename)
+ self.__src[filename] = Source(myhash, real_mtime)
+ self.__dirty = True
+ return real_mtime
+ except IOError:
+ return real_mtime
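The point of `SourceHash` is that `strip()` normalizes whitespace and drops `//` and `/* */` comments before hashing, so a comment-only edit yields the same Adler-32 checksum and the cached change-time can be reused. A much simplified sketch of that idea in current Python, using regexes instead of the character-by-character state machine above (so, unlike the real code, it would mangle comment markers inside string literals):

```python
import re
import zlib

def strip_source(text):
    # Drop block comments, then line comments, then collapse whitespace.
    text = re.sub(r'/\*.*?\*/', ' ', text, flags=re.S)
    text = re.sub(r'//[^\n]*', ' ', text)
    return ' '.join(text.split())

def source_hash(text):
    # Same checksum function the patch uses: zlib.adler32 over the
    # stripped source (bytes required on current Python).
    return zlib.adler32(strip_source(text).encode())
```

With this, `source_hash("int x = 1; // old\n")` equals `source_hash("int x = 1; /* new */\n")`, which is exactly the case where `mtime()` above returns the cached `ctime` instead of the new modification time.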
diff --git a/unsermake/target.py b/unsermake/target.py
new file mode 100644
index 0000000..69cebd4
--- /dev/null
+++ b/unsermake/target.py
@@ -0,0 +1,293 @@
+import types, string, re, os, glob
+import utilities
+
+just_print = False
+num_todo = 0
+print_progress = False
+precise_print_progress = False
+
+class PhonyConflict(Exception):
+ pass
+
+class DefValue:
+ def __init__(self, value):
+ #assert(type(value) == types.StringType)
+ self.value = value
+
+ def __repr__(self):
+ return "-" + self.value + "-"
+
+class Target:
+ def __init__(self, target, deps, rules, user_specified, phony):
+ self.target = target
+ if type(deps) == types.StringType:
+ deps = string.split(deps)
+ self.deps = []
+ for dep in deps:
+ self.deps.append(DefValue(dep))
+ else:
+ assert(type(deps) == types.ListType)
+ self.deps = []
+ for dep in deps:
+ if isinstance(dep, DefValue):
+ self.deps.append(dep)
+ else:
+ self.deps.append(DefValue(dep))
+
+ if type(rules) == types.StringType:
+ self._rules = [ rules ]
+ elif user_specified:
+ self._rules = []
+ for rule in rules:
+ striped = string.strip(rule)
+ if len(striped) and not striped[0] == '#':
+ if not rule[0] == '\t':
+ if user_specified:
+ raise 'rule %s doesn\'t start with <tab>' % target
+ self._rules.append(string.rstrip(rule))
+ else:
+ self._rules.append(string.rstrip(rule[1:]))
+ else:
+ self._rules = rules[:]
+ self.user_specified = user_specified
+ self.is_phony = phony
+ self.changed = False
+ self.expanded = False
+
+ def __repr__(self):
+ return self.target + ":"
+
+ ret = self.target + ": " + self.deps_string()
+ for rule in self._rules:
+ ret += "\n\t" + rule
+ if self._rules:
+ ret += "\n"
+ return ret
+
+ def has_rules(self):
+ return len(self._rules) != 0
+
+ def merge(self, targ):
+ assert(self.target == targ.target)
+ if not self.is_phony == targ.is_phony:
+ raise PhonyConflict
+
+ self.expanded = self.expanded and targ.expanded
+ self.compile_target = self.compile_target or targ.compile_target
+
+ for dep in targ.deps:
+ self.deps.append(dep)
+
+ if self._rules == targ._rules:
+ return
+ elif len(self._rules) and len(targ._rules):
+ if not self.user_specified or not targ.user_specified:
+ print self._rules
+ print targ._rules
+ raise ("two targets named '%s' define rules!" % self.target)
+ else:
+ if targ.user_specified and len(targ._rules):
+ self.user_specified = 1
+ self._rules.extend(targ._rules)
+
+ def deps_string(self):
+ output = ""
+ for dep in self.deps:
+ if isinstance(dep, DefValue):
+ output += " " + dep.value
+ else:
+ output += " " + dep
+ if len(output):
+ return output[1:]
+ return output
+
+ def print_out(self, output):
+ output.append('%s: %s\n' % (self.target, self.deps_string()))
+ for rule in self._rules:
+ if rule[0] == '\002':
+ output.append('%s\n' % string.replace(rule[1:], '\001', '\\\n'))
+ else:
+ output.append('\t%s\n' % string.replace(rule, '\001', '\\\n'))
+ if len(self._rules):
+ output.append("\n")
+
+ def _used_defines_str(self, str):
+ match = utilities.variablesre.match(str)
+ if not match:
+ return []
+ if not len(match.group(3)):
+ return [match.group(2)]
+ else:
+ return [match.group(2)] + self._used_defines_str(match.group(3))
+
+ def used_defines(self):
+ res = []
+ for dep in self.deps:
+ if type(dep) == types.StringType:
+ res.append(dep)
+ else:
+ res.extend(self._used_defines_str(dep.value))
+
+ for rule in self._rules:
+ strings = string.split(rule)
+ for _str in strings:
+ res.extend(self._used_defines_str(_str))
+ return res
+
+ def expand_target(self, makefile):
+ ntarg = makefile.expand(self.target)
+ if ntarg == self.target:
+ return
+ try:
+ del makefile.targets[self.target]
+ except KeyError:
+ pass
+
+ if not len(ntarg):
+ return
+
+ ntargs = string.split(ntarg)
+
+ if len(ntargs) == 1:
+ self.target = ntarg
+ makefile.addTarget(self)
+ else:
+ self.target = ntargs[0]
+ makefile.addTarget(self)
+ for index in range(1, len(ntargs)):
+ nt = Target(ntargs[index], self.deps, self._rules,
+ phony=self.is_phony,
+ user_specified=0)
+ nt.user_specified = self.user_specified
+ nt.compile_target = self.compile_target
+ makefile.addTarget(nt)
+
+ def expand_rules(self, makefile, replace=1):
+ rlen = len(self._rules)
+ for index in range(rlen):
+ line = makefile.expand(self._rules[index])
+ if replace:
+ line = makefile.replace_autoconf(line)
+ line = string.replace(line, '$@', self.target)
+ line = string.replace(line, '$<', self.deps_string())
+ self._rules[index] = line
+
+ def expand_deps(self, makefile):
+ if not self.user_specified:
+ ndeps = self.deps
+ self.deps = []
+ for dep in ndeps:
+ if type(dep) == types.StringType:
+ self.deps.append(dep)
+ else:
+ deps = string.split(makefile.expand(dep.value))
+ for dep2 in deps:
+ self.deps.append(dep2)
+ return
+ deps = self.deps
+ self.deps = []
+ for dep in deps:
+ if type(dep) == types.StringType:
+ edeps = [dep]
+ else:
+ edeps = string.split(makefile.expand(dep.value))
+ for dep2 in edeps:
+ gdeps = glob.glob(dep2)
+ if len(gdeps):
+ for dep3 in gdeps:
+ self.deps.append(dep3)
+ else:
+ self.deps.append(dep2)
+
+ def expand(self, makefile):
+ if self.expanded:
+ return
+ self.expand_target(makefile)
+ self.expand_rules(makefile, replace=0)
+ self.expand_deps(makefile)
+ self.expanded = True
+
+ def _flags(self, current, ignore_exit=False, echo_line=True):
+ if current[0] == '@':
+ return self._flags(current[1:], ignore_exit, False)
+
+ if current[0] == '-':
+ return self._flags(current[1:], True, echo_line)
+
+ return (current, ignore_exit, echo_line)
+
+ def call_command(self, makefile):
+ self.changed = 1
+ endswith = 0
+ current = ''
+ targetre = re.compile('^(.*[^$])\$@(.*)')
+ depsre = re.compile('^(.*[^$])\$\?(.*)$')
+ fdepre = re.compile('^(.*[^$])\$<(.*)$')
+ self.expand_rules(makefile, replace=0)
+ linere = re.compile('\s*\001\s+')
+ first_line = True
+
+ for line in self._rules:
+
+ if line[0] == '\002':
+ line = linere.sub(' ', line[1:])
+ else:
+ line = linere.sub(' ', line)
+
+ line = string.lstrip(line)
+ if endswith:
+ current += line
+ else:
+ current = line
+ endswith = current[-1] == '\\'
+ if endswith:
+ current = current[:-1]
+ continue
+
+ current, ignore_exit, echo_line = self._flags(current)
+
+ while True:
+ match = targetre.match(current)
+ if not match:
+ break
+ current = match.group(1) + self.target + match.group(2)
+ while True:
+ match = depsre.match(current)
+ if not match:
+ break
+ current = match.group(1) + self.deps_string() + match.group(2)
+ while True:
+ match = fdepre.match(current)
+ if not match:
+ break
+ current = match.group(1) + self.deps[0] + match.group(2)
+
+ current = string.replace(current, "$$", "$")
+
+ if echo_line:
+ if print_progress and not first_line:
+ print " " + current
+ elif precise_print_progress and not first_line:
+ print " " + current
+ else:
+ print current
+ first_line = False
+
+ if just_print:
+ print current
+ continue
+
+ pid = os.fork()
+ if not pid:
+ if self.user_specified:
+ os.chdir(makefile.asubdir)
+ os._exit(os.execl("/bin/sh", "/bin/sh", "-c", current))
+ else:
+ (pid, exit_code) = os.waitpid(pid, 0)
+ if exit_code and not ignore_exit:
+ if os.WIFEXITED(exit_code):
+ return os.WEXITSTATUS(exit_code)
+ return exit_code
+
+ return 0
+
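The fork/exec/waitpid sequence at the end of call_command() above can be sketched in isolation. The helper below (`run_rule_line` is a hypothetical name, not part of the commit) shows the same pattern: hand one rule line to `/bin/sh -c`, then report a non-zero exit status unless it was flagged to be ignored (the `-` rule prefix handled by `_flags`).

```python
import os

def run_rule_line(cmd, ignore_exit=False):
    # fork a child that replaces itself with "/bin/sh -c <cmd>",
    # mirroring the pattern used by call_command() above
    pid = os.fork()
    if pid == 0:
        try:
            os.execl("/bin/sh", "/bin/sh", "-c", cmd)
        except OSError:
            os._exit(127)  # exec itself failed in the child
    _, status = os.waitpid(pid, 0)
    if status and not ignore_exit:
        if os.WIFEXITED(status):
            return os.WEXITSTATUS(status)
        return status
    return 0
```

With `ignore_exit=True` a failing command still returns 0, which is how `-`-prefixed rule lines keep the build going.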
diff --git a/unsermake/texinfo.um b/unsermake/texinfo.um
new file mode 100644
index 0000000..dc784ef
--- /dev/null
+++ b/unsermake/texinfo.um
@@ -0,0 +1,17 @@
+# -*-python-*-
+# vim: ts=4
+
+import handlerbase
+
+class TEXInfo_Handler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "texinfo"
+
+ def create(self, amfile):
+ # TODO: support for texinfo
+ for define in amfile.defines():
+ if define.endswith( '_TEXINFOS' ):
+ amfile.del_define(define)
+ amfile.add_define(define, [])
+
+handlerbase.register_handler(TEXInfo_Handler())
diff --git a/unsermake/unsermake b/unsermake/unsermake
new file mode 100755
index 0000000..cb1c495
--- /dev/null
+++ b/unsermake/unsermake
@@ -0,0 +1,16 @@
+#! /bin/sh
+
+self="$0"
+if test -L "$self"; then
+ try=`readlink $self 2>/dev/null` && self=$try
+fi
+mods=`dirname $self`
+directory=`dirname $mods`
+mod=`basename $mods`
+if test -n "$PYTHONPATH"; then
+ export PYTHONPATH=$directory:$PYTHONPATH
+else
+ export PYTHONPATH=$directory
+fi
+exec python -c "import $mod; $mod.main()" --modules $mods "$@"
+
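The launcher's dirname/basename arithmetic can be expressed in Python as well. `bootstrap_paths` below is a hypothetical helper, not part of the commit, returning the directory the shell script prepends to PYTHONPATH and the package name it imports (one level of symlink resolved, as the `readlink` branch above does).

```python
import os

def bootstrap_paths(script_path):
    # resolve one level of symlink, like the readlink branch above
    if os.path.islink(script_path):
        target = os.readlink(script_path)
        script_path = os.path.join(os.path.dirname(script_path), target)
    mods = os.path.dirname(script_path)   # e.g. /opt/unsermake
    directory = os.path.dirname(mods)     # goes onto PYTHONPATH
    mod = os.path.basename(mods)          # package name to import
    return directory, mod
```

This is why the script can live anywhere: the package directory's parent becomes importable, and `python -c "import $mod; $mod.main()"` does the rest.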
diff --git a/unsermake/utilities.py b/unsermake/utilities.py
new file mode 100644
index 0000000..f2407b2
--- /dev/null
+++ b/unsermake/utilities.py
@@ -0,0 +1,169 @@
+# -*-python-*-
+# vim: ts=4
+
+import os, re
+
+red = "^[[2;31m"
+green = "[2;32m"
+blue = "[34m"
+yellow = "[1;33m"
+cyan = "[36m"
+normal = "[0m"
+progress_color="[2;35m"
+bold = "[0m"
+compile_text = green + "compiling" + bold
+creating_text = blue + "creating" + bold
+linking_text = yellow + "linking" + bold
+installing_text = cyan + "installing" + bold
+uninstalling_text = cyan + "uninstalling" + bold
+
+def clearAllColors():
+ global compile_text,creating_text,linking_text,installing_text,uninstalling_text,progress_color,normal
+ compile_text = "compiling"
+ creating_text = "creating"
+ linking_text = "linking"
+ installing_text = "installing"
+ uninstalling_text = "uninstalling"
+ progress_color = ""
+ normal = ""
+
+variablere = re.compile('\$[{(]([^({})]*)[})]$')
+variablesre = re.compile('(.*?)\$[{(]([^({})]*)[})](.*)')
+autoconfre = re.compile('(.*?)@([^@]*)@(.*)')
+extre = re.compile('(.*)(\.[^.]*)$')
+definere = re.compile('\s*([^+=\s]*)\s*(\+?)=(.*)$')
+cppext = [".cpp", ".cc", ".C", ".cxx", ".c++"]
+hext = [".h", ".H", ".hh", ".hxx", ".hpp", ".h++"]
+
+_topsrcdir = None
+top_builddir = None
+top_builddir_abs = None
+subst_vars = {}
+environment_vars = {}
+true_conds = []
+false_conds = []
+
+configure_in = None
+our_path = None
+
+__pychecker__ = 'unusednames=_topsrcdir'
+
+def list_source_files(dirname):
+ cppfiles = []
+ hfiles = []
+ for myfile in os.listdir(dirname):
+ match = extre.match(myfile)
+ if not match:
+ continue
+ base = match.group(1)
+ ext = match.group(2)
+
+ if not ext in cppext and not ext in hext:
+ continue
+ if base.endswith('.moc') or len(base) == 0 or base[0] == '.':
+ continue
+ found = 0
+ for cpp_ext in cppext:
+ all_cpp = ".all_%s.%s" % (cpp_ext, cpp_ext)
+ if myfile.endswith(all_cpp):
+ found = 1
+ if found:
+ continue
+ if ext in cppext:
+ cppfiles.append((base, ext))
+ else:
+ hfiles.append((base, ext))
+
+ return (cppfiles, hfiles)
+
+default_force = 1
+
+def write_if_changed(filename, lines, force = 0):
+ # I'm not convinced any longer we can make use of this
+ force = default_force
+ if not force and os.path.exists(filename):
+ index = 0
+ while index < len(lines):
+ if string.find(lines[index], '\n', 0, -1) != -1:
+ split = string.split(lines[index], '\n')
+ if not len(split[-1]):
+ split = split[:-1]
+ for si in range(0, len(split)):
+ split[si] = split[si] + '\n'
+ lines[index:index+1] = split
+ index = index + len(split)
+ else:
+ index = index + 1
+ new_lines = open(filename).readlines()
+ if new_lines == lines:
+ return
+ else:
+ print filename, 'changed'
+ pass
+
+ open(filename, 'w').writelines(lines)
+
+import string
+
+def parse_autoconf(full=0):
+ tracelines = []
+
+ configure_file = "configure.in"
+ if os.path.exists("configure.ac"):
+ configure_file = "configure.ac"
+ tracevalid = not full and os.path.exists(".autoconf_trace")
+ if tracevalid and os.path.getmtime(configure_file) > \
+ os.path.getmtime(".autoconf_trace"):
+ tracevalid = 0
+
+ if tracevalid and os.path.getmtime("aclocal.m4") > \
+ os.path.getmtime(".autoconf_trace"):
+ tracevalid = 0
+
+ if tracevalid:
+ tracelines = open( ".autoconf_trace" ).readlines()
+
+ if not len(tracelines):
+ autoconf = os.environ.get('AUTOCONF', 'autoconf')
+ traces = autoconf + " -t AC_SUBST -t AC_CONFIG_FILES -t AC_CONFIG_AUX_DIR -t AC_CONFIG_HEADERS"
+ tracelines = os.popen( traces ).readlines()
+ if not len(tracelines):
+ print 'couldn\'t call', traces
+ sys.exit(1)
+ open(".autoconf_trace", 'w').writelines(tracelines)
+
+ subst = {}
+ files = []
+ headers = []
+ auxdir = '.'
+ for line in tracelines:
+ splitted = string.split(string.strip(line), ':')
+ if splitted[2] == 'AC_SUBST':
+ subst[splitted[3]] = ""
+ elif splitted[2] == 'AC_CONFIG_FILES':
+ files.extend( string.split( splitted[3] ) )
+ elif splitted[2] == 'AC_CONFIG_HEADERS':
+ headers.extend( string.split( splitted[3] ) )
+ elif splitted[2] == 'AC_CONFIG_AUX_DIR':
+ auxdir = splitted[3]
+ else:
+ print splitted, 'doesn\'t contain the traced macros'
+
+ if auxdir == '.':
+ return subst, files, headers, '$(top_srcdir)'
+ else:
+ return subst, files, headers, '$(top_srcdir)/' + auxdir
+
+def canon_name(name):
+ return re.sub('[^a-zA-Z0-9_]', '_', name)
+
+sys_exit_code = 0
+
+import sys
+
+def print_error(string):
+ sys.stderr.write("ERROR:")
+ sys.stderr.write(string)
+ global sys_exit_code
+ sys_exit_code = 1
+
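parse_autoconf() above consumes `autoconf -t` trace output, one colon-separated record per line (`file:line:macro:args`). The standalone sketch below (function name hypothetical) isolates the dispatch on the macro field:

```python
def parse_trace(lines):
    # dispatch on the macro field of "file:line:macro:args" records,
    # the same format parse_autoconf() above splits on ':'
    subst, files, headers, auxdir = {}, [], [], '.'
    for line in lines:
        parts = line.strip().split(':')
        macro = parts[2]
        args = parts[3] if len(parts) > 3 else ''
        if macro == 'AC_SUBST':
            subst[args] = ""
        elif macro == 'AC_CONFIG_FILES':
            files.extend(args.split())
        elif macro == 'AC_CONFIG_HEADERS':
            headers.extend(args.split())
        elif macro == 'AC_CONFIG_AUX_DIR':
            auxdir = args
    return subst, files, headers, auxdir
```

Caching the raw trace lines in `.autoconf_trace`, as the original does, means autoconf only has to be re-run when configure.in/configure.ac or aclocal.m4 is newer than the cache.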
diff --git a/unsermake/yacc.um b/unsermake/yacc.um
new file mode 100644
index 0000000..7ada228
--- /dev/null
+++ b/unsermake/yacc.um
@@ -0,0 +1,61 @@
+# -*-python-*-
+# vim: ts=4
+
+import handlerbase
+
+class YaccHandler(handlerbase.HandlerBase):
+ def __init__(self):
+ self.name = "yacc"
+
+ def extensions(self):
+ return ['.yy', '.y']
+
+ def source(self, program, base, ext):
+
+ rulef = program.mfile
+
+ sourcedir = ''
+ if program.mfile.in_srcdir(base + ext):
+ sourcedir = rulef.source
+ else:
+ sourcedir = rulef.build
+
+ rext = '.cc'
+ if ext == '.y':
+ rext = '.c'
+
+ output = rulef.build + base
+ compile = 'sdir=`cd %s && pwd`; cd %s && $(YACC) ' % (sourcedir, rulef.build)
+ compile += program.handle_variable('YFLAGS') + '$$sdir/%s' % (base + ext)
+ rules = [ compile ]
+ rules.append("sed '/^#/ s|y\.tab\.c|%s|' %s > %s ;\\" % (base + rext, rulef.build \
+ 'y.tab.c', output + rext)) + rules.append("rm -f " + rulef.build + 'y.tab.c')
+ rules.append('if test -f ' + rulef.build + 'y.tab.h; then \\')
+ rules.append('to=`echo "%s_H" | sed \\' % base)
+ rules.append(" -e 'y/abcdefghijklmnopqrstuvwxyz/ABCDEFGHIJKLMNOPQRSTUVWXYZ/' \
\\") + rules.append(" -e 's/[^ABCDEFGHIJKLMNOPQRSTUVWXYZ]/_/g'`; \\")
+ rules.append('sed "/^#/ s/Y_TAB_H/$$to/g" ' + rulef.build + 'y.tab.h > ' + output \
+ ".ht; \\") + rules.append('rm -f ' + rulef.build + 'y.tab.h; \\')
+ rules.append('if cmp -s %s.ht %s.h; then \\' % (output, output))
+ rules.append(' rm -f %s.ht ;\\' % output)
+ rules.append('else \\')
+ rules.append(' mv %s.ht %s.h; \\' % (output, output))
+ rules.append('fi; \\')
+ rules.append('fi')
+ rules.append('if test -f y.output; then \\')
+ rules.append(' mv y.output %s.output; \\' % output)
+ rules.append('fi')
+
+ rulef.insertTarget(rulef.build + base + rext, [sourcedir + base + ext], rules)
+
+ program.handle_source(base, rext)
+ program.cleanfiles.extend([base + rext])
+
+ rulef.insertTarget(rulef.build + base + '.h',
+ rulef.build + base + rext,
+ ["@if test ! -f $@; then \\",
+ "rm -f %s; \\" % (rulef.build + base + rext),
+ "$(MAKE) %s; \\" % (rulef.build + base + rext),
+ "else :; fi"])
+
+handlerbase.register_handler(YaccHandler())
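The sed pipeline in the rules above computes the include-guard name by uppercasing `<base>_H` and mapping every character outside A-Z to an underscore, then rewrites `Y_TAB_H` in y.tab.h to that name. `guard_name` is a hypothetical Python equivalent of that transformation, for illustration only:

```python
import re

def guard_name(base):
    # uppercase "<base>_H", then replace anything outside A-Z with '_',
    # matching the sed y/.../.../ + s/[^A-Z]/_/g pipeline in the rules above
    return re.sub('[^A-Z]', '_', (base + '_H').upper())
```

Note that digits and punctuation in the basename all collapse to underscores, so distinct grammars with similar names can, in principle, yield the same guard.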