So after the background tutorials, let's actually hack something. The first hack should obviously be very simple, so I've chosen an Ubuntu game called "loopy".
Installation: Go to Synaptic Package Manager and install the sgt-puzzles package. This will add a Logic submenu under Applications->Games. There is a game called Loopy in that Logic menu.
Aim: Whenever the game is started, we see a rectangular grid with numbers inside. The color of the lines forming the rectangles is yellowish-green, which does not match the gray background. So our aim in this experiment is to change the color of those lines.
Source Code: I've taken it from https://launchpad.net/ubuntu/+source/sgt-puzzles. Grab the source for your own distribution.
Build: This package does not have a configure script, but a Makefile is present in the build directory. So just go to the root folder of the game's source code and type make.
Make sure that it builds: check for any error messages when the build process finishes, and then run our loopy game from the local build using ./loopy.
Remember to put './' so that the local build runs instead of any installed version.
So far, so good. We have successfully compiled the code locally. Now all we need to do is find the location where the color of that line is defined and change it.
Let's start the actual code analysis and modification now. An overview of the code directory will tell you that all the code is in the same folder, with icons in a subfolder. So let's start cscope analysis on this code. Type cscope -R -q. '-R' is not strictly necessary since the code is in only one directory, but its usage will do no harm. '-q' builds an additional inverted index, so that symbol searches inside the code are faster.
OK, we have a cscope index built now. What next? Let's start with the standard C entry point, the 'main' function. Searching for global definitions of main in cscope lists a number of main functions, I guess one for each game. Which means each game is more or less independent code in its own C file. So for any particular game we just need to look at its code file. Our job is now reduced to analysis of the C file for the loopy game. If we look into the code folder we'll find loopy.c. So let's open it in the Geany code editor.
Up to this point it was all predefined; no need to deviate from this procedure for any codebase. After this point, however, most of the work is intuition based. We need to look at multiple things and guess where our code of interest might be.
What we are looking for in this game is the color used for drawing lines on the grid. So let's see if there is any function for drawing a line in the game. We pretty much have to trust function and variable names now.
In Geany, on the left-hand side, in the list of functions, no function starts with 'draw'. So let's see if there is anything starting with 'line'. No function with 'line' at the start either. OK, let's read through the names of the functions in the file then. We have a function named game_redraw. Now that is close. The name suggests it redraws the game board each time some activity happens. Let's take a closer look at the function.
Within the first 10 lines of the function body, there is a call to draw_rect. Looks like we are on the right track. A cscope global-definition search for draw_rect gives 2 entries. Trusting the filenames, let's go with the one in draw.c. I pulled draw.c into Geany now. The function list for draw.c contains draw_rect, and near it we can see a draw_line too. We're closing in here. Now we need to see calls to draw_line from our loopy.c.
So, let's go back to loopy.c, go to the function game_redraw and search for a call to draw_line using Geany's search. We have one. In the code that I have, the call is located on line 3544. Depending on your code version it may vary, but the call should be there. Now the prototype of draw_line from draw.c tells us that the last parameter passed is the line color. That is what we should look for in loopy.c. The last parameter in the call is line_colour. We can guess that it is a variable which gets the proper color as per the action. So let's find where that yellowish-green color is assigned. For that, let's look for assignments to line_colour, first in the same function.
A look at the 50-odd code lines above the call to draw_line gives us a few lines where line_colour is assigned some values. All values look like COL_XXX, which suggests they might be constants defined using #define. So, let's search for the global definition of the COL_ constants in cscope. The entry for loopy.c takes us to an enum. So we were wrong about the #define. Now we need to find out how this enum is used to get the exact color. Since every game has its own definition of the COL_XXX enum (as pointed out by cscope), we can search for the enum-to-color conversion in the local file only.
Let's go to line 1 of loopy.c then and start searching for any one enum value. I searched for COL_FAINT. The first hit takes us to the enum definition again, but the next one is interesting. It takes us to a function named game_colours. And that function simply assigns 3 numbers to the ret array indexed by the COL_ enum values. Bingo! These must be the red, green and blue components for every color.
Great, now all we need to find is the yellowish color assigned somewhere, and change it. Again, trusting names and a bit of knowledge of the game rules tells me it should be either COL_LINEUNKNOWN or COL_FAINT. But COL_FAINT is the color 0.9, 0.9, 0.9, which is as good as white. So by elimination we have only COL_LINEUNKNOWN, which is 0.8, 0.8, 0.0. I want to make it greenish, so I'll change it to 0.1, 0.8, 0.1 (mainly reducing the red component).
We're nearly done. All we need to do is recompile and check whether the color has changed. So let's run make. It builds fine. Finally, run ./loopy and see that the magic has happened.
That's it for this post. Please post comments for any suggestions, doubts or mistakes in this post.
ASD
Sunday, November 4, 2012
Tuesday, October 30, 2012
Divisibility Rules Explained
I am no maths genius, and this may be one of the few posts about mathematics on this blog. Here I'll try to explain a bit why divisibility rules work, and maybe try to derive one or two new rules according to that explanation. There are tonnes of articles on internet on this topic, and this is just another one in that crowd.
Basic mathematical concepts used:
1. Every n digit number can be written as the sum of its n digits, each multiplied by the respective power of 10. These sums can be grouped and compressed suitably.
E.g.
5 = 5*10^0 = 5*1
13 = 1*10^1 + 3*10^0 = 1*10 + 3*1 = 13*1
467 = 4*10^2 + 6*10^1 + 7*10^0 = 4*100 + 6*10 + 7*1 = 46*10 + 7
1234 = 123*10 + 4 = 12*100 + 34 = 1004 + 23*10
2. If X is divisible by Y then for every integer a, a*X is divisible by Y.
E.g. 6 divides 18, hence 6 divides 18 * 5 = 90
3. If Z = X + Y, and X, Y are divisible by T, then Z is divisible by T. This applies even if any of X, Y have negative value.
E.g. 13 divides 65, 91. Hence 13 divides 65 + 91 = 156 and 13 divides -65 + 91 = 26
4. If C = A*B and C divides X, then A and B both divide X.
E.g. 6 = 3*2, 6 divides 12 thus 3 and 2 divide 12.
That's all we need. So let's start with the simple tests of 2, 4, 8.
Test of 2: The last digit is even (divisible by 2).
Test of 4: The last 2 digits are divisible by 4.
Test of 8: The last 3 digits are divisible by 8.
All three can be explained by same logic.
Consider an n digit number A1A2A3..An = A1*10^(n-1) + .. + An*10^0 = (A1A2..An-1)*10 + An.
Let A = A1A2..An-1 and B = An. So effectively AB is an n digit number where A itself is an n-1 digit number and B is a single digit. Let's write this as length(A) = n-1 and length(B) = 1.
So AB = A*10 + B
Test of 2 says AB divisible by 2 if B divisible by 2.
now 10 is divisible by 2.
=> A*10 divisible by 2 (from Concept 2 above)
so if B divisible by 2 then
A*10 + B divisible by 2 (from concept 3)
=> AB divisible by 2.
Similarly for test of 4,
length(A) = n-2, length(B) = 2
Test of 4 says: B divisible by 4 => AB divisible by 4.
AB = A*100 + B (length(B) = 2)
now 100 divisible by 4
=> A*100 divisible by 4
=> A*100 + B divisible by 4
=> AB divisible by 4
Test of 8:
length(A) = n-3, length(B) = 3
=> AB = A*1000 + B
Test of 8 says, B divisible by 8 => AB divisible by 8.
now 1000 divisible by 8
=> A*1000 divisible by 8
=> A*1000 + B divisible by 8
=> AB divisible by 8
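All three proofs boil down to the same computation: only the last k digits matter, because A*10^k is always divisible by 2^k. Here is a quick Python sketch of mine (not part of the original post) that checks this against plain division:

```python
# Tests of 2, 4, 8: a number is divisible by 2**k iff its last k digits are.
def last_digits_test(n, k):
    divisor = 2 ** k          # 2, 4 or 8
    b = n % 10 ** k           # B: the last k digits
    return b % divisor == 0   # test only B, ignoring A * 10**k

# Check against direct division for the first few thousand numbers.
for n in range(10000):
    for k in (1, 2, 3):
        assert last_digits_test(n, k) == (n % 2 ** k == 0)
```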
Test of 3: The sum of the digits is divisible by 3
Test of 9: The sum of the digits is divisible by 9
Test of 3 says 4 digit number ABCD = A*1000 + B*100 + C*10 + D is divisible by 3 if A+B+C+D is divisible by 3.
An interesting property of the numbers 10, 100, 1000... is that all of them are a multiple of 3 plus 1, i.e. 10 = 9 + 1, 100 = 99 + 1. In fact they are a multiple of 9 plus 1, which is useful in the test of 9.
Now let's assume ABCD is divisible by 3.
=> A*1000 + B*100 + C*10 + D is divisible by 3.
=> 999*A + A + 99*B + B + 9*C + C + D is divisible by 3
removing multiples of 3 from above expression
=> A + B + C + D is divisible by 3
This argument applies as-is to the test of 9, and it can easily be extended to n digit numbers.
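The same argument, written as a quick Python check (my sketch, not from the post): stripping out the 9, 99, 999... multiples leaves exactly the digit sum.

```python
# Tests of 3 and 9: a number is divisible iff its digit sum is.
def digit_sum(n):
    return sum(int(d) for d in str(n))

# Verify on an arbitrary sample of numbers.
for n in range(1, 100000, 7):
    assert (n % 3 == 0) == (digit_sum(n) % 3 == 0)
    assert (n % 9 == 0) == (digit_sum(n) % 9 == 0)
```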
Test of 7: Double the last digit and subtract it from the rest of the number; the number is divisible by 7 if the answer is divisible by 7.
This is an interesting one. Again lets assume number AB, length(A) = n-1, length(B) = 1 to be divisible by 7.
=> A*10 + B divisible by 7
=> 2*(A*10 + B) divisible by 7
=> A*20 + 2*B divisible by 7
=> 21*A - A + 2*B divisible by 7
=> 2*B - A divisible by 7
=> A - 2*B divisible by 7
which is the test statement.
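As a quick sanity check (my own sketch, not part of the post), the test applied recursively in Python. The doubling step is safe because 7 does not divide 2, so multiplying by 2 never introduces or destroys divisibility by 7:

```python
# Test of 7: drop the last digit, subtract twice that digit, repeat.
def divisible_by_7(n):
    n = abs(n)
    while n > 99:                 # reduce until small enough to eyeball
        a, b = divmod(n, 10)      # a = rest of the number, b = last digit
        n = abs(a - 2 * b)
    return n % 7 == 0

assert divisible_by_7(343)        # 34 - 2*3 = 28 = 4*7
assert not divisible_by_7(100)
```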
Test of 11: Sum every second digit and subtract all the other digits; the number is divisible by 11 if the answer is divisible by 11.
11 has a very interesting property: all odd powers of 10, i.e. 10^(2n+1), are 1 short of a multiple of 11 (e.g. 11, 1001), whereas all even powers of 10, i.e. 10^(2n), are 1 in excess of a multiple of 11 (e.g. 99, 9999).
So let's assume 4 digit number ABCD which is divisible by 11.
=> A*1000 + B*100 + C*10 + D is divisible by 11.
=> A*1001 - A + 99*B + B + 11*C - C + D is divisible by 11.
=> B + D - (A + C) is divisible by 11, which is test statement.
This can be easily extended to n digit numbers.
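The alternating digit sum as a quick Python check (a sketch of mine, not in the original post); signs alternate starting from the rightmost digit, matching B + D - (A + C) for ABCD:

```python
# Test of 11: alternating sum of digits, taken from the right.
def alt_digit_sum(n):
    digits = [int(d) for d in str(n)]
    return sum(d if i % 2 == 0 else -d
               for i, d in enumerate(reversed(digits)))

# Verify against direct division.
for n in range(1, 20000):
    assert (n % 11 == 0) == (alt_digit_sum(n) % 11 == 0)
```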
The tests of 5 and 10 can easily be proven by the same argument as the test of 2, since 10 is divisible by both.
So now that we have seen the logic behind most of the well-known tests, let's try to design our own. Let's take 13. I haven't seen any common test for 13, so let's see if anything simple enough comes out.
Now 13 does not show any property like 3, 9 and 11 do around powers of 10, so that class of tests is not applicable here. Let's try something like the test of 7 then.
Let AB be a n digit number with length(A) = n - 1 and length (B) = 1.
Let AB be divisible by 13.
=> A*10 + B is divisible by 13.
Now, as in the test of 7, let's try to make the coefficient of A equal to 1 while keeping the expression divisible by 13.
The nearest multiple of 13 to A*10 is 13*A; subtracting it leaves -3*A.
=> -3*A + B is divisible by 13.
If we multiply this expression by 4, we get -12*A, and 12 is one less than a multiple of 13.
=> -12*A + 4*B divisible by 13.
=> (-13*A + A) + 4*B divisible by 13.
=> A + 4*B is divisible by 13.
Just to remind you, this is only one of many possible tests for 13. We can derive others by applying different operations to the above expression; all of them remain valid as long as the operations follow the rules mentioned above.
So A + 4*B is a test for 13, which looks good enough to me. Let's try to state it in words:
A number is divisible by 13 if you multiply the last digit by 4, add it to the rest of the number, and the answer is divisible by 13. This can be applied recursively.
Let's verify if this is correct:
26459 is not divisible by 13. Let's test it with our expression:
=>2645 + 9*4 = 2681
=>268+4=272
=>27+8=35 => Not divisible by 13.
4632199 is divisible by 13. Let's see if our test shows it:
=>4632199
=>463219 + 36 = 463255
=>46325 + 20 = 46345
=>4634 + 20 = 4654
=>465 + 16 = 481
=>48 + 4 = 52
which is divisible by 13
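Here is that derived test as a short Python function (my sketch), reproducing the two worked examples above:

```python
# Derived test of 13: rest of the number + 4 * last digit, repeated.
def reduce_for_13(n):
    while n > 99:
        a, b = divmod(n, 10)   # a = rest of the number, b = last digit
        n = a + 4 * b
    return n

assert reduce_for_13(26459) == 35      # 35 is not divisible by 13
assert reduce_for_13(4632199) == 52    # 52 = 4 * 13
assert 4632199 % 13 == 0 and 26459 % 13 != 0
```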
We can apply a similar methodology to derive tests for other numbers.
Test of 17:
10x + y => -7x + y (subtracting 17x) => -35x + 5y (multiplying by 5)
=> x - 5y (adding 34x and flipping the sign)
Test of 19:
10x + y => 20x + 2y (multiplying by 2)
=> x + 2y (subtracting 19x)
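Both of these follow the same pattern: replace 10x + y with x + m*y, which preserves divisibility by p whenever 10m - 1 is a multiple of p (m = -5 for 17, m = 2 for 19, and likewise m = -2 for 7 and m = 4 for 13). A parameterized Python sketch of mine (feeding it only positive numbers for simplicity):

```python
# Generic "chop the last digit" reducer: n = 10a + b  ->  a + m*b.
# Valid for divisor p whenever 10*m - 1 is a multiple of p.
def reduce_with(n, m):
    while abs(n) > 99:
        a, b = divmod(n, 10)
        n = a + m * b
    return n

# Test of 17: x - 5y  ->  multiplier m = -5
assert reduce_with(221, -5) % 17 == 0      # 221 = 13 * 17
# Test of 19: x + 2y  ->  multiplier m = +2
assert reduce_with(437, 2) % 19 == 0       # 437 = 19 * 23
```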
So that's it for this post. If you know any interesting fact about any divisibility test mentioned (or not mentioned) here, please comment.
ASD
Wednesday, October 10, 2012
Overview of Build Systems
This is a very short overview of the build systems used in Linux, or in GNU
software in general. These tools have also been ported to Windows, but generally
only GNU tools ported to Windows (which use cygwin or mingw) use these build
systems. Tools aimed at Windows-only systems usually use Visual Studio build
environments.
So, a quick description of the basic terminology:
Build: This usually means compiling the source code and generating a binary
(executable) file. Optionally it includes features to scan the host system (the
system on which the code is being compiled) and adjust the build accordingly.
An example would be GUIs on Linux. Linux supports multiple GUI libraries, e.g.
GTK and Qt, so the build system may scan the host to determine which library
should be used for compiling the code.
Installer: This program installs the generated executables, shared libraries and
other build outputs to some location where the OS can find them easily while
executing the programs. Sometimes the installer is part of the build system
itself; sometimes the build tool and installer come as a pair.
Now let us see some very popular build systems:
1. Shell script: Used for very small and very simple tools, or for utterly
complex builds. For simple builds, it just contains a few lines to compile the
code files and generate and install the binaries.
2. Makefile: The most common build system on Unix and related environments. A
Makefile contains rules about source files and their dependencies on each other,
effectively building a dependency tree. Also, once built, it can detect which
files have changed in subsequent builds and recompile only those.
3. CMake: I am not too familiar with it, so I'll just mention that it is
another build system.
4. SCons: A build system written in Python. It gives you the flexibility of
specifying rules like a Makefile, plus the ability to add Python code to build
very complex build systems.
5. Configure scripts: These are not truly build systems; rather, they usually
pair up with Makefiles. These scripts are very versatile in the sense that they
detect the host environment and change the Makefiles accordingly. They can be
modified to make a generic build system which fits multiple host environments
very smoothly.
6. Automake and Autoconf: These are tools to generate Makefiles and configure
scripts from specification files. In the rest of this post, we'll talk about
these two with a few examples.
Building source code:
These steps are common to most codebases that use a configure script and a
Makefile for building. Changes to the command lines might be needed to alter
the default build options.
1. Make configure script executable if it is not already: chmod +x ./configure
2. ./configure (optionally --prefix=/custom/install/folder)
3. make
4. make install
5. make clean (only if you want to rebuild everything next time)
Autoconf: As the name indicates, this is a tool used to generate configure
scripts from an options file, usually named configure.ac or configure.in. That
file contains multiple macros with an "AC_" prefix. The most interesting ones
are AC_ARG_ENABLE and AC_ARG_WITH. Here is what the GNU manual says about these
macros:
AC_ARG_ENABLE: "If a software package has optional compile-time features, the
user can give configure command line options to specify whether to compile them.
The options have one of these forms:
--enable-feature[=arg]
--disable-feature"
Structure of this macro is: AC_ARG_ENABLE(feature, help-string, [action-if-given],
[action-if-not-given]). Here action-if-not-given simply means default value.
AC_ARG_WITH: "Some packages require, or can optionally use, other software
packages that are already installed. The user can give configure command line
options to specify which such external software to use. The options have one of
these forms:
--with-package[=arg]
--without-package"
Structure of this macro is: AC_ARG_WITH(package, help-string, [action-if-given],
[action-if-not-given]). Here action-if-not-given simply means default value.
So let's see our first example of an Autoconf options file:
1. Bash:
I am using the bash 4.2 source for this example. The source is available at
http://ftp.gnu.org/gnu/bash/bash-4.2.tar.gz. It has a configure.in file. Let's
open it. As I said, it contains lots of AC_ macros. Descriptions of those can be
found at GNU site. First AC_ARG_WITH macro is present on line number 106. We can
see which optional packages bash has like afs, bash-malloc, gnu-malloc etc. Help
string is the string displayed as description about that package. First
AC_ARG_ENABLE macro is found on line 211. Features like alias, array-variables
and debugger are specified here. Interestingly, none of these macros has a default
value specified. Let's find out where it is set. E.g. to find the default value
of the debugger feature, let's look at the action associated with it. The third argument of the
of debugger feature, let's see action associated with it. Third argument of
macro is opt_debugger=$enableval, which means enable value of this feature is
stored in opt_debugger. Simple grep command 'grep -n "opt_debugger"
configure.in |less' on configure.in shows its default value is set on line
number 184 as yes. The surrounding lines in configure.in set default values for
the other features. A similar search on the curses package shows that default
package values are initialized around line 58.
2. Gnash:
Now we'll look at the configure file for gnash, the open source flash player. I am
using version 0.8.9 from ftp://ftp.gnu.org/gnu/gnash/0.8.9/. It has a
configure.ac file. A simple grep shows it has 36 features and 13 packages
configurable through the config script. A few of them have default values in the macro;
others use shell scripting lines to decide values.
3. Nautilus:
The reason for understanding the structure of an autoconf file is that all the
features and packages may not have enough description. So if one wants to probe
for enabling/disabling specific functionality, one can easily look into these
macros and the corresponding shell variables to see their effect on the output
of the configure script.
So, suppose we want to add a configure option for enabling/disabling the debug
flag during compilation of nautilus. We can add the following lines to the
corresponding configure.in file and generate the configure script from it.
I have added the following option at line 110 of configure.in, after the
existing enable options.
AC_ARG_ENABLE(
  [debug],
  [  --enable-debug    Enable Debug Flags (default: enabled)],
  [case "${enableval}" in
    yes) CFLAGS="$CFLAGS -g -O0" ;;
    no)  CFLAGS="$CFLAGS -g -O3" ;;
    *)   AC_MSG_ERROR([bad value ${enableval} for --enable-debug option]) ;;
  esac],
  [CFLAGS="$CFLAGS -g -O0"])
This option, when set, makes CFLAGS (used when compiling .c files) include -g
and -O0, thus enabling debug output and disabling all optimizations.
Generate the configure script using 'autoconf'. Now if we type ./configure --help,
we can see the enable-debug option in it. If we do ./configure --enable-debug and
make, the CFLAGS used while compiling .c files will have -g -O0 in them.
Thus a debug version of nautilus will be built.
If we want to show the enabled/disabled state of the debug feature after the
configure script completes, we can add the following lines at the end of the
configure.in file. That part usually contains echo statements for other
features. We can add the status of our own feature by adding
if test "x`echo $CFLAGS | grep -- '-g -O0'`" = "x"; then
  echo "  Debug disabled"
else
  echo "  Debug enabled"
fi
at the very end of the file. Generate the configure file again, and run it. On
successful configuration it'll show the message "Debug disabled/enabled" on the
last line of the output.
Automake: As you can guess from the name, this is used to generate a Makefile
from a provided template, usually specified in a Makefile.am file.
The important thing to note here is the nomenclature of the macros defined in
Makefile.am files. A macro is usually named "folder name"_"description" or
"file name"_"description", where the file/folder name is the name of the file
to be built or the name of the folder the relevant files should be installed
into.
So e.g. "bin_PROGRAMS = asd" means a binary executable named asd should be
built (PROGRAMS) and installed into the "bin" folder.
Here are a few other common macros:
1. bin_PROGRAMS : binaries to be built and installed into the bin folder.
2. libexec_PROGRAMS : binaries to be built and installed into the libexec folder.
3. lib_LIBRARIES : static libraries to be built and installed into the lib folder.
4. lib_LTLIBRARIES : shared (libtool) libraries to be built and installed into the lib folder.
5. noinst_PROGRAMS : just build the program but don't install it anywhere.
6. asd_SOURCES : now this is an important one. Here we specify the source files for
building asd (a binary).
7. libasd_a_SOURCES : sources for building libasd.a as a static library.
8. libasd_la_SOURCES : sources for building libasd.la as a shared library.
9. EXTRA_asd_SOURCES : source files, usually for conditional compilation.
10. asd_LDFLAGS : linker flags for building asd.
11. asd_CFLAGS : C flags for building asd.
12. asd_LDADD : libraries to be linked in while building asd.
13. EXTRA_DIST : extra files/folders to be shipped in the distribution tarball.
14. SUBDIRS : list of subdirectories to be traversed while 'making' the package. They may
or may not contain Makefile.am files.
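To tie a few of these macros together, here is a small hypothetical Makefile.am for a binary named asd (all file and library names here are made up for illustration):

```makefile
# Build the 'asd' binary and install it into the bin folder
bin_PROGRAMS = asd

# Source files compiled into asd
asd_SOURCES = main.c util.c util.h

# Per-target compile and link settings
asd_CFLAGS = -Wall
asd_LDADD = -lm

# Ship README in the release tarball; recurse into tests/
EXTRA_DIST = README
SUBDIRS = tests
```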
As we can see from these macros, it is very easy to alter Makefiles by finding
the required macros in Makefile.am and tweaking a few of them to change flags,
add or remove sources, or do something else.
One important thing: to regenerate the Makefile using automake, you must go to
the parent folder which contains configure.ac or configure.in and then execute
the following commands:
aclocal
automake
Let's see a few example Makefile.am files now.
1. AS31
The first package we will look at is AS31, an assembler and compiler for 8051
code. It is available at http://www.pjrc.com/tech/8051/.
The Makefile.am in the root source dir contains just a SUBDIRS macro pointing to
'as31 example', to indicate that these folders contain the Makefile.am or
Makefile for the actual code and the examples.
Now, the Makefile.am in the as31 folder contains bin_PROGRAMS with the value
as31 and, conditionally, as31gtk. Both of these binary targets have their source
file macros defined with the list of code files to be compiled. There is also
one LDADD macro to indicate which libraries to link.
The Makefile.am in the examples directory only contains an EXTRA_DIST macro to
ship those files with the distribution.
2. Extreme Tux Racer:
A 3D game available at http://extremetuxracer.com/.
As in the previous case, its root Makefile.am contains SUBDIRS and EXTRA_DIST
macros. The Makefile.am in src is also trivial: it contains one bin_PROGRAMS
entry and a huge list of source code files for it.
Now, the data directory does not contain a Makefile.am at all, but a plain
Makefile, indicating it is not to be regenerated through Automake. Modifying
such a Makefile has to be done manually, as far as I know.
3. Nautilus:
The file manager for GNOME. Available at
http://ftp.gnome.org/pub/GNOME/sources/nautilus/.
As expected, the root Makefile.am contains the list of subdirectories along with
a few extra dist folders.
In libnautilus-extension, we see a new directive, includedir, along with
include_HEADERS, which means the listed header files should be installed into
$(includedir). This is usually done to install headers along with libraries for
developers.
In the src folder, there is a libexec_PROGRAMS directive indicating the binary
should be installed into the libexec folder rather than the usual bin folder.
So that's all I have on basic build info for open source software. Please
comment if there is any specific topic missing here, or any peculiar
configuration files you came across and wish to see explained here.
ASD
GNU softwares. These tools are also ported to Windows, but generally only GNU
tools ported to windows, which makes use of cygwin or mingw use these build
systems. Tools aimed at windows only systems usually use Visual Studio build
environments.
So quick description of basic terminologies:
Build: It usually means compiling the source code and generating binary
(executable) file. Optionally it contains features to scan host system (system
on which code is being compiled) and make changes in source code accordingly.
Example would be GUI in Linux. Linux supports multiple GUI libraries eg: GTK,
QT. So build system may scan host to determine which library should be used for
compiling this code.
Installer: This program installs the generated executables, shared libraries and
other build outputs to some location, so that OS can find it easily while
executing the programs. Sometimes installer is part of build system itself,
while some times build and installer comes in pair.
Now let us see some very popular build systems:
1. Shell Script: Used for very small and very simple tools or utterly
complex builds. For simple builds, it just contains few lines to compile the
code files and generate and install binaries.
2. Makefile: Most common build system on unix and related environment. Makefile
contains rules about source files and their dependency with each other. It
effectively makes a dependency tree. Also, once built, it can detect which
files have been changed in subsequent builds and compile only those.
3. CMake: I am not too familier with it, so I'll just mention that it is
another build system.
4. SCons: Build system created in python. Gives you flexibility of specifying
rules like Makefile and also add python code to make very complex build
systems.
5. Configure Scripts: These are not truelly build systems, but rather pair up
usually with Makefiles. These scripts are very versatile in the sense they detect
host environment and change Makefiles accordingly. Can be modified to make a
generic build system, which will fit multiple host environments very smoothly.
6. Automake and Autoconf: These are tools to generate Makefile and configure
scripts from specification files. In rest of this post, we'll talk about these
two with few examples.
Building source code:
These steps are common for most of the codes that make use of configure script
and Makefile for building. Change in command line might be needed to change
default build options.
1. Make configure script executable if it is not already: chmod +x ./configure
2. ./configure (optionally --prefix=/custom/install/folder)
3. make
4. make install
5. make clean (only if you want to rebuild everything next time)
Autoconf: As name indicates, it is a tool used to generate configure scripts
from some options file. Usually this options file is named configure.ac or
configure.in. It'll have multiple macros having "AC_" prefix. Most interesting
ones are AC_ARG_ENABLE and AC_ARG_WITH. Here is what GNU manual says about theses
macros:
AC_ARG_ENABLE: "If a software package has optional compile-time features, the
user can give configure command line options to specify whether to compile them.
The options have one of these forms:
--enable-feature[=arg]
--disable-feature"
Structure of this macro is: AC_ARG_ENABLE(feature, help-string, [action-if-given],
[action-if-not-given]). Here action-if-not-given simply means default value.
AC_ARG_WITH: "Some packages require, or can optionally use, other software
packages that are already installed. The user can give configure command line
options to specify which such external software to use. The options have one of
these forms:
--with-package[=arg]
--without-package"
Structure of this macro is: AC_ARG_WITH(package, help-string, [action-if-given],
[action-if-not-given]). Here action-if-not-given simply means default value.
So let's see first example of Autoconf options file:
1. Bash:
I am using bash 4.2 source for this example. Source availble at
http://ftp.gnu.org/gnu/bash/bash-4.2.tar.gz. It has a configure.in file. Lets
open it. As I said, it contains lots of AC_ macros. Description of those can be
found at GNU site. First AC_ARG_WITH macro is present on line number 106. We can
see which optional packages bash has like afs, bash-malloc, gnu-malloc etc. Help
string is the string displayed as description about that package. First
AC_ARG_ENABLE macro is found on line 211. Features like alias, array-variables, debugger are specified here. Interestingly none of these macros had default
value specified. Lets find out where it is specified. eg To find default value
of debugger feature, let's see action associated with it. Third argument of
macro is opt_debugger=$enableval, which means enable value of this feature is
stored in opt_debugger. Simple grep command 'grep -n "opt_debugger"
configure.in |less' on configure.in shows its default value is set on line
number 184 as yes. In configure.in file surrounding lines set default value for
other features. Similar search on curses package shows default package values
are initialized around line 58.
2. Gnash:
Now we'll see confgure file for gnash, the open source flash player. I am
using version 0.8.9 from ftp://ftp.gnu.org/gnu/gnash/0.8.9/. It has a
configure.ac file. Simple grep shows it has 36 features and 13 packages
configureble through config script. Few of them have default value in macro,
others use shell scripting lines to decide values.
3. Nautilus:
The reason behind understanding structure of autoconf file is that all features
and packages may not have enough description. So if one wants to probe for
enabling/disabling specific functionality he can easily look into these macros
and corresponding shell variables to see its effect on output of configure file.
So, suppose we want to add configure option of enabling/disabling debug flag
during compilation of nautilus , we can add following lines to corresponding
configure.in files and generate configure file from it.
I have added following option to line 110 of configure.in after option for
enabling debug.
AC_ARG_ENABLE(
[debug],
[ --enable-debug Enable Debug Flags (default: enabled)],
[case "${enableval}" in
yes) CFLAGS="$CFLAGS -g -O0" ;;
no) CFLAGS="$CFLAGS -g -O3" ;;
*) AC_MSG_ERROR([bad value ${enableval} for --enable-debug option]) ;;
esac],
CFLAGS="$CFLAGS -g -O0")
This option when set, makes CFLAGS(used when compiling .c files) to be -g and
-O0 thus enabling debug output and disabling all optimizations.
Generate configure script using 'autoconf'. Now if we type ./configure --help,
we can see enable-debug option in it. If we do ./configure --enable-debug, and
make, CLFAGS used while compiling .c files will have -g -O0 arguments in it.
Thus debug version of nautilus will be built.
If we want to show enbaled-disabled debug feature after completion of configure
script, we can add following lines at end of configure.in file. That part
usually contains echo statements for other features. We can add status of our
own feature by adding
if test x"echo $CFLAGS | grep \"-g -O0"= "x"; then
echo " Debug disabled"
else
echo " Debug enabled "
fi
at very end of the file. Generate configure file again, and configure it. On
successful configuration it'll show the message Debug disabled/enabled on last
line of output.
Automake: As you can guess from name, this is used to generate a Makefile from
provided template, usually specified in Makefile.am file.
Important thing to note here is nomenclature of macros defined in Makefile.am
files. It is usually specified as "folder name"_"description" or "file
name"_"description" where file/folder name is name of file to be built or name
of the folder relevent files should be copied into.
So e.g. "bin_PROGRAM = asd" means binary executable named asd should be built
(PROGRAM) and copied into "bin" folder.
Here are a few other common macros:
1. bin_PROGRAMS : binaries to be built and copied to the bin folder.
2. libexec_PROGRAMS : binaries to be built and copied to the libexec folder.
3. lib_LIBRARIES : static libraries to be built and copied to the lib folder.
4. lib_LTLIBRARIES : shared (libtool) libraries to be built and copied to the lib folder.
5. noinst_PROGRAMS : just build the program but don't install it anywhere.
6. asd_SOURCES : Now this is an important one. Here we specify the source
files for building asd (a binary).
7. libasd_a_SOURCES : sources for building the static library libasd.a.
8. libasd_la_SOURCES : sources for building the shared library libasd.la.
9. EXTRA_asd_SOURCES : source files, usually for conditional compilation.
10. asd_LDFLAGS : linker flags for building asd.
11. asd_CFLAGS : C compiler flags for building asd.
12. asd_LDADD : libraries to be linked while building asd.
13. EXTRA_DIST : extra files/folders to be shipped with the distribution tarball.
14. SUBDIRS : list of subdirectories to be traversed while 'Making' the
package. They may or may not contain Makefile.am files.
As we can see from these macros, it is very easy to alter Makefiles: find the
required macros in Makefile.am and tweak a few of them to change flags, add or
remove sources, or do something else.
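To tie these macros together, here is a minimal hypothetical Makefile.am; every name in it is made up for illustration:

```makefile
# Hypothetical Makefile.am: one installed binary, one shared library.
bin_PROGRAMS = asd
asd_SOURCES = main.c util.c
asd_CFLAGS = -Wall
asd_LDADD = libhelper.la          # link the library built below

lib_LTLIBRARIES = libhelper.la
libhelper_la_SOURCES = helper.c

EXTRA_DIST = README examples      # ship these with the tarball
SUBDIRS = doc                     # recurse into doc/ during make
```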
The important thing is that to regenerate the Makefile using automake, you
must go to the parent folder which contains configure.ac or configure.in and
then execute the following commands:
aclocal
automake
Let's see a few example Makefile.am files now.
1. AS31
The first package we will see here is AS31, an assembler and compiler for 8051
code. It is available at http://www.pjrc.com/tech/8051/.
The Makefile.am in the root source dir contains just a SUBDIRS macro pointing
to the as31 and examples folders, to indicate that those folders contain the
Makefile.am or Makefile for the actual code and the examples.
Now the Makefile.am in the as31 folder contains bin_PROGRAMS having the value
as31 and, conditionally, as31gtk. Both these binary targets have their source
file macro defined with the list of code files to be compiled for their
generation. There is also one LDADD macro to indicate the libraries to link.
The Makefile.am in the examples directory only contains an EXTRA_DIST macro to
ship those files with the distribution.
2. Extreme Tux Racer:
A 3D game available at http://extremetuxracer.com/.
As with the previous case, its root Makefile.am contains SUBDIRS and
EXTRA_DIST macros.
The Makefile.am in src is also trivial: it contains one bin_PROGRAMS entry and
a huge list of source code files for it.
Now the data directory does not contain a Makefile.am at all, but a plain
Makefile, indicating it is not regenerated through Automake. Modifying such a
Makefile has to be done manually, as far as I know.
3. Nautilus:
The file manager for GNOME. Available at
http://ftp.gnome.org/pub/GNOME/sources/nautilus/.
As expected, the root Makefile.am contains the list of subdirectories along
with a few extra dist folders.
In libnautilus-extension, we see new directives named includedir and
include_HEADERS, which mean the listed header files should be installed
(copied) to $(includedir) at install time. This is usually done to install
headers along with libraries for developers.
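A hypothetical fragment using those header-install directives might look like this (all names invented for illustration):

```makefile
# Hypothetical library Makefile.am that installs public headers.
lib_LTLIBRARIES = libfoo-extension.la
libfoo_extension_la_SOURCES = foo-extension.c

# headers land in $(includedir)/libfoo at install time
libfooincludedir = $(includedir)/libfoo
libfooinclude_HEADERS = foo-extension.h
```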
In the src folder, there is a libexec_PROGRAMS directive indicating the binary
is to be copied to the libexec folder rather than the usual bin folder.
So that's all I have for basic build info on open source software. Please
comment if any specific topic is missing here, or if you came across any
peculiar configuration files you wish to see explained here.
ASD
Sunday, July 15, 2012
Tools
Obviously the first post will be about the tools needed for reading code.
Earlier I simply used to untar the source and start reading from some random
files, but now I think tools are equally necessary, at least for beginners. So
here is a list of my favorite tools. Most of these are also available on
Windows, with some good additions (like the Visual Studio 2010 editor + symbol
search).
1. Geany: http://www.geany.org/ My editor of choice. It is very lightweight
and shows functions, structures, classes and globals in a left-hand panel. It
has the drawback that it doesn't show symbols from other files, but it works
for me in combination with other tools.
I used to use Vim before, and at some point may go back to it. I still use it
as the default browser for Cscope. Vim with plugins may have all the
capabilities of Geany (with infinite advanced features, of course).
2. Cscope: http://cscope.sourceforge.net/ An awesome tool to extract almost
anything from your code. It works best with C, but as we move to C-like
languages (C++, Java and others) it starts hitting glitches. There is a tool
to generate a Python index for Cscope. I haven't tried it yet, but if it is
good enough, Cscope should work smoothly for Python too.
3. Cscope-Vim plugin: http://cscope.sourceforge.net/cscope_vim_tutorial.html
Vim already has support for Cscope, but this plugin reduces commands to just a
combination of keys. It also keeps track of your jumps in the code.
4. Doxygen: http://www.stack.nl/~dimitri/doxygen/ A documentation and call
graph generator for many languages. Particularly useful when we need to trace
the path between two points in the code.
5. grep/perl/python: Very useful for searches and modifications which are easy
to code but difficult to do with the above tools. I am not very familiar with
Perl, but it is part of my current in-progress learning.
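As a sketch of the kind of quick search these make easy, here is a small Python helper (the function name and arguments are just examples) that finds every whole-word occurrence of a symbol in a tree of C files:

```python
import pathlib
import re

def find_symbol(root, symbol):
    """Return (file, line number, line) for each line in *.c files
    under root that mentions symbol as a whole word."""
    pattern = re.compile(r"\b%s\b" % re.escape(symbol))
    hits = []
    for path in sorted(pathlib.Path(root).rglob("*.c")):
        text = path.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), start=1):
            if pattern.search(line):
                hits.append((str(path), lineno, line.strip()))
    return hits
```

Calling find_symbol(".", "main") from a source root lists every line that mentions main, much like grep -rnw does.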
6. GDB: The default Linux debugger. Easy to use once we know the basic
commands. As with all debuggers, it may miss breakpoints due to non-loaded
libraries, optimizations, multithreading etc. I particularly like putting
breakpoints using "INT 3" so that none of them are missed.
About
This is my first tutorial ever. My inspiration was my own struggle to read
source code. I am still bad at it, and I hope that I'll find better methods
through comments. But I have learned a few things by now, and I hope sharing
them will be useful since, as far as I know, there are no good articles on
this topic yet.
I'll try to cover how to read the source code of any tool, or in general any
source code available. Rather than reading the whole code in one shot, I'll
focus on some feature (of my interest) and try to tweak it a little, or
sometimes add a new feature altogether. Given that most of the code for Linux
is written in C/C++, most of the tutorials will focus on them. But I'd like to
find some Python code and work on it. All suggestions on this are welcome. As
I said, I'll take sources from Linux tools, but if there is any interesting
Windows source out there, I'd love to work on it.
P.S. The name of the blog is an illogical expansion of my favorite character
string, ASD.
ASD
(Anonymous Source Dissector)