Differential Code Coverage in the
Context of the Wine Project
Dan Kegel, Google Liaison
Elizabeth Sweedyk, Team Advisor
Aaron Arvey, Claremont McKenna
Edward Kim, Harvey Mudd
Evan Parry, Harvey Mudd
Cal Pierog, Harvey Mudd
April 24, 2005
Abstract
This project will help ensure that Google’s Windows applications run properly under Wine on Linux. We have improved coverage tools to identify areas of untested code used by an application, and we used these new tools to identify bugs in Wine that affect Google applications, focusing primarily on the Picasa 2 application. The methodologies developed can be extended to other software applications.

Contents

1 Background
  1.1 Why does Google care about Wine?
  1.2 Wine is not the silver bullet
2 Coverage: the answer to Wine’s drawbacks
  2.1 What is code coverage?
  2.2 Differential code coverage
3 Summary of Work
4 GCOV
  4.1 What Does GCOV Do?
  4.2 Some Internals of GCOV
  4.3 How to Apply GCOV to Wine
  4.4 GCOV is not Enough
5 LCOV
  5.1 History
  5.2 What Does LCOV Do?
  5.3 Some Internals of LCOV
6 Our Code Coverage Enhancements
  6.1 Integrating Code Coverage into Wine
  6.2 Adding Differential Code Coverage
  6.3 Adding a Legend to LCOV
  6.4 How to Use the New LCOV
  6.5 Automated Wine Coverage Script
7 Google Applications’ Performance Under Wine
  7.1 Feature Mapping
  7.2 Manual Testing
  7.3 Resolved Issues
    7.3.1 Installer
    7.3.2 GUI Fonts
  7.4 Outstanding Bugs
    7.4.1 Order Print Bug
    7.4.2 Slideshow Bug
    7.4.3 Help Bug
8 Automated Testing with Cxtest
  8.1 Cxtest Structure
  8.2 Our Experience with Cxtest
  8.3 Our Findings
9 Writing Tests for Wine
  9.1 The Purpose of Wine Tests
  9.2 The Test Suite Structure
  9.3 Coverage-driven Test-making
  9.4 The LZExpand Test
    9.4.1 Reading a Compressed File
    9.4.2 Alternate File Endings
    9.4.3 Reading Past File End
  9.5 AdvApi32 Registry Test
10 Assessment
  10.1 Code Coverage
    10.1.1 Directing New Users
    10.1.2 Motivating New Tests
    10.1.3 Reexamining Faulty Tests
11 Future Work
A Online Resources
B Wine Documentation Patch For Enabling Code Coverage
C --missing-from-baseline Patch For LCOV
D --legend Patch For LCOV
E Automated Wine Coverage Script
1 Background
During the last decade, the Internet and personal computing have grown from luxuries to
household necessities. At the same time, Microsoft has enjoyed a monopoly on the personal
computer operating system market. Essentially all PCs sold today in the US are bundled with
Microsoft Windows, and a large percentage of those bundles also include Microsoft Office.
Large PC vendors such as Dell receive “marketing incentive” payments from Microsoft to
ensure that this situation continues. Facing no meaningful competition, Microsoft has essentially been free to dictate the price it charges for its software and, as a result, is one of the most profitable companies in the world.
During the same decade, Linux has matured as a server operating system, and is now
beginning to mature as a desktop operating system as well. Three examples of high quality
free software often bundled with Linux include the Mozilla web browser, the OpenOffice.org
office suite, and the Gnome desktop. All three of these projects are of increasing interest
to the large customers who pay large site licenses to Microsoft, because together they offer
something novel: competition.
Recently, hardware markets have tightened considerably and prices for the actual machines have fallen dramatically. As a result, software licensing fees have risen as a percentage of total cost, causing many companies and governments to look to free Linux software as an alternative to the costly Microsoft software. Some companies, like IBM, Sun, Novell, and HP, are even pouring billions of dollars into enhancing Linux to suit their needs. IDC is predicting that Linux’s share of the desktop market will rise to 7 percent by 2007.
However, there is a chicken-and-egg problem with switching to Linux: software vendors won’t write Linux versions of their programs until a large enough portion of personal computers are sold with Linux, and nobody will buy a personal computer with Linux until applications are available. (Antitrust lawyers call this the Applications Barrier to Entry; it’s
what prevents new operating systems from entering the market.)
The Linux community recognized this problem and began the Wine project, which aims to implement the Windows API under Linux. In theory, Wine would allow all Windows programs to run under Linux.
In the recursive acronym tradition of open source, Wine stands for “Wine Is Not an
Emulator.” This refers to the fact that Wine does not try to emulate a complete hardware
environment, but rather allows the Windows binaries to run directly on the CPU. Wine’s
job is to intercept Windows API calls and seamlessly perform the desired behavior under
Linux.
1.1 Why does Google care about Wine?
Google is best known as a search engine giant, but its mission is wider: Google wants to
organize the world’s information, and make it easy to use. Since lots of information is stored
on desktop computers, Google naturally wants to help manage that information too. Its first
offering in this direction is Picasa, a program to help users organize their photographs.
Google, like other software developers, is quite aware of the statistical imbalance in OS
usage mentioned above and, not surprisingly, has developed its applications for Windows.
While Google does not yet have sufficient incentive to port its Microsoft Windows applications to Linux, it would still like its applications to be available to users at companies that have migrated to Linux. The most economical way to do this is to run Google’s Microsoft Windows applications under Linux using Wine.
1.2 Wine is not the silver bullet
Although a decade of work has been contributed, Wine still labels itself as alpha software.
The main problem is that the Windows API is a moving target. The Wine project began as an attempt to run Windows 3.1 applications. Since then, Microsoft has released Windows 95, 98, ME, NT 4, 2000, and XP, in addition to many new APIs such as the DirectX media libraries and the Winsock networking calls. The Wine project has made a valiant effort to keep up with all the new Windows functionality while continuing to develop on the rapidly moving Linux platform.
Due to the volunteer nature of open source development, it comes as little surprise that
Wine has a rather uneven level of support for various Windows libraries. The essential
libraries and interesting features have generally been implemented before the less interesting
and difficult ones. In addition, the desire to make things work right away has led to some
neglect of documentation and testing. As things are, it is hard to say with great confidence
what is well supported (and tested) under Wine, what is partially supported, and what
simply does not work.
2 Coverage: the answer to Wine’s drawbacks
Thus, we are left with the question, “How can we figure out exactly what is supported and tested by Wine?” Not only do we need to figure out what is being tested now, we would like an automated system that can always tell us what is being tested at any point in the future, even after Wine undergoes extensive change and revision. Our team determined that the cleanest and most generalizable solution would be to add automated code coverage to Wine.
2.1 What is code coverage?
Whenever a program does anything, lines of code are executed to achieve the effect. Some
lines of code are quite general in purpose and are executed every time a program is run,
while other lines of code are only executed under very specific circumstances. Code coverage
refers to the process of tracking which lines of code are executed while a program runs.
We believe that code coverage will prove highly useful when applied to the Wine test suite because it will allow anyone running the test suite to quickly and easily see which parts of the Wine code are executed during a run of the Wine test suite, and which are not. Thus, with the addition of code coverage, the question of which Wine libraries are tested and fully supported can be answered.
The actual utilities used for code coverage, as well as an explanation of their use can be
found in sections 4 and 5.
2.2 Differential code coverage
However, simply adding code coverage to Wine will not suffice. With source code totaling roughly 80 megabytes, Wine is quite large and, as such, a description of everything that is executed will not be specific enough to be truly useful. To further hone our search, we will
need to focus on the code that is most important to us, specifically Google’s applications.
To this end, we introduce the idea of differential code coverage, which requires two different runs of the same source code, the “baseline” and the “test”. Any lines of code executed in the baseline will not show up in the differential code coverage, leaving only the lines of code executed in the test, but not in the baseline.
For our purposes, the “baseline” will be Wine’s test suite and the “test” will be a Google application running under Wine. The test suite and the Google application both use the same Wine source code. The test suite runs the Wine source directly, while the Google Windows application uses the Wine source whenever Wine must step in to provide functionality normally provided by the Windows operating system. The differential code coverage displays which lines of the Wine code are used by the Google application but are not tested by Wine’s test suite, which corresponds to the red region in Figure 1.
Figure 1:
In our differential code coverage, we compare what lines of the Wine source are used by the
Google application, but are not tested by Wine’s test suite. The area marked in red represents the untested
lines of code that Google’s applications use, which is the area where we want to concentrate our own test
writing efforts.
Details of our implementation of differential code coverage can be found in section 6.2.
3 Summary of Work
From these initial realizations, we set out two main objectives for our clinic work over the
year.
• Integrate code coverage into Wine’s development and testing process
• Use the newly refashioned process to identify and write tests for untested areas of Wine
that are important for Google’s applications
To understand the steps we took in addressing these two objectives, we must first explain the coverage utilities that we chose (GCOV/LCOV). Both how to use them and their internal
workings are detailed in sections 4 and 5. These coverage utilities were then enhanced and
integrated into Wine. An in-depth explanation of the integration and changes can be found
in section 6.
Once our coverage utilities were integrated into Wine, we were able to run the Google applications under Wine and compare their coverage with that of the Wine test suite. Moreover,
while we were running the Google applications under Wine, we not only collected coverage
data, but also discovered a number of bugs resulting from the Google applications being run
under Wine instead of under a true Windows operating system. These bugs are the result of
Wine imperfectly mimicking Windows and, because they prevent Google applications from
functioning correctly for Linux users running Wine, are of key importance to both Google
and the Wine developers. Section 7 includes a description of how the Google applications
performed under Wine as well as providing a summary of all the bugs we found.
Most of our testing of Google applications under Wine was conducted by manually executing the programs. However, our team recognized that we could produce more consistent,
reproducible results with considerably less hassle if we automated the testing process with
Cxtest. Due to time constraints and the immature nature of the Cxtest utility, we met with
limited success, which is detailed in section 8.
Having exposed untested areas of Wine code that are used by Google applications, we
then began to fill these gaps by writing new tests. Ideally, these tests will reveal many of the bugs still lingering in the previously untested areas of code. Moreover, once the bugs are fixed, Google applications will be able to run much more seamlessly under Wine. The tests
that we have written for the Wine test suite can be found in section 9.
4 GCOV
4.1 What Does GCOV Do?
GCOV, which stands for “GNU Code Coverage”, is a tool developed by the GNU Project for the GCC compiler suite (documentation can be found at http://gcc.gnu.org/onlinedocs/gcc/Gcov.html, and the gcc compiler suite can be downloaded at http://gcc.gnu.org/). It is used to generate statistics recording which lines of source code were executed. For instance, a source file source.c is compiled using gcc with the appropriate switches. These switches cause gcc to output several additional files which will be used to determine which lines are executed. We can then run our program in whatever manner we see fit. When the program terminates, we can run gcov on the original source file source.c, and an output file, source.c.gcov, will describe how many times each line of source code was run during execution.
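For a standalone program, the compile-run-report cycle looks roughly like the following sketch (the binary name example is arbitrary, and gcov option handling varies slightly across gcc versions):

%> gcc -ftest-coverage -fprofile-arcs -o example source.c
%> ./example
%> gcov source.c
%> less source.c.gcov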
4.2 Some Internals of GCOV
GCOV needs several things in order to work. First and foremost, a developer must compile the application with all the additional information needed to run GCOV, specifically the
-ftest-coverage and -fprofile-arcs flags. These flags cause the program to produce
three files for each source file, named sourcefilename.{bb, bbg, da}.
sourcefilename.bb is output at compile time when the -ftest-coverage switch is used.
The .bb file enumerates the other source and header files which are referenced by the
file that is currently being compiled. It also keeps track of the line numbers in the file
being compiled.
sourcefilename.bbg is output at compile time when the -ftest-coverage switch is used.
The .bbg file contains possible branches from each block of code. This allows gcov to
do a full construction of a program flow graph.
sourcefilename.da is output after a program compiled with -fprofile-arcs is run. The
.da file contains numbers which represent how many times a line of code was run. The
numbers are cumulative, which means that if you run the program twice in the exact
same manner, then all the numbers in sourcefilename.da will be doubled.
More information can be found at http://gcc.gnu.org/onlinedocs/gcc-3.0/gcc_8.html.
This page also specifies the formats of the three files.
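As an illustration, the annotated source.c.gcov output for a small loop might look like the following hypothetical excerpt (in later gcc versions the first column is the execution count, with ##### marking lines that never ran, and the second column is the source line number; older versions print only the count beside each line):

        3:   10:    for (i = 0; i < 3; i++)
        3:   11:        total += i;
    #####:   13:    report_error();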
4.3 How to Apply GCOV to Wine
There is a brief tutorial in Appendix B that describes how to use gcov on a single source file
in Wine. This documentation was submitted to the Wine project in the patch that enabled
GCOV code coverage abilities to be compiled in to Wine.
4.4 GCOV is not Enough
Although useful for quickly profiling small portions of code, GCOV is not well suited to handling large projects. As such, we use a wrapper around GCOV called LCOV, developed by the LTP (Linux Test Project). This wrapper does several things that enable GCOV to be applied to larger projects.
5 LCOV
5.1 History
Historically, LCOV, which stands for “LTP GCOV”, has been used by the LTP project to see what parts of the Linux kernel have been tested (further description and documentation can be found at http://ltp.sourceforge.net/). LCOV was developed at the IBM Linux Technology Center in Austin and the IBM Development Lab in Boeblingen, Germany. Since Wine, like the Linux kernel, provides an operating-system-like environment, this tool is well suited to profiling Wine.
5.2 What Does LCOV Do?
LCOV is a wrapper around GCOV which automates many tasks that would otherwise be very daunting, such as generating HTML output and running GCOV on a whole directory of source files. Since LCOV is just a wrapper, a developer still needs to compile the project with the necessary flags as referenced in section 4.2.
Once the project is compiled and has gone through an execution, one can run lcov to generate coverage statistics for that execution of the project. When the source code is spread
across multiple files and/or directories, LCOV displays a directory view like that found in
figure 2. The total coverage for the entire project is summarized in the header at the top,
while the coverages for each individual source code directory are summarized in the blue
table.
Figure 2: The directory view of LCOV-generated code coverage statistics
LCOV also displays the source code at an individual file level, as seen in figure 3. The
header information now only corresponds to this file instead of all of the contents of the
directory. In the body, the number at the left of each line indicates the number of times
that the line was executed. The highlighting reinforces the numbers, marking completely
unexecuted lines red and the rest blue.
5.3 Some Internals of LCOV
LCOV is implemented as five Perl scripts:
gendesc creates test descriptions to be used later by genhtml. These descriptions are meant
to be used so that one can see which tests are being run for a given execution.
genhtml takes a tracefile (a “.info” file), cross-references the data with the source files, and creates organized, comprehensible HTML output.
geninfo creates the tracefiles from the data obtained from GCOV. This is the part of the
project which is the wrapper for recursively applying GCOV.
genpng creates a picture of a single source file by rendering each character as a single pixel.
Figure 3: The individual source file view of LCOV-generated code coverage statistics
lcov is a wrapper around geninfo and also handles many of the options. lcov makes the command line interface a little more pleasant by serving as a front end for a few of the other scripts.
When lcov is called upon to generate coverage statistics, it runs gcov on every file in every directory recursively. It then inspects the .da files and creates a set of records for every source line. Once all files have been examined, the set of records for each of the source files is written to a tracefile in a format that genhtml will understand.
The tracefile format is fairly straightforward. Every line begins with a two-letter token which indicates what kind of line it is. For our purposes this token can be one of the following:
TN Test Name.
SF Source File. This is specified as an absolute path, so one should not rename the directories before using genhtml on a tracefile.
FN Function. This is a list of functions in the given source file.
DA Data. This is a 3-tuple containing information on every source line compiled into machine code: the number of times the line was run, the line number in the source file, and a hash of the source line for verification when running genhtml.
LF The total number of source lines found in the file.
LH The total number of source lines executed in the file.
Every record is preceded by the TN token and terminated by an end-of-record token. Once the tracefile has been produced by running lcov, genhtml turns it into an HTML website, which allows for easy navigation and a very nice overview of the results.
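To make the format concrete, here is a small, hypothetical record (the path, function name, counts, and checksums are invented; the end-of-record token is the literal line end_of_record):

TN:winetest
SF:/usr/src/wine/dlls/lzexpand/lzexpand_main.c
FN:87,LZOpenFileA
DA:87,3,uKbUW6LqdmoCl5wexs9Wxw
DA:88,3,dGVzdGxpbmVoYXNoAAAAAA
DA:92,0,c29tZW90aGVyaGFzaAAAAA
LF:3
LH:2
end_of_record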
6 Our Code Coverage Enhancements
6.1 Integrating Code Coverage into Wine
Instead of the usual make command to build Wine, if you wish to use code coverage, use the
following command.
%> make CFLAGS="-fprofile-arcs -ftest-coverage"
With these extra compiler flags, Wine will produce extra profiling data every time it runs.
This is exactly the data necessary for the GCOV and LCOV utilities to collect code coverage
for Wine.
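After exercising the instrumented Wine, the accumulated profiling data can be collected into a tracefile with lcov. A typical invocation, assuming the Wine build tree is the current directory, looks like:

%> lcov --directory . --capture --output-file wine.info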
A more detailed explanation of the process is included in our patch to the Wine documentation in appendix B.
6.2 Adding Differential Code Coverage
As mentioned in section 2.2, our project requires the use of differential code coverage to
hone in on the sections of untested code that are needed by the Google applications. The
most logical place to put this is in the LCOV utility, which is already dealing with the code
coverage of the entire source tree.
As noted in section 5.3, lcov stores all the data for one run of the program in a tracefile. Since we want to compare two runs of the same program (Wine), we leave the data in these two tracefiles and then simply compare them to produce output, a job assigned to genhtml. Thus, the process of adding differential code coverage was reduced to adding a new command line option, --missing-from-baseline, to genhtml. As one might expect, this flag takes in two tracefiles and flags only the lines that are run in the main run (Google application) but not in the baseline (Wine test suite).
Our alteration is completely general, so that any two tracefiles can be used as the main run and the baseline (not just the Google applications and the Wine test suite). Thus, we produced an enhancement useful not only to us but to anyone wishing to do any sort of differential code coverage. The patch itself can be found in Appendix C.
Our change to LCOV was also minimal and has no effect if our switch is not used. Only
when a developer uses --missing-from-baseline in combination with -b (--baseline)
does one notice the changes we made (see the lcov man page for more information concerning
these options). For best results, we strongly encourage using the same source tree, rather
than different source trees for builds in order to avoid conflicts when displaying the code in
the LCOV html output.
A description of how to properly use the new --missing-from-baseline flag in conjunction with our other changes to LCOV is included in section 6.4. The patch itself is included in appendix C.
6.3 Adding a Legend to LCOV
While we were using LCOV to collect coverage data, we noticed that the coloring of the various fields was somewhat confusing, at least to the relatively new user. We decided that a legend explaining the meaning of each color would add greatly to the clarity of the code coverage output. Examples of the code coverage before and after our additions can be seen in figures 4 and 5, respectively.
Figure 4: Simple code coverage output using the original LCOV
Figure 5: Simple code coverage output using our new legend
The legend at the top of figure 5 details the range of each of the colors used to differentiate
varying coverage levels. This legend is only displayed when the user adds the --legend flag
when calling genhtml to produce the code coverage output.
Additionally, we noticed an inconsistency in the coloring of the code coverage output. For the low and medium entries (normally red and orange, respectively), the color of the horizontal bar and the background color of the corresponding coverage percentage were the same. However, for the high coverage entries, the colors no longer matched: the background color of the coverage percentage was always blue while the bar was normally green. We fixed this, making sure that, unless the user manually changes the generated css file, the colors match up. As you can see in figure 5, the green color of the
horizontal bar now matches the green background of the coverage percentage. This change
affects all genhtml output, regardless of the presence or absence of the --legend flag.
This legend patch to LCOV can be found in appendix D.
6.4 How to Use the New LCOV
When using the --missing-from-baseline flag, the user needs two tracefiles (.info files)
generated from two different runs of the same source (see section 5.3 for more informa-
tion on tracefiles). For the sake of relevance, call these two tracefiles testSuite.info and
GoogleApp.info, the former being a run of Wine’s test suite and the latter being a run of a
Google application under Wine. Although we are using an example specific to our project,
these two tracefiles can be any runs for any single source tree.
Using genhtml with our --missing-from-baseline flag (and --legend), we can produce
differential code coverage results which tell us which lines are used by the Google application
but remain untested. The syntax for creating this differential code coverage is:
%> genhtml --legend -p . -o compare.out -b testSuite.info \
--missing-from-baseline -s GoogleApp.info
For the specific functionality associated with each of the different flags, please refer to the
genhtml man pages. Once completed, the command above places the desired differential
coverage in a folder named compare.out, which can be opened with one’s browser of choice.
An example of typical output can be found in figure 6.
Figure 6: An example of differential code coverage, which compares a run of Picasa under
Wine against the Wine test suite.
Due to the differential nature of this code coverage, anything in Wine that was used
by Picasa but not tested by the test suite is marked as unexecuted (which is bad). Corre-
spondingly the percentages note the amount of untested code used by Picasa, making high
percentages good and low percentages bad (assuming our eventual goal is to have Picasa only use Wine code that has been tested).
In figure 6, there are additional footnotes which explain what the “Code Covered” and
“Executed Lines” fields mean. These are intended for newer users to help interpret their
differential coverage results. As such, these footnotes are only included when both
--missing-from-baseline and --legend flags are used.
6.5 Automated Wine Coverage Script
While the output of differential code coverage is quite user-friendly (see figure 6), the various
commands used to create it are less so (see section 6.4). As such, we have devised a bash
script which automates the process of collecting code coverage data from Wine’s executions. The script, called wine_cov, is included in appendix E.
The script does little by itself, but when passed various flags, it provides a plethora of
functionality.
-h Displays the usual help message
-c Downloads Wine and LCOV from anonymous CVS servers at winehq and LTP, respectively. If the Wine code changes, it is automatically recompiled.
-d Currently, this downloads three patches to the patches directory from our clinic website.
• A patch which allows the script to easily integrate code coverage into the virgin
Wine source (downloaded with -c).
• The --missing-from-baseline patch, detailed in section 6.2.
• A patch that tells LCOV to ignore more errors than it otherwise might. This
allows the code coverages to be automatically generated with much less need for
human intervention.
The intent of this option is to quickly be able to get any patches that we have produced
but have not yet been accepted into the Wine or LCOV source codes, enabling us to
easily use our own changes even before the wider community has a chance to look over
them.
This option also enables the -p option below.
-p Apply all the patches currently in the patches directory to the source. If new patches
are applied to the Wine source, you may wish to also include the -r flag below.
-r Forces the script to recompile Wine from the current source.
-t Run the Wine test suite and use LCOV to get coverage results for it. To ensure current
test suite results, re-run the test suite every time you change either the Wine or LCOV
source codes (with either -c or -p).
14

Page 15
Google
Technical Report
HMC CS
-i To use this option you must also provide the filename of a tracefile (a.k.a. infofile; see
section 5.3) which was made using the Wine source code downloaded by this script.
If it receives a valid tracefile, the script will generate both coverage for that tracefile on its own and differential code coverage using the Wine test suite as a baseline. Because of the latter, this option requires the previous or concurrent use of
-t.
-I Like -i, except the filename provided is not a pre-existing file, but the name of a tracefile which the script should produce using any and all Wine run data. This run data includes anything that Wine has done since the last time the run data was cleared. To clear the run data, execute:
%> lcov -d <wine source directory> -z
Once the tracefile is generated by this command, it functions exactly like -i.
Any tracefile used by this script, including the tracefile for the Wine test suite (-t) and user-provided tracefiles (-i or -I), will have files ending in .spec.c or .dbg.c removed from it so that they do not appear in the coverage results. Wine produces these .spec.c
and .dbg.c files as part of its build process but they are not relevant to coverage results as
they are not truly part of the source code.
When this script is run without flags for the first time, it assumes that the user wants to download the Wine and LCOV source codes and build a Wine executable. Essentially, the script assumes the -c and -r options.
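For example, a first session that downloads the sources, fetches and applies our patches, and measures the test suite, followed later by differential coverage for run data collected from a Picasa session, might look like this (the tracefile name is arbitrary):

%> ./wine_cov -c -d -t
(run Picasa under the instrumented Wine)
%> ./wine_cov -I GoogleApp.info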
7 Google Applications’ Performance Under Wine
Using the enhanced version of LCOV, we began comparing the portions of Wine source
code used by a Google application and not by the Wine test suite. We decided to focus
on Picasa, since it was the only Google application that we could initially get working in Wine. It also seemed to be the best candidate for an application that Wine could run because it appeared to require the fewest external dependencies. (Hello! required an Internet
blog, Keyhole required Internet login and the requirements for Google Desktop Search were
unknown.) Early in the second half of the project, Picasa 2 became publicly available, and
so we shifted our attention to the new version of Picasa.
7.1 Feature Mapping
In hopes of exposing untested parts of code methodically, we decided it would be fruitful
to map the feature set in Picasa 2. To do this, an initial audit was made of the menu
options featured in Picasa. Often, these menu options were linked to the same functionality
as the easy-access buttons in the UI. It was not unusual for menu options to reveal more
functionality via dialog boxes. In order to keep the feature list manageable, we limited
our initial audit to the top-level menu options available when an image was selected. The
features in Picasa 2 can be broken into three main components: image management, image
editing and image publishing.
Image management describes how Picasa allows users to locate, sort and organize all
the images on a computer. This includes being able to copy and move files from folder
to folder within the program, and flagging and labeling files that are important. Picasa’s
image editing allows users to remove red eye from photos, crop and rotate images, as well
as apply various image filters. Image publishing describes how Picasa provides users a range
of methods for sharing their photos. Quick access to CD burning, printing, online photo ordering, and integration with Hello! are just a few of the ways Picasa can help users produce and present their images.
7.2 Manual Testing
With the feature set mapped, we began by running coverage while manually testing features
in Picasa. We looked for features that we believed would expose a portion of Wine that
is untested by the Wine test suite. Among the features we tested were the print ordering
functionality, the Help system, the image editing capabilities, the slideshow and various
minor file management features.
From the standpoint of limited manual testing, it seems that Wine can support most commonly used features, but has trouble with features that have external dependencies such as printers or browsers. Even under Wine, Picasa’s file management and image editing appear robust.
We looked for ways to automate and expand our testing process. When Cxtest was publicly released, we examined it for automating our UI feature testing. Our work with Cxtest is detailed in section 8.
7.3 Resolved Issues
7.3.1 Installer
Previously, the Picasa installer would fail in Wine. The work-around we used was to simply
transplant an install directory from a normal install in Windows. The Picasa 2 installer,
however, seems fully functional in Wine.
7.3.2 GUI Fonts
For certain buttons, dialog boxes and tool tips, the new version of Picasa introduced an
issue that caused fonts to be rendered incorrectly. Communication with the Picasa team
indicated that this might be a problem with a few new font rendering functions they added
to Picasa. Further investigation turned up a recent patch for Wine that seemed to fix the
problem. Currently, Wine renders fonts in Picasa correctly.
7.4 Outstanding Bugs
7.4.1 Order Print Bug
This bug occurs when Order Print is selected from the File menu. Usually, Picasa directs the user to a page where prints can be ordered from Kodak. In Wine, Picasa prompts the user to download an ActiveX control. Regardless of the user’s choice, Picasa then crashes and freezes, leaving the user only able to close it.
7.4.2 Slideshow Bug
This bug occurs when the user attempts to run a slideshow in Picasa. Usually, the slideshow
takes a selection of images and displays them full-screen for a set amount of time. In Wine,
when the user attempts to run a slideshow, Picasa enters into a bad state that requires the
entire process to be terminated.
7.4.3 Help Bug
This bug occurs when the user attempts to access Picasa’s help documentation. Usually, the online help documentation is brought up in the user’s default web browser. In Wine, Picasa seems to freeze, though it can still be closed.
8 Automated Testing with Cxtest
Having developed our differential code coverage tools for analyzing what portions of Wine the Google applications use, we would like a consistent and repeatable test procedure for generating this data. This means we want to generate our usage information for Picasa by going through the same steps each time. A consistent test environment allows us to be sure that, for example, increases in code coverage are due to improvements in Wine unit tests, rather than to our tests simply exercising different parts of Wine because they differ from run to run. These goals could be reached by having a specific written procedure for a human operator to perform for each test, but that solution is labor and time intensive while still being prone to error.
A much better solution is to use an automated testing environment. The Wine test
suite provides this on the function and unit level. That is, the tests call specific functions
in Wine and check that the results are correct. We cannot do this sort of testing with a
user application like Picasa since usage of the program involves interacting with the GUI
elements. Instead we need a test infrastructure that is capable of interacting with graphical
applications and, more specifically, can handle the intricacies of Wine and X11 interactions.
Fortunately, the open-source software company CodeWeavers has released just such a
tool:
Cxtest stands for “CodeWeavers X11 Test”. Cxtest is an open source project
that provides visual regression testing facilities for X11 based systems. It can
automate basic X Window functionality, including finding X windows by title
or by graphic picture. It can also drive Wine based windows, using Windows
semantics. (http://www.cxtest.org)
CodeWeavers produces their own commercial version of Wine, so they have a vested
interest in testing applications in the Wine environment. Thus, they have developed Cxtest,
which aims to be exactly the type of GUI test infrastructure that we need. Unfortunately,
as the Cxtest website states, “Right now, Cxtest is in an ’early adopter’ stage. That is,
it’s really only appropriate for folks that are willing to suffer some rough edges.” Indeed,
almost all the code is centered around the CrossOver Office product and testing for it.
Documentation is spotty and there were definitely rough edges. Still, we were able to work
with the Cxtest people to develop a script for testing the Picasa 2 installation process, which
has been included in the main Cxtest CVS tree. We will describe some of the structure of
Cxtest here and what we have produced.
8.1 Cxtest Structure
There are three layers of components in the Cxtest design. For each piece of software to
be supported, there is a package. Each package has its own directory in which the package configuration file resides. This file describes various options for the package, such as software
and version prerequisites.
Also in the package directory are directories for the most basic unit, the test. Tests are basic shell scripts which do the work of actually running the program and interacting with it. A typical test script loads various environment variables needed by the Cxtest
programs, runs the desired program (usually under Wine), and performs the interactions.
The trickiest part of the Cxtest system is the actual interactions with the program. Since
we are using graphical applications, we cannot just use text streams to provide input and
output. Instead, Cxtest is capable of taking screenshots and comparing them against a
reference “correct” version to make sure the program is producing the correct output. As for
input, when creating a package, the Cxtest user can record various actions to the test script
file. These include running programs, entering text to the currently active text box, and
recording mouse clicks. We can also record test related activities, such as taking screenshots
and comparing good screenshots versus the test output.
For the purposes of I/O and taking screenshots, the Cxtest designers decided they really needed a cleaner, more controlled environment than could be achieved in a standard user’s X11 environment. To that end, in the process of recording or executing a test, Cxtest creates a local VNC connection. All input and output is accomplished through this
through the scripts. In addition, many Cxtest packages create their own user in the process
of running the test. This allows Cxtest to also control the user and file experience of the test.
Thus, for each test, Cxtest creates a controlled environment which can be exactly replicated
each time. All these are essential for the goals of regression testing.
When a package is populated with tests, the final step is to create a run. Run files tell Cxtest how to perform a particular set of tests. The first item tells Cxtest which user to run the test as, usually the test-specific user which will be created and deleted in the course
of running the tests. Next, the run can tell Cxtest to first install various required pieces of
software. Finally, we tell Cxtest to install the software we are interested in, and tell it any
additional tests we would like performed.
When a run is actually executed, Cxtest provides output listing which steps it is taking
and whether these have succeeded. The entire process is automated, with the user informed
at the end of the success or failure. The system is well regulated, and could easily be included
in nightly testing scripts. Indeed, much effort has been made to ensure the system runs as
smoothly as possible from start to finish.
8.2 Our Experience with Cxtest
While theoretically perfect for our needs, Cxtest, as its website warns, is not quite ready for prime-time use. While the system is not overly complex, there are lots of little components which combine somewhat mysteriously to produce the final test runs. Further aggravating the situation, the Cxtest documentation is very spotty, with some overall description of the structure but little in the way of specifics.
Luckily, the actual recording programs are, for the most part, self-documenting with
labeled menus. After a few hours of experimenting and learning the system, we were able to
generate a Picasa 2 install test. The actual process for creating this involved downloading
the Picasa 2 installation program, placing it in a standard Cxtest files directory, and running the record_package.sh script.
Our actual usage of Cxtest was not without problems, specifically with the click recording
mechanisms. Cxtest has two different ways of recording clicks. The first is the standard X11 version, which just records the mouse pointer’s coordinates and adds a line to the test file to run the clickat utility, which sends a click signal to the program at the appropriate
coordinates. There is a second version, known as “Wine clicks,” which generates clicks using
Windows terminology. In addition to the pure mouse coordinates, the Wine click utility
includes information about the window title, the button title, and additional identifying
information.
Our first version of the installation test used the X click mechanism, but on consultation
with the Cxtest users mailing list and IRC channel, we learned that X clicks are deprecated
and prone to failure. Thus, we went back and re-recorded our test using Wine clicks. The recording went fine, but when we actually ran the test, we found that it failed on a click timeout. Whereas the X clicks just wait for a specified delay period and click at the specified location, the Wine clicks are much smarter and look for the identifying information to know where to click. We emailed our problem to the Cxtest users list. A
CodeWeavers developer, Jozef Stefanka, figured out that the Cxtest Wine click recorder had
recorded the incorrect window title (it had dropped a trailing space character). He sent us
back a corrected version and added our Picasa 2 package to the Cxtest CVS tree.
8.3 Our Findings
Cxtest is a Wine-friendly graphical testing utility which we have used to generate an automated test for the installation of Picasa 2. We are including this test as part of our
overall code coverage script, so that differential code coverage can be generated completely
automatically for the Wine test suite and Picasa 2.
While we have been successful in generating a limited test of Picasa 2, Cxtest is not quite
ready for heavy-duty testing of Google applications. Key functionality like drag-and-drop is not yet supported. However, with some added features and improved documentation,
Cxtest may provide an excellent way to generate consistent code coverage data for a wide
array of Picasa functionality. In addition, Cxtest could be used in general to test Picasa or
other Google applications with Wine.
9 Writing Tests for Wine
As we alluded to in sections 1.2 and 3, we want to improve Wine’s test coverage in the areas
that the Google applications use, in the hope that Wine will become a more reliable way to run these Windows applications under Linux. As discussed earlier, we used our coverage utilities to determine which parts of the Wine source are used by these Google applications, and then wrote tests for the untested parts and incorporated them into the Wine test suite.
However, before describing the tests that we wrote for Wine, we must first explain Wine’s
rather intricate testing framework.
9.1 The Purpose of Wine Tests
In order to take the place of the Windows API, Wine must catch any Windows API calls made
by an application, and transparently provide the same functionality under Linux. To achieve
true transparency, Wine needs to implement a function that will take the same parameters,
perform the same desired functionality, and return the same results as in Windows for each
API call.
We can assure ourselves of this property by writing a test that calls a particular API
function. A test is essentially a small program that only calls the APIs we are trying to test
in addition to a bit more supporting code for creating the desired test conditions. When
actually writing a test, we read the Windows documentation to learn what results to expect
from various function calls, and then write code that makes sure all the results are as we
would expect.
Wine has specially constructed its test infrastructure to allow the tests to be compiled
and executed using both Wine in Linux and the Microsoft libraries in Windows. If the Wine
implementation is correct, test execution under Wine should perform exactly the same as
when we use the actual Windows libraries.
9.2 The Test Suite Structure
Test programs are located with the code in Wine they test. They are normally put in a
“tests” subdirectory of the main code directory. So for example, the LZExpand test code is
located in the directory dlls/lzexpand/tests.
The tests are written in C and use a large supporting code framework common to all
tests. The driver function is called START_TEST(testname). This is called when the test is
executed. The programs follow fairly normal execution from there. The test suite aggregates
statistics from all the tests on all parts of Wine. Test success is reported by the program
using an assert-like function called ok(), which takes a boolean expression indicating correct operation (e.g., checking that a pointer is not NULL) and a string which explains the problem if the check fails.
This means we need a mechanism for checking the veracity of return values and function
call results. This is often accomplished by hard coding the correct result into the program
and checking against that. For example, when checking that the LZExpand DLL correctly
decompresses and reads a file, we will encode the decompressed file data in our program and
check it against the result of the read call.
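As a concrete illustration, a minimal test file in this framework might look like the following sketch (the function under test is from the LZExpand discussion in section 9.4; the exact set of headers, the test name, and the error handling shown are illustrative rather than copied from our actual test):

#include <stdarg.h>
#include "windef.h"
#include "winbase.h"
#include "lzexpand.h"
#include "wine/test.h"

static void test_open_missing_file(void)
{
    OFSTRUCT ofs;
    char name[] = "nonexistent.txt";
    /* Opening a missing file for reading should yield an LZ error code. */
    INT hf = LZOpenFileA(name, &ofs, OF_READ);
    ok(hf < 0, "expected an error opening a missing file, got %d\n", hf);
    if (hf >= 0) LZClose(hf);
}

START_TEST(lzexpand_main)
{
    test_open_missing_file();
}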
9.3 Coverage-driven Test-making
As part of our testing of Google applications in Wine, we want to identify areas of Wine
that Google applications use but that are not well tested by the test suite and then write
appropriate tests for these areas.
Using our coverage tools, we discovered a few dlls that were not covered by the Wine test suite but still used by Picasa, including ddraw, shell32, ntdll, comctl32, and riched. Thus we decided these might be good dlls on which to concentrate our testing efforts.
Unfortunately, many of these dlls support graphical or UI-based functionality, and, as
we tried to test them, it became increasingly clear that there was no obvious mechanism for
testing this functionality inside the context of the Wine test suite. One potential solution
proposed was to simulate a UI or graphical element in a memory buffer, but the feasibility of this task is unknown. Because this solution is non-trivial, we abandoned the attempt to write unit tests for graphical and UI elements.
However, both shell32 and ntdll are modules that support important kernel-level operations, including the execution of programs, among a variety of other purposes. Both shell32 and ntdll are fairly well documented, and so it was surprising to find that they were largely untested. Pragmatically, because these modules are so ubiquitously used, most Windows applications act as inherent tests of them.
At the time of this document’s authorship, the tests for both shell32 and ntdll are still
under construction. However, we have previously written tests for two other dlls: LZExpand
(section 9.4) and AdvApi32 (section 9.5). Although these dlls were not chosen for further
testing using our code coverage utilities, they are nevertheless important contributions to
the Wine test suite and the functionality of the Google applications.
9.4 The LZExpand Test
The LZExpand dll was chosen by our liaison as a good entry point into writing tests. Not
only is it important for many installers, but the dll initially had no tests written for it
whatsoever.
Our additions constitute a fairly comprehensive test program for the LZExpand DLL
function calls LZOpenFile(), LZRead(), and LZCopy(). The implementation of the test
was fairly straightforward, following the conventions mentioned above. For example, for
LZOpenFile(), we call the function before creating a file and after creating a file with
different flags. Before the file has been created, we check that calls that open the file for
reading correctly produce a non-existent file error. After the file is created, we check that
calls to open the file for reading return a valid file handle.
The next few sections will discuss specific problems we needed to address in writing the
LZExpand test.
9.4.1 Reading a Compressed File
Part of our testing required decompressing and reading a compressed file using the LZRead()
call. The LZExpand DLL operates on files compressed using a variant of the Lempel-Ziv compression algorithm as implemented in the compress.exe program from Microsoft. To test
the reading functionality, we needed a compressed file our test program could operate on.
Since tests are discouraged from including extraneous data files beyond the source code, we elected to include a hex string in the code which represents the compressed file. When
we need the file, we write it out to disk (using the normal file manipulation functions, which
we assume are correct), and then use the LZ functions to manipulate it.
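In sketch form, the approach looks like this (the first eight bytes are the standard SZDD signature that compress.exe writes; the remaining bytes here are invented placeholders, not our actual test data):

static const unsigned char compressed_file[] =
{
    0x53, 0x5A, 0x44, 0x44, 0x88, 0xF0, 0x27, 0x33,  /* "SZDD" signature */
    0x41, 0x00, 0x0D, 0x00, 0x00, 0x00,              /* mode, filename char, size (invented) */
    /* ... compressed payload bytes ... */
};

/* Write the embedded data to disk so the LZ functions can operate on it;
   the ordinary file manipulation functions are assumed to be correct. */
static void create_compressed_file(const char *name)
{
    DWORD written;
    HANDLE file = CreateFileA(name, GENERIC_WRITE, 0, NULL,
                              CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    WriteFile(file, compressed_file, sizeof(compressed_file), &written, NULL);
    CloseHandle(file);
}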
9.4.2 Alternate File Endings
The MSDN specification for LZOpenFile states: “If LZOpenFile is unable to open the file specified by lpFileName, on some versions of Windows it attempts to open a file with almost the same file name, except the last character is replaced with an underscore (“_”). Thus, if an attempt to open MyProgram.exe fails, LZOpenFile tries to open MyProgram.ex_.”
It is unclear what the desired behavior for Wine is in this case. On some versions of
Windows, if the filename does not match exactly, failure results, while on other versions, the
filename with an “_” ending is also tried. Our testing has indicated that the former behavior exists on Windows 98, while the latter behavior is exhibited by Windows XP and Wine. The
Wine testing guidelines say we should accept the behavior of any valid version of Windows,
which means our test cannot judge one behavior or the other to be correct. With such a
predicament, we have left this section of code commented out.
This case also highlights a difficulty in using MSDN as the API reference. Since it
attempts to cover all of the Microsoft APIs for all versions of Windows, MSDN is not very
precise or up-to-date. This is not a surprising state of affairs since computer documentation
is often deficient. However, it reinforces the idea that testing under Windows and Wine is
the only real way to ensure compatibility.
9.4.3 Reading Past File End
Another problem was a bit stranger and one we have not completely resolved. To rigorously
test the LZRead behavior, we had our test read twice as much data as should have been
possible from the file. From the MSDN documentation for LZRead, it appeared this would
be handled as one would expect from the C library. That is, the read function should
return the number of bytes actually read (which, in this case, would be less than the number
requested), but it should not be an error. This was the functionality we observed with Wine
under Linux. However, attempting to read past the end of the file in Windows produces a
read error. Even though Wine’s behavior agrees with the MSDN documentation, because Wine does not function like Windows here, this is likely a bug in Wine.
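In sketch form, the disputed behavior looks like this (hf is an LZ file handle for the compressed file, and UNCOMPRESSED_SIZE is a hypothetical constant holding the file’s true uncompressed length):

char buf[2 * UNCOMPRESSED_SIZE];
INT ret = LZRead(hf, buf, sizeof(buf));  /* request more bytes than the file holds */
/* Under Wine, ret is the number of bytes actually read (a short read, as in
   the C library); under Windows, ret is a negative LZERROR_* code. */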
9.5 AdvApi32 Registry Test
When we were running Picasa’s slideshow option, Wine hung as described in section 7.4.2.
Further inspection via winedbg revealed that the bug was in advapi32.dll, and more specifically in dlls/advapi32/registry.c.
Once it was discovered where the bug was, differential code coverage was used to see
what sections of this code were already covered by the test suite. It turned out that the
code which housed the bug had been reasonably well tested and all tests were passing,
unsettling news indeed. Further investigation revealed that 28 of the tests which passed on Wine failed on Windows! The tests were corrected by taking the results Windows gave as the expected results, and a patch was submitted to the wine-patches mailing list.
The progress that was made can be observed on the Wine test site, which can be found at http://test.winehq.com/data/. While the problem is not yet fixed in the Wine source code, the tests have at least brought the problem to the attention of the Wine developers.
10 Assessment
10.1 Code Coverage
It was known from the start of this project that code coverage would not reveal where bugs are. Bugs can be found in very well tested code and yet be curiously absent in code which has yet to be tested. However, the question posed was not whether code coverage would find bugs, but whether it would assist in the test writing process. A single-syllable response to this is: “yes.” However, it did not help in perhaps the way originally envisioned.
10.1.1 Directing New Users
A program can never be considered fully tested. There are many criteria a company or open source project may use to declare a program well-tested; however, there is no sure way to know every angle has been covered and every screw turned. A tester can only hope to get the most bang for the buck when testing an application.
When the clinic team began writing tests for Wine, we had no idea where to start. The Wine source code is over half a million lines, making testing a rather daunting task. Once we ran LCOV to see what aspects of Wine had already been tested, we had a much better idea of where our efforts would help the most.
10.1.2 Motivating New Tests
While code coverage cannot measure the quality of a test, it can encourage more tests to be written. It is very rewarding to see a number increase (in our case, the percentage of code covered in a given module) and know that you caused that change. This
kind of positive feedback can help motivate people to write tests for previously uncovered
areas, because there is a reward: a concrete measure of how much they contributed.
10.1.3 Reexamining Faulty Tests
We did not find the advapi32.dll bug using code coverage techniques. We realized there
was a bug when Wine hung, and we used winedbg to track it down. However, once we
discovered that this code had been executed by the test suite, we knew the tests were
either not comprehensive or testing the wrong thing.
So while we could not have tracked down the bug using LCOV, once the bug was found,
LCOV yielded additional information that proved helpful in the testing and program
verification process.
11 Future Work
Our main goal was to make Wine more reliable for Google applications. The method by
which we sought to accomplish this was differential code coverage; however, there are
extensions to differential code coverage that could be explored, and other methods that
could have been employed.
For instance, Alexandre suggested that we check which code the test suite covers in Mi-
crosoft’s implementation of the Windows API. While there would be several large hurdles
to clear (e.g. no source code), there are code coverage methods that do not assume access
to Microsoft’s source. If a programmer could devise a good heuristic for measuring the
amount of assembly code covered, this would sidestep the legal issues and still give Wine
developers an idea of which areas are heavily exercised and which are neglected.
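To make this suggestion concrete, the following is a hypothetical sketch of one such
heuristic for Linux/x86-32: single-step a target process under ptrace and count the distinct
instruction addresses it executes. Everything here is illustrative; a practical tool would map
addresses back to loaded modules, handle threads and signals properly, and use something
far faster than single-stepping.

#include <stdio.h>
#include <unistd.h>
#include <sys/ptrace.h>
#include <sys/types.h>
#include <sys/user.h>
#include <sys/wait.h>

#define TABLE_SIZE (1 << 20)              /* open-addressing hash set */
static unsigned long seen[TABLE_SIZE];
static unsigned long distinct;

/* Record one instruction address, counting it only the first time. */
static void record(unsigned long addr)
{
    unsigned long h = (addr * 2654435761UL) & (TABLE_SIZE - 1);
    while (seen[h] != 0 && seen[h] != addr)
        h = (h + 1) & (TABLE_SIZE - 1);
    if (seen[h] == 0) {
        seen[h] = addr;
        distinct++;
    }
}

int main(int argc, char **argv)
{
    pid_t pid;
    int status;

    if (argc < 2) {
        fprintf(stderr, "usage: %s command [args...]\n", argv[0]);
        return 1;
    }
    pid = fork();
    if (pid == 0) {
        ptrace(PTRACE_TRACEME, 0, NULL, NULL);
        execvp(argv[1], &argv[1]);
        _exit(127);
    }
    waitpid(pid, &status, 0);             /* child stops at exec */
    while (WIFSTOPPED(status)) {
        struct user_regs_struct regs;
        if (ptrace(PTRACE_GETREGS, pid, NULL, &regs) == 0)
            record(regs.eip);             /* regs.rip on x86-64 */
        if (ptrace(PTRACE_SINGLESTEP, pid, NULL, NULL) != 0)
            break;
        waitpid(pid, &status, 0);
    }
    printf("%lu distinct instruction addresses executed\n", distinct);
    return 0;
}

Comparing the count for an application run under Wine against a Windows analogue of the
same measurement would give a crude differential picture without ever touching Microsoft’s
source.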
Our liaison suggested another improvement: a way to test graphical units of the source
code, such as edit controls. The clinic team toyed with this idea, but in the end decided
that there was not sufficient time to implement such a framework. Furthermore, the CXtest
developers appear to be already working in that direction with a separate framework which,
for modularity’s sake, may be easier to use. Please refer to sections 8.2 and 8.3.
A Online Resources
The following online resources were utilized during Wine test development.
http://www.cs.hmc.edu/clinic/projects/2004/google/ is the clinic team’s main web-
site. We have LCOV output alongside all patches submitted to Wine and LTP. We
also have posted all documentation and presentations.
http://www.cs.hmc.edu/clinic/twiki/bin/view/Google04/WebHome/ is the clinic team’s
wiki site, used mostly for internal purposes. Directions on how to use some of the
tools we developed, as well as student work logs, can be found here.
http://google:*****@www.cs.hmc.edu/clinic/mail/google04l/threads.html is a full
email archive. Any email sent to [email protected] was archived in the standard
mailman format.
http://test.winehq.com/data/ is a site which posts the results of the test suite. New
results are posted fairly frequently, and the site is useful to anyone who would like
to write new tests, since it gives a visual indication of which areas need testing. In
this respect, it is similar to some of the goals of our project.
ltp.sourceforge.net houses the Linux Test Project. All documentation and code relating
to LCOV can be found here.
http://gcc.gnu.org/ is the official site of the GNU Compiler Collection. While the team
did not delve too deeply into the internals of gcc, the parts we did learn about proved
to be very helpful.
http://gcc.gnu.org/onlinedocs/gcc/Gcov.html is the official GCOV documentation. It
goes over how to use GCOV and also delves into some of the internals of how gcc
outputs information for GCOV to read. We cover some of the material in our discussion
of GCOV earlier in this report.
http://www.winehq.com/site/documentation is home to all of Wine’s documentation.
There is information for developers, users, and porters. The clinic team contributed
to the developer’s manual, documenting how to run code coverage techniques on the
Wine source code.
http://www.linux-mag.com/2003-07/compile_01.html is a Linux Magazine article describing
how to use GCOV and a little about its internals. We found it a much gentler
introduction than GNU’s, though not as in-depth.
http://www.cxtest.org is the site which contains all of the current information regarding
CXtest. As of mid-April 2005, CXtest is still in development. Documentation is
minimal, and all of our experience has been documented in this report, which will be
forwarded to the CXtest group. While the project may not yet be in alpha form, it
is clear that it will be very valuable for testing Wine and other programs in the near
future.
B Wine Documentation Patch For Enabling Code Coverage
Index: documentation/winedev-otherdebug.sgml
===================================================================
RCS file: /home/wine/wine/documentation/winedev-otherdebug.sgml,v
retrieving revision 1.1
diff -u -3 -p -u -r1.1 winedev-otherdebug.sgml
--- documentation/winedev-otherdebug.sgml 26 Oct 2004 22:45:47 -0000 1.1
+++ documentation/winedev-otherdebug.sgml 18 Apr 2005 23:27:44 -0000
@@ -497,6 +497,183 @@ cvs update -PAd -D "2004-08-23 15:17:25
     </listitem>
   </orderedlist>
 </sect1>
+
+<sect1>
+  <title>Which code has been tested?</title>
+  <para>
+    Deciding what code should be tested next can be a difficult
+    decision. And in any given project, there is always code that
+    isn't tested where bugs could be lurking. This section goes
+    over how to identify these sections using a tool called gcov.
+  </para>
+  <para>
+    To use gcov on wine, do the following:
+  </para>
+  <orderedlist>
+    <listitem>
+      <para>
+        In order to activate code coverage in the wine source code,
+        when running <command>make</command> set
+        <literal>CFLAGS</literal> like so <command>make
+        CFLAGS="-fprofile-arcs -ftest-coverage"</command>. Note that
+        this can be done at any directory level. Since compile
+        and run time are significantly increased by these flags, you
+        may want to only use these flags inside a given dll directory.
+      </para>
+    </listitem>
+    <listitem>
+      <para>
+        Run any application or test suite.
+      </para>
+    </listitem>
+    <listitem>
+      <para>
+        Run gcov on the file about which you would like code
+        coverage information.
+      </para>
+    </listitem>
+  </orderedlist>
+  <para>
+    The following is an example situation where using gcov to
+    determine the coverage of a file could be helpful. We'll use
+    the <filename>dlls/lzexpand/lzexpand_main.c</filename> file.
+    At one time the code in this file was not fully tested (as it
+    may still be). For example, at the time of this writing, the
+    function <function>LZOpenFileA</function> had the following
+    lines in it:
+<screen>
+if ((mode&0x70)!=OF_READ)
+    return fd;
+if (fd==HFILE_ERROR)
+    return HFILE_ERROR;
+cfd=LZInit(fd);
+if ((INT)cfd <= 0) return fd;
+return cfd;
+</screen>
+    Currently there are a few tests written to test this function;
+    however, these tests don't check that everything is correct.
+    For instance, <constant>HFILE_ERROR</constant> may be the wrong
+    error code to return. Using gcov and directed tests, we can
+    validate the correctness of this line of code. First, we see
+    what has been tested already by running gcov on the file.
+    To do this, do the following:
+<screen>
+cvs checkout wine
+mkdir build
+cd build
+../wine/configure
+make depend && make CFLAGS="-fprofile-arcs -ftest-coverage"
+cd dlls/lzexpand/tests
+make test
+cd ..
+gcov ../../../wine/dlls/lzexpand/lzexpand_main.c
+ 0.00% of 3 lines executed in file ../../../wine/include/wine/unicode.h
+ Creating unicode.h.gcov.
+ 0.00% of 4 lines executed in file /usr/include/ctype.h
+ Creating ctype.h.gcov.
+ 0.00% of 6 lines executed in file /usr/include/bits/string2.h
+ Creating string2.h.gcov.
+ 100.00% of 3 lines executed in file ../../../wine/include/winbase.h
+ Creating winbase.h.gcov.
+ 50.83% of 240 lines executed in file ../../../wine/dlls/lzexpand/lzexpand_main.c
+ Creating lzexpand_main.c.gcov.
+less lzexpand_main.c.gcov
+</screen>
+    Note that there is more output, but only the output of gcov is
+    shown. The output file
+    <filename>lzexpand_main.c.gcov</filename> looks like this:
+<screen>
+        9:  545:    if ((mode&0x70)!=OF_READ)
+        6:  546:        return fd;
+        3:  547:    if (fd==HFILE_ERROR)
+    #####:  548:        return HFILE_ERROR;
+        3:  549:    cfd=LZInit(fd);
+        3:  550:    if ((INT)cfd <= 0) return fd;
+        3:  551:    return cfd;
+</screen>
+    <command>gcov</command> output consists of three components:
+    the number of times a line was run, the line number, and the
+    actual text of the line. Note: if a line is optimized out by
+    the compiler, it will appear as if it was never run. The line
+    of code which returns <constant>HFILE_ERROR</constant> is
+    never executed (and it is highly unlikely that it is optimized
+    out), so we don't know if it is correct. In order to validate
+    this line, there are two parts to this process. First we must
+    write the test. Please see <xref linkend="testing"> to
+    learn more about writing tests. We insert the following lines
+    into a test case:
+<screen>
+INT file;
+
+/* Check for non-existent file. */
+file = LZOpenFile("badfilename_", &amp;test, OF_READ);
+ok(file == LZERROR_BADINHANDLE,
+   "LZOpenFile succeeded on nonexistent file\n");
+LZClose(file);
+</screen>
+    Once we add in this test case, we want to know whether the line
+    in question is run by this test and works as expected. You
+    should be in the same directory as you left off in the previous
+    command example. The only difference is that we have to remove
+    the <filename>*.da</filename> files in order to start the
+    count over (if we leave the files, the counts simply accumulate;
+    e.g. line 545 below would show 19 runs)
+    and we remove the <filename>*.gcov</filename> files because
+    they are out of date and need to be recreated.
+  </para>
+<screen>
+rm *.da *.gcov
+cd tests
+make
+make test
+cd ..
+gcov ../../../wine/dlls/lzexpand/lzexpand_main.c
+ 0.00% of 3 lines executed in file ../../../wine/include/wine/unicode.h
+ Creating unicode.h.gcov.
+ 0.00% of 4 lines executed in file /usr/include/ctype.h
+ Creating ctype.h.gcov.
+ 0.00% of 6 lines executed in file /usr/include/bits/string2.h
+ Creating string2.h.gcov.
+ 100.00% of 3 lines executed in file ../../../wine/include/winbase.h
+ Creating winbase.h.gcov.
+ 51.67% of 240 lines executed in file ../../../wine/dlls/lzexpand/lzexpand_main.c
+ Creating lzexpand_main.c.gcov.
+less lzexpand_main.c.gcov
+</screen>
+  <para>
+    Note that there is more output, but only the output of gcov is
+    shown. The output file
+    <filename>lzexpand_main.c.gcov</filename> looks like this:
+  </para>
+<screen>
+       10:  545:    if ((mode&0x70)!=OF_READ)
+        6:  546:        return fd;
+        4:  547:    if (fd==HFILE_ERROR)
+        1:  548:        return HFILE_ERROR;
+        3:  549:    cfd=LZInit(fd);
+        3:  550:    if ((INT)cfd <= 0) return fd;
+        3:  551:    return cfd;
+</screen>
+  <para>
+    Based on gcov, we now know that
+    <constant>HFILE_ERROR</constant> is returned once. And since
+    all of our other tests have remained unchanged, we can assume
+    that the one time it is returned is to satisfy the one case we
+    added where we check for it. Thus we have validated a line of
+    code. While this is a cursory example, it demonstrates the
+    potential usefulness of this tool.
+  </para>
+  <para>
+    For a further in-depth description of gcov, the official gcc
+    compiler suite page for gcov is <ulink
+    url="http://gcc.gnu.org/onlinedocs/gcc-3.2.3/gcc/Gcov.html">
+    http://gcc.gnu.org/onlinedocs/gcc-3.2.3/gcc/Gcov.html</ulink>.
+    There is also an excellent article written by Steve Best for
+    Linux Magazine which describes and illustrates this process
+    very well at
+    <ulink url="http://www.linux-mag.com/2003-07/compile_01.html">
+    http://www.linux-mag.com/2003-07/compile_01.html</ulink>.
+  </para>
+</sect1>
 </chapter>
C --missing-from-baseline Patch For LCOV
Can be found at: http://www.cs.hmc.edu/clinic/projects/2004/google/patches/missing-from-baseline.patch
D --legend Patch For LCOV
Can be found at: http://www.cs.hmc.edu/clinic/projects/2004/google/patches/lcov-legend.patch
E Automated Wine Coverage Script
Can be found at: http://www.cs.hmc.edu/clinic/projects/2004/google/wine_cov