I am an experienced Linux developer but a Wine newbie. Recently I have been
exploring the possibility of using CMake-2.8.1 and MinGW-4.5.0 in a
Wine-1.1.42 environment (the backported Debian Lenny packages from WineHQ)
to check that the CMake-based build systems I use for various software
projects work properly on that Windows platform. I am collecting notes for
a wiki entry I plan to write about using this platform for software
development, in case anybody else here is interested.
My initial experiences have been promising for this platform. For example,
I have been able to build CMake itself as well as build and test PLplot.
However, one issue of concern is that builds are extremely slow under Wine
compared to Linux.
Here are some comparisons for components of CMake builds done in an
initially empty build tree.
Wine I.
wine@raven> time wine cmake -G "MinGW Makefiles" \
"-DCMAKE_INSTALL_PREFIX=z:/home/wine/cmake/install1" \
z:/home/software/cmake/cmake-2.8.1_patched/ >& cmake.out
real 14m17.124s
user 0m16.233s
sys 0m6.320s
That real time is a factor of ~40 (!) longer than the sum of user and sys
time, which implies Wine is spending most of its time in a wait state for
this workload, with no CPU activity at all. I frankly don't believe those
user and sys numbers, and therefore don't necessarily believe this
explanation, because the user and sys numbers are clearly unreliable for the
next experiment. However, for all experiments the real time is reliable,
as confirmed by my own independent time measurements.
Wine II.
wine@raven> time wine mingw32-make install >& make_install.out
real 20m13.421s
user 0m0.136s
sys 0m0.012s
For this workload the ratio between the "wait" time and total CPU time is a
factor of ~8000 (!). But these user and sys numbers are so tiny compared to
those of the corresponding Linux experiment that I don't trust them. The
real time, however, is reliable.
Wine III.
Here are the corresponding latency numbers (i.e., how long it takes make to
figure out that nothing needs to be done when this command is run right
after the previous one).
wine@raven> time wine mingw32-make install >& make_install.out1
real 1m39.018s
user 0m0.100s
sys 0m0.052s
By looking at make_install.out1 I confirmed that indeed nothing was done
other than to check dependencies (which normally involves running make a
fairly large number of times for CMake-based build systems).
Here are the corresponding timing numbers under Linux for the same three
experiments for an initially empty build tree and Linux versions of
CMake-2.8.1 and gcc-4.3.2 (for Debian Lenny).
Linux I.
wine@raven> time cmake -G "Unix Makefiles" \
"-DCMAKE_INSTALL_PREFIX=/home/wine/cmake/install_linux" \
/home/software/cmake/cmake-2.8.1_patched/ >& cmake.out
real 0m34.015s
user 0m18.513s
sys 0m6.100s
This user+sys time is actually reasonably similar to its Wine I equivalent,
but that may just be a coincidence. The reliable conclusion to draw from
this is that the overall real time is a factor of ~25 (!) faster, i.e., ~13
minutes (!) less than the Wine I equivalent. Why?
Linux II.
wine@raven> time make install >& make_install.out
real 2m12.805s
user 1m56.607s
sys 0m13.665s
Here user+sys add up closely to the real time, so there is virtually no time
in this Linux build case where the CPU is idle. The real time is a factor of
~9 (!) faster, i.e., ~18 minutes (!) less than in the Wine II case.
Linux III.
wine@raven> time make install >& make_install.out1
real 0m0.869s
user 0m0.592s
sys 0m0.272s
This "latency" time required to figure out dependencies and decide nothing
needs to be done (confirmed by looking at make_install.out1) is a factor
(in real time) of ~110 faster or 1m34s faster than the Wine III case.
One Wine slowness factor that affects all these results is that Wine has a
long start-up latency for every task that is launched by a command such as
cmake or make in the Wine environment. Indeed, when I did timing experiments
for simple commands such as "wine cmake --version" or "wine gcc --version"
that are normally instantaneous on Linux, there always seemed to be a
start-up latency of about 0.25 seconds which was not from the wine command
itself (since commands like "wine lxx" returned much faster than 0.25
seconds, saying the lxx.exe command could not be found). A CMake build and
install (the work done in the Wine II and Linux II timing experiments)
requires roughly 3000 commands to complete. So the Wine startup latency
(3000 x 0.25 s is about 12.5 minutes) accounts for roughly 12 minutes of the
measured 18 minute discrepancy between Wine II and Linux II, and considering
the roughness of that 12-minute estimate it is reasonable to ascribe all the
difference to Wine startup latency. So let's also take that as a working
hypothesis to explain the Wine I versus Linux I discrepancy of ~13 minutes
(where I am not sure how many commands are run) and the Wine III versus
Linux III discrepancy of ~1m38s (where I know many fewer commands were run
than in the Wine II and Linux II cases).
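In case anyone wants to reproduce that per-command measurement, the
following minimal bash sketch (assuming cmake.exe is on the Wine path and a
wineserver is already running) amortizes timer noise over many invocations:

  # time N trivial Wine invocations and report the average per-command latency
  N=20
  start=$(date +%s.%N)
  for i in $(seq $N); do
      wine cmake --version > /dev/null 2>&1
  done
  end=$(date +%s.%N)
  echo "average per-command latency: $(echo "($end - $start) / $N" | bc -l) s"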
Given the known Wine startup latency for simple commands, and the
possibility that all of these timing comparison results for the same build
on Linux and Wine can be explained by it, what can I do to reduce that
latency? Note that all Wine results above were obtained with "wineserver -p"
run beforehand. Also, the environment variable controlling Wine debugging
output was set to WINEDEBUG=-all (as suggested in a web article I read) to
eliminate that particular source of latency.
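For reference, the setup I do before the timing runs boils down to the
following (bash sketch; adjust the syntax for your shell):

  export WINEDEBUG=-all   # silence Wine debug channels
  wineserver -p           # keep a persistent wineserver running between commands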
If some Wine developer suggests rebuilding Wine with a different
configuration than that used in the WineHQ Debian Lenny packages for 1.1.42,
I would be happy to try that. I would also be happy to test any further
changes they would like to make to the Wine source code to reduce startup
latency. For example, if Wine establishes a useless GUI environment each
time a command such as gcc is run under Wine, I would be happy to test a
run-time option to eliminate that.
Alan
__________________________
Alan W. Irwin
Astronomical research affiliation with Department of Physics and Astronomy,
University of Victoria (astrowww.phys.uvic.ca).
Programming affiliations with the FreeEOS equation-of-state implementation
for stellar interiors (freeeos.sf.net); PLplot scientific plotting software
package (plplot.org); the libLASi project (unifont.org/lasi); the Loads of
Linux Links project (loll.sf.net); and the Linux Brochure Project
(lbproject.sf.net).
__________________________
Linux-powered Science
__________________________
Speed/latency issues for development in a Wine environment
I've done a number of similar builds
( see http://wiki.winehq.org/UnitTestSuites )
and agree Wine is slow at running builds.
See in particular
http://bugs.winehq.org/show_bug.cgi?id=21423
Please file an enhancement request at http://bugs.winehq.org with your test data; perhaps your bottleneck is different.
Speed/latency issues for development in a Wine environment
On 2010-06-12 01:51-0500 DanKegel wrote:
That's an interesting idea, but I would suggest taking it a lot further,
with WineHQ providing the facilities to report testers' build efforts rather
than having them scattered over the different projects. In particular,
CMake-based build systems (such as those for KDE, for many other projects
such as PLplot, and for CMake itself) have a ctest facility that allows
interested users to report nightly test results to a simple dashboard hosted
anywhere. So there is a huge amount of potential here for projects with
CMake-based build systems to have a central reporting facility at WineHQ
that automatically publishes all users' nightly build and test efforts to
the web.
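As a rough sketch of how little the user side would involve (the dashboard
location comes from each project's own CTestConfig.cmake, so this is just
the generic invocation, with a hypothetical build-tree path):

  cd /path/to/project/build-tree
  wine ctest -D Nightly   # configure, build, test, and submit results to the project's dashboard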
There is a big caveat for this idea at the moment, however. Many (perhaps
all) versions of CMake up to and including the latest, 2.8.1, use a
GetShortPathName() / GetLongPathName() trick to get the actual case of a
filename, and that generates a hash collision between the names of two CMake
language-support files (see http://bugs.winehq.org/show_bug.cgi?id=22286).
<aside> I think this was dismissed as CMake just being unlucky in their
choice of file names, but an alternative explanation is that the
GetShortPathName hash function is poorly implemented within Wine so the
probability of collisions is higher than it should be.
</aside>
CMake has now replaced that trick in their git version by code that does not
call GetShortPathName (see
http://cmake.org/gitweb?p=cmake.git;a=c ... 7be27f49d3).
Until the corresponding patch (located at
http://cmake.org/gitweb?p=cmake.git;a=p ... 7be27f49d3)
becomes part of an official CMake release, you can bootstrap a patched CMake
under Wine by renaming the file with the hash collision (which disables
Fortran support) so that CMake can build a full patched version of itself
(which includes Fortran support). Anyhow, that's the method I am using for
my own CMake/MinGW/Wine tests, and why a Wiki cookbook (still being written)
is fairly essential for anyone else trying to follow me before the patched
version of CMake is released.
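Here is a hedged sketch of what that bootstrap looks like in practice (bash;
the paths and the <colliding-module> placeholder are hypothetical stand-ins,
the real file name being the one discussed in Wine bug 22286):

  # In the Modules directory of the CMake that triggers the collision, move the
  # offending language-support file out of the way (sacrificing Fortran support
  # in that copy of CMake only):
  mv <colliding-module>.cmake <colliding-module>.cmake.disabled
  # Then configure and build the patched CMake source under Wine as usual:
  cd /path/to/empty/build/tree
  wine cmake -G "MinGW Makefiles" z:/path/to/cmake-2.8.1_patched
  wine mingw32-make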
Will do. To summarize the situation, you have found file access times are
slower on Wine than Windows (and presumably also slower on Wine than Linux).
I have found that every simple command I try under Wine (such as "cmake
--version", "gcc --version", and "mingw32-make --version") seems to have a
large startup latency of about 0.25 seconds that cannot be attributed to
file access times. So I expect both your timing tests and mine for building
software under Wine see a combination of the startup latency issue for the
thousands of commands that are run in a typical build as well as the file
access time issue.
Alan
Speed/latency issues for development in a Wine environment
On 2010-06-11 18:09-0700 Alan W. Irwin wrote:
The previous timing numbers were done with wine-1.1.42. Following a
suggestion made off-list to me, I have built wine-1.2-rc3 today (with
CFLAGS=-O3) and immediately noticed that the start-up latency for simple commands
(e.g., cmake --version) was reduced from about 0.25 s to about 0.15 s. I
don't know how the previous Debian wine-1.1.42 packages were optimized so
that difference may just be due to an optimization difference or an actual
improvement in the Wine code from 1.1.42 to 1.2-rc3. However, to see how the
-O3 optimized wine-1.2-rc3 does with the full tests, I have redone my timing
tests for I (cmake), II (make install), and III (make install latency), and
the following reductions have been observed.
Wine I.
wine@raven> time wine cmake -G "MinGW Makefiles" \
"-DCMAKE_INSTALL_PREFIX=z:/home/wine/cmake/install1" \
z:/home/software/cmake/cmake-2.8.1_patched/ >& cmake.out
real 14m17.124s (wine-1.1.42)
real 8m18.041s (wine-1.2-rc3)
Wine II.
wine@raven> time wine mingw32-make install >& make_install.out
real 20m13.421s (wine-1.1.42)
real 12m4.198s (wine-1.2-rc3)
Wine III.
wine@raven> time wine mingw32-make install >& make_install.out1
real 1m39.018s (wine-1.1.42)
real 0m51.045s (wine-1.2-rc3)
For reference, here are the corresponding Linux numbers, which are still
much better than the Wine-1.2-rc3 results.
Linux I.
wine@raven> time cmake -G "Unix Makefiles" \
"-DCMAKE_INSTALL_PREFIX=/home/wine/cmake/install_linux" \
/home/software/cmake/cmake-2.8.1_patched/ >& cmake.out
real 0m34.015s
Linux II.
wine@raven> time make install >& make_install.out
real 2m12.805s
Linux III.
wine@raven> time make install >& make_install.out1
real 0m0.869s
To summarize these findings, it appears that the change in startup latency
for simple commands strongly correlates with the change in timing results
for build steps I, II, and III on Wine. This makes sense since we know that
several thousand commands are run for, e.g., build step II, so a startup
latency of hundreds of milliseconds per command must be a real build
performance killer (~3000 commands x ~0.1 s less latency per command is ~5
minutes, roughly in line with the ~8 minute reduction seen for step II).
So what can be done to reduce the simple-command startup latency from the
current ~150 ms on Wine down much closer to the Linux startup latency for
the same commands, which is typically ~1 ms? Simple commands requiring 150 ms
to start must carry an awful lot of extra startup baggage. Is all that
baggage really needed for command-line programs such as cmake, mingw32-make,
gcc, etc.?
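One diagnostic that might help narrow down where that time goes is to
separate the cost of starting a wineserver from the per-process setup cost,
along these lines (bash sketch; wineserver -k kills any running server,
wineserver -p keeps one resident):

  wineserver -k                # make sure no wineserver is left running
  time wine cmake --version    # cold start: includes launching a new wineserver
  wineserver -p                # now keep a wineserver resident
  time wine cmake --version    # warm start: per-process overhead only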
Alan
Speed/latency issues for development in a Wine environment
I will be unsubscribing from wine-users, and the rest of this thread will be
posted to wine-devel with the same subject line.
Alan