
Giving Away Software For Free Costs More Than You Would Think (Part 3)


For those of you less inclined to surfing all over my website, here’s a quick recap:

Part 1: Did you know that the Ubuntu software repository contains 23,164 packages? I also introduce the Constructive Cost Model (COCOMO), and finally, reference an article quoting that Debian ‘cost’ 1.9 billion dollars to develop back in the year 2000.

Part 2: I detail my plan of attack to find out how much the Ubuntu distribution ‘cost’ to develop using COCOMO and SLOCCount by David Wheeler. Without his excellent program, this article would not be possible.

Downloading the source code took nearly 2 days. Due to the nature of apt-get, you must first install each package’s build dependencies, then download the actual source code. Some annoying packages (such as LILO) require user input once downloaded, which paused the download process until I provided some human input.

It took SLOCCount over 8 hours to process all 51,447 downloaded files on a 3.2 GHz Intel Core 2 Duo processor.

I started this project with a brand new Gutsy Gibbon installation. By the time all was said and done, my hard drive contained 92.8 GB of gzipped source files, patches, graphics, etc. – everything needed to build Ubuntu from scratch – plus nearly 10 GB of installed dependencies.

Opening Nautilus and browsing to /usr/src freezes my computer for nearly 2 minutes while my 10,000 RPM SATA hard drive reads the directory structure. Running ls recursively on /usr/src takes over 20 minutes to finish.

Computer software engineers earn approximately $75,000 a year, according to the Bureau of Labor Statistics.

Without further ado, here are the numbers you came for:

Source Lines Of Code: 121,131,661
Person-Months: 1,208,967.6
Person-Days: 24,179,353
Person-Hours: 193,434,823
Duration Years: 18.4
Duration Months: 221.0
Duration Days: 4419
Duration Hours: 35,354

Total Price: $7,033,290,160
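For the curious, SLOCCount’s estimate is based on the Basic COCOMO ‘organic’ model. Here is a rough Python sketch of how totals like these are derived; the coefficients are COCOMO’s published organic-mode values, and the salary/overhead defaults are SLOCCount’s own, not necessarily the exact settings used for this article, so the output approximates rather than reproduces the published figures.

```python
def cocomo_cost(sloc, salary=56286, overhead=2.4):
    """Basic COCOMO 'organic' model as documented by SLOCCount.
    The salary/overhead defaults are SLOCCount's, not this article's."""
    ksloc = sloc / 1000.0
    effort_pm = 2.4 * ksloc ** 1.05          # effort, person-months
    schedule_mo = 2.5 * effort_pm ** 0.38    # schedule, months
    cost = (effort_pm / 12.0) * salary * overhead
    return effort_pm, schedule_mo, cost

effort, schedule, cost = cocomo_cost(121_131_661, salary=75000)
print(f"Effort:   {effort:,.0f} person-months")
print(f"Schedule: {schedule:,.1f} months")
print(f"Cost:     ${cost:,.0f}")
```

With the $75,000 salary and SLOCCount’s default 2.4 overhead multiplier, this lands in the same ballpark (just under $8 billion) as the total above.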

Ubuntu Gutsy Gibbon and all available software packages included in the software repositories would take over 7 BILLION dollars to rewrite from the ground up. Don’t believe it? Remember, this number counts source code from projects such as Mozilla Firefox and OpenOffice.org, which are behemoths in and of themselves.

If you enjoyed this article, please Digg it, or subscribe to my RSS feed – I’ve got a new article coming up this weekend that ‘violates’ all that is holy with Ubuntu Christian Edition.

Article Index:
Part 1: Introduction

Part 2: Method
Part 3: Conclusion

49 replies on “Giving Away Software For Free Costs More Than You Would Think (Part 3)”

Holy crap.
100 GB compressed? Any guesses on what that’d take if it was all decompressed and in use?
(No, don’t do it, it’s not worth losing your PC for a year)

Nice work, I’d digg it twice if I could, and I’ll be pointing MS fanboys your way next time I hear someone bragging about how much work went into Vi$ta :oP

Building it from the ground up would cost that much, but the cost would be divided over many, many years (since it is a “project” that has been running for decades now). It would be interesting to compare this number to MS’s costs to produce/update Windows.

121 million source lines of code costs $7 billion?!? I don’t think so. What metric were you using for average lines of code generated per day? Industry standards can range from 10 to 1000. I’ve worked at NASA on mission-critical software, and the projects we worked on used 13 SLOCs/day as our metric, taking into account extensive pre-planning and testing.

Using an average of 100 SLOCs/day would come out to 1.21 million days, which comes out to roughly 3312 years. Multiply that by $75,000/year and it comes out to around a quarter of a billion dollars. Still a lot of money, but an order of magnitude smaller than your $7 billion.

I’m not sure I quite agree with your analysis of the ‘cost of Ubuntu’. It seems to me that all that source code would include things like the Linux kernel source tree, the GNOME and KDE source trees, and countless other packages that were not developed in any way by the Ubuntu developers. The Ubuntu folks have done a great job of putting these packages together and making the process of running Linux, and installing the latest packages, much easier than it has ever been in the past. But I’m curious how many lines of code have been written specifically for the Ubuntu project, and I really doubt it is anywhere near 121 million lines of code.

In addition, there are the inherent problems with using SLOC counts to determine the amount of effort that went into a project. You can believe that Ubuntu cost 7 billion dollars, but be aware then that Debian – the basis for Ubuntu – by this same logic cost on the order of 14 billion dollars, since according to Wikipedia (http://en.wikipedia.org/wiki/Source_lines_of_code) Debian has 215 million lines of code. While this is all free software that is being given away, crediting the Ubuntu group – an organisation that had its first release in 2004 – with the development of the Linux kernel source tree (started in 1991) seems just a bit out there to me.

Yeah, the SLoC assumption seems wrong… it assumes 100 SLoC per month instead of per day.

I think most of us would be out of jobs if we only produced 100 SLoCs per month.

The math I get is:

Lines of Code: 121,131,661
Man Days @ 100 lines/day: 1,211,316.61
Man Weeks @ 5 days/week: 242,263.322
Man Years @ 52 weeks/year: 4,659 (rounded up)
Total Cost @ $75k/year: $349,418,252.88
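For what it’s worth, the arithmetic in the comment above is internally consistent. A quick sanity check (the 100 lines/day rate, 5-day week, and $75k salary are the commenter’s assumptions, not measured figures):

```python
# Re-running the commenter's arithmetic; the 100 lines/day rate and
# $75,000 salary are their assumptions, not established figures.
sloc = 121_131_661
man_days = sloc / 100            # @ 100 lines/day
man_weeks = man_days / 5         # @ 5 days/week
man_years = man_weeks / 52       # @ 52 weeks/year
total_cost = man_years * 75_000  # @ $75k/year

print(f"Man days:   {man_days:,.2f}")    # 1,211,316.61
print(f"Man years:  {man_years:,.0f}")   # 4,659
print(f"Total cost: ${total_cost:,.2f}") # $349,418,252.88
```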

Why don’t you try running that on just Ubuntu’s patches, instead of taking into account every piece of code in there? The Ubuntu devs didn’t code ALL of that. You can find that information on Launchpad.net, in Bazaar.

The point is not the work just the Ubuntu developers did, but what the collective as a whole would cost to develop.

“Without further adieu” should be “Without further ado”.

Sorry :S

Edit by Wayne on Oct 5 at 5:02 pm
Thanks bud!

He does cite his source for his Constructive Cost Model.

Lines of code/month/day/hr may be debatable, but if you consider the complexity of this kind of code, you will be lucky for a dev to get in 20 to 50 lines/month in bug fixes, just from the time spent in researching the bug, build time and testing.

Who cares?

Linux = Beta software by design. Linux on the desktop – I hope I never see such an overated mess of missing dependencies, broken config files, and countless stupid forks of existing projects, etc, etc hit the big time.
And fuck all versions of the GPL as well.

Dear Mr. Linux:

Who cares? Last I checked – over 10,000 people cared enough to view this page, and you personally cared enough to comment.

*shrug*

Mr. Fuck Linux: If you’re looking for a more ‘engineered’ approach, take a look at the BSDs (FreeBSD, NetBSD, OpenBSD). I’ve never had a dependency problem using FreeBSD ports. Yet, on Windows, it’s commonplace to run into DLL hell, especially with some of the Windows “recovery” tools.

Some Linux/BSD tools are not repeatable on Windows. Can you upgrade IIS live? On Linux/BSD, you can download, re-compile, and test Apache while the previous version is running. A quick service restart (even a graceful restart) and voilà – new version. Almost every Windows update requires a full reboot, even in 2007. Can you configure your kernel (the Windows kernel) to suit your specific needs? Nope. There are hundreds of things that open source software can do that Windows cannot even touch (in terms of security, stability, etc).

I think you’ve got a real problem here… Here’s what the time command shows for 2 different hard drives here:

— IDE/Maxtor/7500 RPM/300G
/data$ time find . > xxx.txt

real 0m31.682s
user 0m0.192s
sys 0m1.372s
/data$ wc -l xxx.txt
208681 xxx.txt
/data$ df /data
Filesystem Type Size Used Avail Use% Mounted on
/dev/sdb1 ext3 294G 121G 159G 44% /data

— SATA/IBM/7500 RPM/320G
/oldstuff$ time find . > xxx.txt

real 1m27.638s
user 0m0.938s
sys 0m2.678s
/oldstuff$ wc -l xxx.txt
988363 xxx.txt
/oldstuff$ df /oldstuff
Filesystem Type Size Used Avail Use% Mounted on
/dev/sdb1 ext3 313G 273G 27G 87% /data

Now each is on a different machine (the first on an AMD64, the second on an IA64, both with 2 GB of RAM), but I never had such hangs just listing files. There’s a directory in there with over 110k files (fonts, TTF files). Both drives are formatted ext3. Both are shown in less than 2 minutes with Nautilus and Konqueror (which is better, in my opinion), in detailed list view mode and with all plugins shut off (no previews).

one thing that may make the difference is that for the last 10 years, and probably more, in my /etc/fstab I ALWAYS add:
/data ext3 defaults,noatime,data=writeback 0 2

Could you try it to make new numbers with those parameters?

your site is cool but the sliding RSS icon is WAY too distracting. I went to the front page and, scrolling down, I couldn’t stand it halfway through…

You forgot a major factor — most of these programs have thrown away TONS of lines of code. The actual number of lines written so far is way beyond what you see today.

Heh, I was reading my post again hoping someone had replied to me, and I noticed stupid errors. When I wrote:
/dev/sdb1 ext3 313G 273G 27G 87% /data
it should be:
/dev/hdc1 ext3 313G 273G 27G 87% /oldstuff
As the disk is on another machine, I copied/pasted the previous df output and just filled in the disk size, free space and mount point…

and for the /etc/fstab settings it’s not:
/data ext3 defaults,noatime,data=writeback 0 2
but:
UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx /data ext3 defaults,noatime,data=writeback 0 2
or, if you prefer:
/dev/sdb1 /data ext3 defaults,noatime,data=writeback 0 2

sorry for the mistakes, I’m still interested in the results you get with those settings… I’m quite sure the noatime flag alone will make things at least 100% faster.

I liked this article a lot. It just goes to show what can be achieved when everyone is willing to pitch in to help create a resource we can all use, like a linux operating system.

Reminds me of the (slightly cheesy) “spirit of ubuntu” translation from the ubuntu guys – “i am what i am because of who we are”.

I wonder how much money has been pumped into Vista if you take into account the previous windows kernels (as far back as perhaps win95 or win3.1).

With directories this large you need to turn on ext2/ext3 directory indexing (the dir_index feature):

tune2fs -O dir_index /dev/sdXN

then run e2fsck -fD on the unmounted filesystem to rebuild the indexes for existing directories. It defaults to off since it slows things down for normally sized directories.

It does not matter that Canonical did not write all the code, only that it exists and makes Linux/Ubuntu what it has become.

Interesting article.

Regardless to the exact numbers I think your general point is very valid. Trying to write the amount of code available in your basic Linux distro would cost a fortune to produce if you had to pay for it. You’d not only be paying for the code but all the infrastructure involved in the production of the code. No small thing.

I may not know the exact figures for the cost of hosting free downloads of Ubuntu, but I believe there are hidden costs behind these expenses. Ubuntu has one of the leading download counts in terms of users, and they also distribute free CDs to encourage users to try Ubuntu.

As far as marketing is concerned, there is certainly no lack of interest from advertisers. That will keep coming as long as Ubuntu keeps enticing users with its latest package repository.

For all those people trying to do the numbers yourselves, take a look at SLOCCount and get a few clues.

There are overheads involved when paying someone $75000 a year.

Dear Linux fanboys,

Please stop pontificating on *nix-centric websites such as Slashdot about the evils of Microsoft and spend more, much more, time sorting out the problems with Linux which are still being overlooked.

When Fedora was seemingly the up-and-coming distro I installed it on my second x86 machine. The install went fine and I restarted the PC. Imagine my horror when, logging into my account, I saw that the entire display was shifted to the left about 3/4 of an inch. The solution? Fire up some obscure app via the command line called xvidtune, adjust the display, save the settings to a text file, log in as root, and edit xorg.conf to add the settings from the text file, hoping I hadn’t messed things up. Why, with all the millions of man-hours put in by talented Linux coders, can something like this not be addressed? I’ve never had to jump through hoops like that on my Windows XP machine. Why is it that even freeware on Windows is usually much more polished than an equivalent app under Linux? And what about development environments? I’ve yet to see anything on Linux that can match Microsoft’s Visual Studio; even their free offerings in the Visual Studio Express range are far superior to anything I’ve used under Linux, which was clunky and lacking in power and features. I would have thought that Linux developers would by now have produced something to rival Visual Studio. Your strength in diversity is also your weakness – too many half-finished projects where the developers simply lost interest and moved on. As I said in my earlier post, “Linux – Beta software by design”. I’ve been dabbling with Linux since the days when Mandriva was still known as Mandrake; I’ve also tried Debian, PCLinuxOS, Fedora, CentOS, and others I can’t even remember, and I’ve always gone back to Windows in the end. There are simply too many issues with the various distros; it’s not worth the hassle in my opinion.

SLOCCount seems to be rather out of touch with reality.

I created a two-liner “Hello World” and it said that it had cost $40!

$ sloccount hello.c

Total Estimated Cost to Develop = $ 40
(average salary = $56,286/year, overhead = 2.40).
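Actually, $40 is roughly what the Basic COCOMO organic formula predicts with the defaults quoted in that output. A back-of-the-envelope check (this re-implements the formula rather than calling SLOCCount itself):

```python
# Back-of-the-envelope check of the $40 hello.c figure, using the
# quoted defaults ($56,286/year salary, 2.40 overhead) and the Basic
# COCOMO organic-mode effort formula that SLOCCount documents.
ksloc = 2 / 1000.0                      # two-line hello.c
effort_pm = 2.4 * ksloc ** 1.05         # effort in person-months
cost = (effort_pm / 12) * 56286 * 2.40  # person-years * salary * overhead
print(f"${cost:.0f}")                   # prints roughly $40
```

So the $40 isn’t a bug; it’s just the model applied, perhaps absurdly, to a two-line program.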

Wow, great article. Too bad it only attracted one MS shillbot spouting the same old non-starters that the shills of yesterday tried. What, did they have you in mothballs somewhere, Fuck Linux? And why would an MS fanboy still be running XP? Something wrong with Vista??? HEHEHEHE.

I think the $7B is under-estimated rather than over. It doesn’t take into account the fact that most of this code was not written in an instant, but represents years, even decades, in the case of the X Window System, the Linux kernel, gcc, and other basic components common to not only Linux but also BSD. Just how big is Emacs now, anyway? :^)

Indeed, one shill. One annoying, ignorant fool. All that is clear is that his copy/paste button works.

@ Bob Robertson:
Why do you consider me “one shill. One annoying, ignorant fool” for recounting my woes with Linux and my subsequent decision to go back to Windows? Don’t you understand I just want to get on with the tasks at hand, whatever they may be, and not have to spend hours fiddling around with the OS just to get things working? Yes, my copy/paste button is working, as are the rest of my Windows apps, unlike a lot of the Linux beta-ware I tried to use that seemingly comes as standard with many a distro.
Linux on the desktop – I hope I never see the day.

“Linux = Beta software by design.” Can I patent that?

Yes, you can patent that under a GPL license :D.

Anyway, if Linux is “beta software by design”, why does it run almost all web servers across the whole freaking World Wide Web? Web servers can’t afford to have bugs, simply because of the valuable information contained in most of them.
Can you imagine the World Wide Web running on Windows? I don’t think so.

@kedafi:

I was talking about Linux on the desktop, not in a server environment. I’ve found that there are just too many issues with it on the desktop to make it worth it. Even Torvalds himself has criticised the Gnome community for their way of doing things. There’s too much diversity and a bewildering array of options for the average user to be bothered with. More focus please you Linux fanboys.

Mr Fuck Linux is probably suffering from a case of “won’t work for me”. This is a well-proven ESP condition, where the results are slanted towards the negative.

I think I have the same problem with Windows. No matter how nice I am to it, it always seems to crash, get malware, and go slower and slower. I have to get all these add-ons to clean up the registry, adware, viruses, and disused programs. Every time I add new hardware I have to reboot, and I have to do that for some applications as well. I also can’t figure out where the other virtual desktops are hidden.
I can’t use the GIMP because the virus checker thinks it is malware. It doesn’t “just work” for me. Now I have to go out and buy new hardware, as Vista won’t run on my current system, and I need it for work-related reasons.

Ubuntu, on the other hand, is perfectly stable. Apart from kernel-upgrade reboots, the system has been up for over a year. I can install new hardware and applications without a reboot. I can share the system with other users, either locally or remotely. I can change the network on the fly.

I really don’t understand why Mr Fuck Linux has all this trouble with Ubuntu and I have huge problems with Windows. Like I said, maybe it’s ESP.

Linux, like all operating systems, sucks. But it sucks so much less than anything else, it’s not even funny.

@Fuck Linux

How many years ago did you try Fedora? Is it still the “up and coming” distro? Have you tried some of the latest? Give the newest Ubuntu a try, it has some great video configuration enhancements.

Some of the things you are saying about windows apps vs linux apps are just plain false.

You seem to be into Visual Studio, have they finally adopted an ANSI compliant C++ compiler? When I was working on the VS team back in ’01, they had not, nor was there any plan to do so. I haven’t kept up on VS since then. Has Microsoft released the source control program that they used internally? It was really cool and didn’t have the problems of Visual Source Strife (MS slang).

Don’t judge Linux by the sum of open source projects. Sure there is a lot of schwag, but that’s why distros were created.

From what you claim you have tried, I would guess that you somehow expect Linux to work like Windows. You would probably not be able to figure out a Mac with that approach.

If people call Microsoft ‘evil’, it isn’t because Microsoft couldn’t code its way out of a paper bag yet can somehow sell that garbage. It is because some clever folks have collectively written and shared something that is more flexible and stable than anything ever dreamed up on any MS campus, free soda and all. Then, when this free and uncontrollable thing starts to compete, Microsoft does every dirty underhanded thing possible to kill it. From bogus patent claims to paying trolls to spam blogs like this using names like “Fuck Linux” and making completely ignorant claims like “obscure” and “Beta software by design”.

I work at a financial institution, and No Way In Hell would I ever trust any server produced by Microsoft to sit on our DMZ. I don’t want my members’ data to become pwned by some teenage hacker who downloaded Metasploit.

@ Rev. Spaminator:

I am more than willing to take money from Microsoft if they are willing to give it to me, however that is unlikely to happen as I have no association with them whatsoever apart from being a user of Windows XP and their Visual Studio express tools and in paticular C#.
It is the best IDE I have ever used, it has not crashed once on me, and when I installed the XNA framework for games development it just worked first time without any more configuration on my part. But contrast that with the following experiences I had with KDevelop on Linux – earlier this year I installed PCLInuxOS 2007 final and among other things I intended to have a go at one of them was coding in C++ & SDL (Simple Direct Media Layer) as it looked like a promising framework for games development. So via synaptic I install KDevelop and the relevant dependecies, but when I go to use make I get an error message telling me it’s not installed even though I know it is. Anyway I fire up synaptic again and tick the relevant boxes to grab the files, but still no luck, I get the same cryptic error message telling me it’s not installed, blah, blah, blah. What the hell is so fundamentaly different about KDEvelop as opposed to C# express edition that it cannot just configure whatever it needs to work properly without all this carry on?

Read on for another tale of Linux woe, this time it’s Graphics cards.

The PCLinuxOS machine I mentioned above has an nVidia FX 5600 graphics card sporting 256 MB of RAM, and via Synaptic I installed the relevant drivers for it. Upon reboot I set the resolution to something around 1600 x 900 to take advantage of my 22″ widescreen monitor. But no matter what I do, the resolution drops back down to 1024 x 768 at each reboot – what the hell is going on? I know the card is capable; I try, and I try, but no joy. I’d salvaged a PC from work at this time, and just to see how things went I installed PCLinuxOS 2007 Final on it. The machine had a GeForce 2 MX with 32 MB of RAM onboard. I boot into the desktop for the first time and, to my utter annoyance, see that the display resolution is so high the card can barely handle it!!!!! Do you really expect people to adopt Linux when nonsense like this is left unresolved? I’ve not had anything like that happen under XP Pro. I’d never advise anyone to install Linux – no way, it’s not worth the hassle to get things to work the way they should. What’s the point of free software if it’s such a pain to use?

You can create statistics for anything.

This is a fundamentally flawed extrapolation. I would argue a few contrary points: 1) that the total number of lines would be ~20% smaller; 2) that all the GNU components should not contribute to the dollar amount, because they have no cost.

Writing a 1-for-1 GNU/Ubuntu/Linux operating system clone from scratch, at a level of efficiency similar to the existing model, may require that many paper hours, but the raw dollars would change. You are targeting a specific goal, rather than accepting the natural result, which would require more intensive effort all the way around.

@Fuck Linux

If you need help with video drivers, ask. Every OS can be funky about hardware and drivers. You just need to approach the problem from a different angle.

You prefer VS, and for C# and .NET that is a good option. I do most of my coding without an IDE, so I prefer a solid editor with good code highlighting. If you are looking for a good IDE for Linux, again, just ask. Name calling won’t get you much help from anyone.

You missed the point about the trolling comment. This appears to be a Linux oriented Blog, yet you insist on posting comments that trash Linux.

While I agree that there is a ton of great free software out there, such as Mozilla Firefox, Filezilla, and the Apache web server, I am not much of a Linux fan. Yes, I have tried it before, and I agree that the Linux command line pretty much beats everything else. However, every time I install Linux, bad things begin to happen. I have had, in the course of three years, two monitors and one sound card destroyed by Linux errors. Not to mention a lot of hardware that could not be installed, or that was a real pain to install (taking entire weeks, even…).

So, for the time being, I will stick with Windows, and yes, the plethora of good free software available for it such as Mozilla, Filezilla and Apache. Linux can wait a bit longer, though. Maybe I will install it again when I don’t mind taking nine hours just to print out a document. Or when it greatly improves to the point of being super friendly, which will probably happen, eventually.

Well, F**k Linux, I am a regular desktop user on Linux, quite happy with my Ubuntu. I have a P3 600 and an old Rage Pro card, and I have transparent windows – can M$ do that? I agree there are too many half-finished projects out there, and *nix could use a little more direction. But that is part of the appeal of Linux: freedom of choice, and transparency in what your machine is doing. Can the same be said for Windows? Not to mention all the other ignored bugs in Windows mentioned above.

XP sucks. My laptop died with SP3. I’m on Ubuntu now. It still fucking sucks. Linux will never become big, because the average user will never be able to wade through dozens of wikis and forum posts because your shoddily-coded crap open-source program doesn’t work. At least with closed source you have developers who have an incentive to keep on working if the customers bail. I don’t get how nice the GUI is, and how secure it is, and how anti-monopolistic Linux is. The average user will not fucking type out commands to install programs.

Get with the times, you overrated open-source devs. Stop with your elitist nonsense and start fixing your damn bugs.

Leave a Reply

Your email address will not be published. Required fields are marked *