Saturday, February 28, 2009

Driving me insane

You know what I see a lot lately? Some luser posting some comment somewhere saying something like:

Look how far Linux has come... it used to be that we had no drivers and you had to really choose your hardware. Nowadays, most of the drivers are there out of the box. This is better than windows!

Another fantastic example of lusers in la-la land.

In case y’all haven’t noticed, the value that a real desktop OS provides is not just in the drivers. Actually, I’d go so far as to say it’s mostly not in the drivers. Just take a look at the Mac. People are willing to pay oodles of money for that stuff, and it has the fewest drivers of any major platform.

Drivers are only just the beginning. And actually, sometimes they’re the easiest part. There’s plenty of room for standard Linux fuck-up at higher layers. Audio, for example. Mostly working alsa drivers you have (and besides, mostly everything is hda-intel these days), but a userlevel piece to manage sound? PulseAudio? Yay!

But for some reason y’all like to focus on the drivers. You know why lusers do that? Because it just happens to be the problem that people notice first. You install Linux on your machine, your hardware doesn’t fucking show up. That’s immediate fail. Maybe some day you’ll get to a place where your hardware does show up. But does that then instantly make Linux as good as Windows or OSX? Please.

I’m actually excited to see this train-wreck happen. Once y’all have drivers, the fight will move to the next layer up. And like I said, it’s a lot harder at that layer. At least hardware doesn’t change, and most of the time, drivers just expose hardware functions. But providing sane, stable API’s, utilities, configuration GUI’s, and access to those functions to 3rd party apps with high levels of integration? Well, if X and PulseAudio are any indication, lusers will be at this for a loooong time to come.

802 flames:

Anonymous said...

The driver thing is a matter of market share. So it'll be hard for freetards to fix the thing.

Face it, which company (unless it's run by freetards) will care about the <1% market share of Linux on the desktop?

Ged said...

Sounds like you've been reading the comments on "Windows Server 2008 One Year On: Hit Or Miss?" on Slashdot.

I had exactly the same thoughts when I read that comment too.

Ged said...

for the record, this is what I was talking about

http://tech.slashdot.org/comments.pl?sid=1144517&cid=27023749

Anonymous said...

Companies run by non-freetards care also about the problems with Linux:

1. Hidden costs in support
2. Fragmentation
3. Fanatical element among users
4. Patents
5. Intellectual property

Those companies saw what happened to the Amiga and other contenders on the desktop. Will those companies take the risk? Probably not.

Then, there's the financial slowdown risk.

What happens when the economy is fucked up?

Open source companies are easily assimilated by others. A nice example of this is MySQL.

And the economy is only getting worse.

So open source companies, watch out!

Alvarez said...

Man you have lost your mojo, and you know it...
This post is stupid, boring and false; your shit used to be funny and true.

I'll do like you and just say FUCK YOU, with no apparent reason or arguments.

Happy weekend.

Anonymous said...

Slashdot?

Not even Pamela Jones, public relations queen, takes that shit seriously!

"I was forever reading /. [Slashdot]
comments about legal news and most of the comments would be way off..."

Anonymous said...

Besides the server is a non-starter.

So the driver thing doesn't apply here.

Anonymous said...

Freetards (notice lately, usually angrily writing through broken English) will come on here and talk about how it's not their fault, but it's the fault of the corporations.

Anonymous said...

Anything to say on OpenSolaris? Maybe FreeBSD?

Yeah, yeah, off-topic for the Linux Hater; but maybe you'll come up with something ;)

Anonymous said...

The problems with PulseAudio in Ubuntu are mostly fixed in Ubuntu 8.10

Anonymous said...

But providing sane, stable API’s,

When has the GTK+/GObject API changed recently?

utilities, configuration GUI’s

What utilities are lacking?

and access to those functions to 3rd party apps with high levels of integration?

I don't know, you mean like an API?

This post has some potential. Come on LH please go into detail. Thanks.

Anonymous said...

This is so true. When I first tried out Linux, one piece of my hardware didn't work, and I had to look for a driver online after the update feature failed to fix it. Yup, keep up the wonderful hardware support, Linux. You should be proud.

Alvarez: You claim that his post is false and stupid. Can you prove it or not?

Anonymous said...

Usually the burden of proof is on the person making the claims.

Anonymous said...

I'm looking for his disproof, but he didn't post it, so I want him to post the explanation of his claims.

Exactly! said...

>> But providing sane, stable API’s,
>When has the GTK+/GObject API changed recently?

Linux Hater is right. Even Gnome developers agree on that: http://www.gnome.org/~federico/news-2009-01.html#29


P.S. A topic for another post: how come GTK+ (paired with e.g. PyGTK or something) calls itself multiplatform, when it looks like... shit on Windows?
http://www.wpclipart.com/images/wpclipper_screen_Vista.jpg
Look at those icons! And you must have admin rights to try that, even using the old and unmaintained "all-in-one" installer!
http://aruiz.typepad.com/siliconisland/2006/12/allinone_win32_.html

Felix said...

Haha, good point.

As for me, I still wonder why MPC HC in XP can play any video file, while MPlayer fails, decodes with errors, or crashes in Ubuntu on every 10th file.

The funniest part of this is that the freetard community always blames the encoders who don't follow standards. Still, the "unstandardized" files have no problems in Windows.

I guess this is some kind of linux magic.

Anonymous said...


Public relations queen is losing her sanity


Poor PJ...

Anonymous said...

What are you guys complaining about, haven't you figured it out yet?

With LinuxOS, it's always implement 5% first (in this case drivers); the other 95% (the OS itself) is considered not too important, and the project is finished!

Besides, the only things Linux fans do anyway is surf and use the command line!

Anonymous said...

@February 28, 2009 3:17 PM

Yeah that looks like a standard Windows app to me.

Anonymous said...

Pluralizing without apostrophes - also known as WRITING IN ENGLISH - seems to not drive this guy insane at all. Maybe he's used to this pathetic pidgin, which all but ensures he's American.

Anonymous said...

I wonder what PJ looks like... if she is worth the lusers' basement sperm

not that they are too selective though

Anonymous said...

What about the fact that Intel has an open source driver department? This fact has driven Windows bat-shit crazy. If Linux has driver support for more and more hardware, provided by the hardware company, then yes, the fight moves up to applications, GUI, and games.

With more free applications that can keep up with MS crap (OpenOffice for Word, Eclipse for MS Visual Studio, and PHP for ASP), GUIs that are easily better than Windows (simply because Linux does not crash often, or take up more resources, or shit bricks when trying to extract files), and with more games coming out on Linux (Savage 2) and better support for older games via DOSBox, Crossover, and Cedega, Windows is losing more ground every day.

The biggest thing is that the majority of users just want to surf the web, check email, listen to music, and watch movies. Linux, with the addition of several small apps, can do this for nothing, compared to $300 for Windows 7. Also, the majority of the Windows market is using XP. The upgrade path for XP is Vista, then Windows 7. Who the fuck wants to pay for an OS that you aren't even going to use (aka Vista)? No one. Linux has free upgrades, free community support, and free apps. Free as in FREEDOM.

The Blue Fox said...

Ha ha ha, this driver thing is so true. When I first switched to Linux, almost all of my hardware stopped working: my wireless card, my video card, my TV tuner, heck, even my external hard drive. It took me about half a year to get all that crap working on Linux.


To this day I still use Linux as my main OS, but that's only because I like tinkering with it. I've reached the point where I have no problems with it, but every sane person would have gone back to Windows right away.

Yeah, so Linux is very far from getting the average user interested. Even I still have a Windows partition, because frankly Linux will sometimes break after an update and I won't have the energy to fix it.

Anonymous said...

I found that with Ubuntu at least, if it works, then it just works out of the box or with the driver manager, but if it doesn't then you are pretty much fucked. Because Ubuntu comes with almost all the drivers available for Linux.

On my laptop (which came with Vista), 8.04 wouldn't work for shit (no wifi, bad video quality, no suspend/resume, etc.). Ubuntu 8.10 works very well however, everything works except the webcam. Actually I've been using Windows for a year on the thing and I didn't even realize (or didn't care) that I had a built-in webcam until I tried to use "Cheese".

Anyway, I am pretty happy with Ubuntu's hardware dedication overall. Obviously it can be better, but it's still pretty amazing, considering I didn't pay a dime for this OS.

Anonymous said...

The level of "supported" drivers is disingenuous. More and more the "driver" is really just a conduit for a software engine, meaning the peripheral hardware does less and the computer does more.

This practice was vilified in the 90s for mostly good reason: weak hardware, inefficient OSes, and lack of interrupt standards led to measly modems sucking up 25% of machine resources. Ten years later, it doesn't make much sense to load peripherals with hardware as that makes it more expensive for the consumer. Besides, whether he's scanning, printing, or transferring pictures from a camera, he's not using the computer anyway; he's waiting for the task to finish so he can move on.

Linux developers consider the "driver" to be the part that sends electrical codes to the device and receives a response. That next layer up, the software part that does the real work, is summarily glossed over because it's hard, like a previous anonymous commenter stated. Therefore, in Linux, your hardware may be "supported", but it'll be at the minimum possible level needed to make that claim. Feature parity is about ten years off.

Anonymous said...

By the way, in order to get my Kodak camera working in Ubuntu 8.10 I had to follow this thread:

http://ubuntuforums.org/showthread.php?t=346840

Anonymous said...

@February 28, 2009 8:24 PM

Not that I disagree with you, but care you provide some examples of this?

Anonymous said...

If you mean examples of "electrical signaling" drivers, take a look at the last five years of SANE. Pretty much anything that gets a rating above "it works, kinda" was either released in the 90s or is one of the few remaining hardware based models that costs more than some computers.

Video drivers are notorious for this as it is readily understood that video cards have become increasingly general purpose hardware with the magic happening in software.

Finally, take USB anything. Yeah, anything governed by HID or mass storage works (maybe), but chances are the utility buttons, volume knobs, etc. are useless. Speaking of which, does that front mounting thing by Creative Labs work properly?

If you mean examples of the developers' mentality, you'll have to wait. I attempted to find an article I read some time back, probably on Slashdot, without success. Basically, the article said that the Linux driver team was out of work because everything had been accomplished. Therein they explained their limited view of what it meant for hardware to be "supported" in Linux or what was supportable. From what I recall, this article drew the ire of even Linux loyalists, who know better than to claim something is done just because the most primitive interface exists.

Anonymous said...

Why do fucking freetards always bring up OpenOffice as a cost-saving measure or a reason to switch to Lunex? It fucking comes on Windows too, retards, and if you're a user who's less knowledgeable, the OpenOffice Windows version by Novell is sponsored by Microsoft. So just shut the fuck up, finally come out of your mom's basement, and buy a Mac if you hate MS. Another thing freetards always bring up on a dual-boot machine in particular is viruses: "I had ClamWin (shitty free open source wannabe antivirus) installed but the malware showed up anyway". Hey dick, buy Norton already; it's like 50 bucks for a year and it actually works.

Anonymous said...

Norton acts like a virus. It takes up resources (ass-loads), it keeps you from doing what you want to do, and it's a way for corporations to track what you are doing. Yaaay, you fail at life.

Anonymous said...

Oh no, corporations are monitoring users on their network. Get back to work, fucktard.

Anonymous said...

I don't know much about scanners, but I am not sure how a scanner which does most of its work in the driver is even possible.

In winmodems, the "modem" is really just a shitty sound card with a phone-jack output instead of a regular line out. All the signal processing is done in the driver, and this is very CPU-intensive in general. But since the modem requires no signal-processing ASIC, it is much cheaper to make.

In winprinters, GDI is used to rasterize the print job into a bitmap. This really has not caught on much, however: since it results in bad print quality and very slow performance for a minimal reduction in cost, you only find it in the cheapest of printers.

autumnlover said...

Lately I checked the latest alpha of Ubuntu 9.04. Floppy support, fucked up in 8.10, is still not present OOTB in 9.04, while it works fine in Debian Lenny. That's the way "Ubuntu just works"! ROTFL!

Anonymous said...

@Felix

"As for me, I still wonder why MPC HC in XP can play any video file, while MPlayer fails, decodes with errors, or crashes in Ubuntu on every 10th file.

The funniest part of this is that the freetard community always blames the encoders who don't follow standards."

No, the funniest part is that MPC HC is also written by the freetards you're trying to bash.

Anonymous said...

Freetards don't want to get to that point. They're too happy now blaming all their shortcomings on hardware manufacturers.

That's the lintard's lemma: blame always lies somewhere else. They don't want that taken away from them.

Anonymous said...

"The biggest thing is that the majority of users just want to surf the web, check email, listen to music, and watch movies"

Yes, but the web looks better (fonts again) and works perfectly fine under Windows, so why would I want to downgrade to Linux to do the same thing, and likely even less? You do know that watching a DVD under Linux is a screen-tearing freakathon, right? You do know that even installing the msttf fonts does not fix the issue with font hinting, which is done badly under Linux.

And what if I want to do more, like create a DVD and a DVD menu system at the same time? Where are your Sony Vegas Video equivalents? And sorry, but Linux and gaming is an oxymoron. You can't begin to compare the quality of Windows games vs. Linux games. And if you did, hell, the same game is also available under Windows too, plus we get the benefit of multiple antialiasing/anisotropic filtering modes thanks to mature 3D drivers.

Anonymous said...

The driver problem will not be solved (at least for graphics) as long as there is a stable_api_nonsense.txt (or whatever it was called) in the kernel docs, written by the all-time luser Greg Kroah-Hartman.

When it comes to audio Linux died for me when Pulseaudio was out.

I still use Linux, because I don't really have an alternative on the server side..

Anonymous said...

I honestly can't believe that this site exists. If you spent as much time writing your own software instead of poo-poo-ing other people's work, you might have a decent system, but no, you'll sit at home on your virusfest labelling people 'freetard' and 'luser'. Well, fuck you, I write FREE software, you write shit. As for 'drivers': everything I ever encounter works perfectly.

P.S. get a real hobby

Anonymous said...

Technology has helped Linux in the last years, but unfortunately Linux didn't take advantage of that.

Remember the old times when there were numerous incompatibilities with the numerous sound cards on the market?

Remember the old times when X could never detect CRT resolutions and refresh rates? You had to mess around with the stupid xorg.conf just to make the monitor work in a sane way.

Now 99% of people have an hda-intel sound card and LCD monitors that all work just fine at 60Hz and have a native resolution (now even if X again detects it wrong, the monitor will still work).


While sound had started to work in Linux (every app finally had an alsa output), PulseAudio came along and messed everything up again because it doesn't fully support alsa input.

While an X desktop output worked in the old days, "pre-alpha-experimental" Compiz appeared and, for who knows what reason, became the default in all the popular desktop distros. What the hell? Compiz breaks OpenGL applications + video overlay. Since Xorg is not ready for compositing, why did they include it as a stable feature that every freetard will enable, and then ask the Ubuntu forums for help?


In the old days we had KDE3 and we could get some work done. Now we have KISSSSSSS Gnome as the only usable DE and KDE4 doesn't make sense, whatever your needs.

Linux is great, until you start X.

Anonymous said...

I still don't understand PulseAudio and why, for example, the asshole moderators over at UbuntuForums were pimping that broken piece of shit when it first came out. PulseAudio turned all the half-assed Loonix apps into quarter-assed Loonix apps.

Anonymous said...

Anyone else notice how long Slashdot takes to load when you hit the front page? Linux experts indeed.

Anonymous said...

Freetards don't want to get to that point. They're too happy now blaming all their shortcomings on hardware manufacturers.

Yeah, pretty much. I was amused when Intel and ATI opened pretty much everything hardware wise and the freetards' reaction was pretty much, "Huh? Why doesn't it do 3D like DirectX now?" Turns out the hardware doesn't do so much after all.

Now, even after getting everything they asked for, freetards are back to blaming hardware manufacturers for not providing the 3D interface software that's optimized for DirectX anyway.

What happens if vendors actually release their software? Well, they're still evil for writing the whole thing for Windows/DirectX instead of dumping money into fixing X11, which would also be seen as an evil move since it's a corporation trying to "monopolize" a core Linux component.

You can't please these people, and they don't spend money anyway. Why any company even attempts to extend an olive branch is beyond me.

Anonymous said...

I honestly can't believe that this site exists. If you spent as much time writing your own software instead of poo-poo-ing other people's work, you might have a decent system, but no, you'll sit at home on your virusfest labelling people 'freetard' and 'luser'. Well, fuck you, I write FREE software, you write shit. As for 'drivers': everything I ever encounter works perfectly.

That's exactly what I think of lintards. If they spent half the time on fixing their hopelessly broken system instead of bad-mouthing Microsoft, they wouldn't have to. We'd all be using it, willingly.

Anonymous said...

@March 1, 2009 9:34 AM

Point us to your resume of accomplishments. Thank you.

Anonymous said...

Linux users only use Linux for three things: posting complaints about "evil M$" on blogs, telling everyone how much smarter they are because they are running Linux*, and endlessly customizing their desktop. True or no?

* This is a lot like the guy you know who doesn't watch TV, and goes out of his way to let everyone know he doesn't watch TV. Linux users always have to point out that they are running Linux.

Anonymous said...

It often goes beyond "I'm running Linux". Statements on the order of "I know what I'm talking about because I run Ubuntu Faggot Ferret with custom compiled ATI drivers to go with custom compiled Compiz and I view all my web pages in vi because automatic HTML parsing is for n00bs" are common.

Anonymous said...

@ Anonymous FEBRUARY 28, 2009 1:31 PM

>> utilities, configuration GUI’s

> What utilities are lacking?

For one, I have yet to see a *working*, widely advertised and universally adopted (by all distributions) equivalent of Windows' Device Manager, that is, a standard system-wide tool that allows one to:
1) enumerate installed platform components (ideally with sorting and grouping by category, connection, IRQ, etc.) AND see their properties
2) for each component, view the currently associated driver
3) for a device which has been detected but cannot be used yet, select (and install) a driver of the user's choice
4) for a selected device already bound to a driver, *change* the driver in use
5) update the currently used driver to a more recent version
6) for 3)...5), use either an automated or a manual procedure for locating and selecting the driver, and select a driver not matching the device ID if the user really means to
7) configure driver-related options, writing to the appropriate configuration files and sysfs/configfs entries
8) have driver-related files created during driver setup, and reverted during driver uninstall
9) have the driver installed either as an external module or as a kernel builtin, depending on user choice
10) for in-kernel drivers, additionally manage the kernel recompile and setup when updating or replacing a driver
11) allow "moving" kernel builtin drivers to external modules and vice versa (configure and rebuild the kernel)
(this is Linux land, where the kernel can be built the way you want; if driver code is available AND kernel sources are installed, nothing would prevent doing that too in the same tool, and it would seem quite logical to me)

AFAIK:
hal-device-manager only does 1 and 2;
Ubuntu's restricted driver manager does 3 and 4 (but only for a limited number of ghetto-ized devices);
a project aiming to do at least 3, 4 and 5 was started a couple of years ago but is now bitrotting (or is it alive? has some distribution picked it up and I haven't noticed?);
driver installation is done with insmod (manually and knowledgeably), with no general-purpose tool assisting the user in the "device detected, load a driver" operation. In particular, there seems to be no tool to validate the driver the user is about to use, since a module must be loaded and initialized before it can check compatibility between the device and itself, with all the risk inherent in choosing an unsuitable driver.

No other OS apart from Windows seems to have thought to deal with the pain of installing and configuring devices AFTER the system has been set up. One of the most frequently performed administrative tasks for a desktop user is the most neglected one...

Anonymous said...

I honestly can't believe that this site exists. If you spent as much time writing your own software instead of poo-poo-ing other people's work, you might have a decent system, but no, you'll sit at home on your virusfest labelling people 'freetard' and 'luser'. Well, fuck you, I write FREE software, you write shit. As for 'drivers': everything I ever encounter works perfectly...

Open source software such as Drupal, Open-Realty, Joomla (although it's a bit bloated) are great Web applications.

The rest of the open source software sucks.

You guys should stick to the server and forget about your fucked up distros.

You'll never get good market share on the desktop.

Freetards will always be cursed with the Amiga syndrome.

yOSHi314 said...

"Face it, which company (unless it's run by freetards) will care about the <1% market share of Linux on the desktop?"

one that doesn't necessarily focus on the needs of desktop users. linux is widely used outside of desktops, and i think it'll stay this way for a while longer.

why do you think ati and nvidia are making linux drivers anyway?

they started making them when there was even less linux on the desktop. try to answer that question yourself.

hint : you do realize there are other markets beside desktop pc's, right?

when there is a need for certain software - companies will follow.

Anonymous said...

"you do realize there are other markets beside desktop pc's, right?"

Now the rationalization of failure begins. Loonix Lusers didn't want the desktop after all, right?

Anonymous said...

No other OS apart from Windows seems to have thought to deal with the pain of installing and configuring devices AFTER the system has been set up. One of the most frequently performed administrative tasks for a desktop user is the most neglected one...

The lintards will tell you that you're supposed to upgrade your whole distribution anyway to be able to configure and use a new device. That's their idea of great user experience.

Anonymous said...

The fact that you have to upgrade to get a new release of Pidgin or whatever other poorly implemented, feature-lacking clone of other FREE (as AIM and Yahoo Messenger are) software is indeed a deep design flaw. Imagine the criticism from Lintards if you had to upgrade from WinXP to Vista to go from Firefox 2 to Firefox 3.

Anonymous said...

why do you think ati and nvidia are making linux drivers anyway?

Because they wanted to move into the high end graphics workstation business that was falling apart due to SGI's stagnation.

Do you really believe they give a shit about consumer level linux?

Anonymous said...

You can always download the newest Debian package for Pidgin. In Ubuntu, if you have Ubuntu backports enabled it will do this automatically for you.

Anonymous said...

Well, ATI didn't get their act together until consumer-level Linux users started bitching at them. So I think they do care about consumer Linux; in fact, that's a much bigger market than Linux graphical workstations, even at 1%.

Anonymous said...

The fact that you have to upgrade to get a new release of Pidgin or whatever other poorly implemented,

What are you talking about? Who started this nonsense? Using Firefox 2 with a recent distribution? LH's criticism regarding this issue is wrong; such a package has been made for other distributions, and there never was a technical reason for it to be impossible anyway. Get a grip, people!

Anonymous said...

Using firefox 2 with a recent distribution?

The issue is that lintards only consider something to be recent if it has been released in the past six months. Anything older than that is answered with something like "sorry pal, upgrade your distro"

Anonymous said...

Will "downloading the newest Debian package" resolve all those dependencies, especially if the latest libgtkspell0 or some other bullshit is required so that my instant messages can be spell checked? Honestly, I don't know, I haven't tried since the last time upgrading a single package fucked a whole bunch of shit up.

Anonymous said...

there were some mistakes in my previous post, sorry... luckily it was understandable enough

The lintards will tell you that you're supposed to upgrade your whole distribution anyway to be able to configure and use a new device. That's their idea of great user experience.

Yeah, exactly... The fact that one has to upgrade the whole kernel (with all the associated core environment) to gain the one (just merged) driver needed for one previously unsupported device is the single thing I hate the most about Linux, together with the fact that the only option is to stick with a distribution and hope it will provide support for your release and your kernel (that is, relying on the individual distro maintaining its own kernel tree, actually a fork, and backporting drivers and fixes).

And it's a problem that could mostly be avoided if drivers were decoupled from the kernel proper (from the point of view of design and deployment) and if it were possible to install "any" driver on "any" kernel. This would still hold true even if we versioned the ABI and put restrictive requirements on the ABI revision's lifespan (say, one to three years, after which major breakage is to be expected), on the supported platform(s) (currently desktops are mostly, if not all, x86, with most if not all machines made in the last 4-5 years having a 64-bit compatible processor), and on the driver build environment (if the compiler is a variance point, let's say all drivers for driver ABI vX must be built with GCC vY; that would cut away most problems).

gkh and his purely political and utterly stupid stance that "a stable ABI will never happen" are some of the most disgraceful things that could happen to Linux on the desktop...
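To make the "version the ABI" idea concrete, here's a hypothetical sketch (nothing resembling the kernel's actual module interface, which deliberately has no such contract): each driver carries the ABI revision it was built against, and the loader rejects mismatches up front instead of letting a stale binary misbehave later.

```python
DRIVER_ABI_VERSION = 3  # hypothetical "current" ABI revision of the running kernel

class DriverOps(object):
    """What a driver hands the loader: its ABI revision plus entry points."""
    def __init__(self, abi_version, name, probe):
        self.abi_version = abi_version
        self.name = name
        self.probe = probe  # entry point; returns 0 on success

def load_driver(ops):
    """Refuse any driver built against a different ABI revision."""
    if ops.abi_version != DRIVER_ABI_VERSION:
        return -1  # stale or too-new binary: reject cleanly, ask for a rebuild
    return ops.probe()

current = DriverOps(DRIVER_ABI_VERSION, "example", lambda: 0)
stale = DriverOps(DRIVER_ABI_VERSION - 1, "stale", lambda: 0)
print(load_driver(current))  # 0: accepted
print(load_driver(stale))    # -1: rejected until rebuilt against ABI v3
```

With a check this cheap, "any driver on any kernel" within an ABI revision's lifespan becomes a deployment question rather than a recompile question, which is the commenter's point.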

Anonymous said...

Freetards will always be cursed with the Amiga syndrome.

Poor guys.

HA HA HA!

Anonymous said...

Yeah, exactly... The fact that one has to upgrade the whole kernel (with all the associated core environment) to gain the one (just merged) driver needed for one previously unsupported device is the single thing I hate the most about Linux,
What? I've used out-of-kernel-tree drivers before; they're supplied by the distribution developers.

together with the fact that the only option is to stick with a distribution and hope it will provide support for your release and your kernel (that is, relying on the individual distro maintaining its own kernel tree, actually a fork, and backporting drivers and fixes)
Distributions no longer fork the kernel; they ship it with a small number of patches for drivers and bugs. You can always change the distribution kernel without much trouble; I've been doing this for years. What's the big deal?

gkh and his purely political and utterly stupid stance that "a stable ABI will never happen"
AtomBIOS and the fact that most hardware has open-source drivers prove you wrong.

Anonymous said...

I didn't even know the kernel had an (non-syscall) ABI. I thought it was a monolithic kernel. Does anyone here know what the fuck they are talking about or are they bitching about things they don't understand?

Anonymous said...

http://www.osnews.com/story/21062/Acer_Aspire_One_Linux_and_Windows

Juicy stuff.

Anonymous said...

DoesntWorkForMe(tm) is about as valid an argument as WorksForMe(tm).

Anonymous said...

I wonder if OSNews will publish my "story" of Linux just working on the HP laptop I have.

Anonymous said...

I had to double-check the date on that OSNews article; it sounds like similar articles I've read on OSNews years ago, including the SAME type of whining pro-Linux comments.

Linux fails, and then commenters say things like "well, it worked for me after the following ten steps", or "it's not our fault the hardware isn't supported", or "Windows isn't perfect either" (what does that have to do with Linux being pure shit?), etc.

Same ol' Lintard stuff, yet this is from TODAY and on a low-powered netbook; netbooks were touted as the arena that Linux would be perfect for.

Anonymous said...

Gem of a comment from that OSNews thread: "While I agree with everything this blog post said, I still think Linux is very cool because you can get the source code for everything"

Yeah, as if the people who whine the loudest about source code have ever written a single line of code or even modified someone else's.

Linux Lusers, when's the last time you modified the source code of an app and then recompiled it to do something you specifically wanted, or to add a feature to it? Yeah, I thought so.

Anonymous said...

I like this comment:

I'd like to see a re-review on getting Windows XP working; only, instead of using OEM Windows CDs, which have been made specifically for the netbook with all proper drivers, use the official Microsoft retail CD and fetch any needed drivers from either Acer's or a specific hardware vendor's site on another computer

Yeah, let's make Linux look better by intentionally handicapping the Windows install process with a method that almost nobody uses. Hey, why stop there? Why don't we use the 2001 RTM version? You know what, I'll bet it still goes smoother than whatever "state-of-the-art" Linux distro.

Besides, by now, most people still trying to install non-OEM Windows have enough sense to slipstream whatever required Service Packs and drivers. There are even easy-to-use tools that effortlessly pack the image with enough drivers to allow one CD to install on just about any PC manufactured in the past decade. Modem drivers are commonly missing, and sound is hit or miss when installing XP onto Vista computers that aren't supposed to support it. But, hell, that's about three galaxies more advanced than whatever Linux provides.

wbk said...

Actually I've been using Windows for a year on the thing and I didn't even realize (or didn't care) that I had a built-in webcam until I tried to use "Cheese".

LOL typical luser response. YouDon'tReallyNeedThatFeature(TM)

Anonymous said...

More like "I don't really need that feature".

Anonymous said...

"You can always download the newest Debian package for Pidgin.. have backports enabled"

The newest deb may not resolve dependencies, or backports may not have the latest version. And heaven help you if it's new software with audio hooks like vlcplayer or flash. And after you install it, it messes up your alsa/pulseaudio configuration and you're really shit fucked.

It's 2009; it should be as easy as uninstalling the old version and installing the new thru double click. But the problem with Linux is that applications and OS components do not have clear separation or protection from each other. In Windows, all you do to get codec support is install the K-Lite codec pack and you're done. You don't have to worry about a codec possibly fucking up anything else to prevent you from booting or whatever. In Linux you can do the same, but often disastrous things can happen because the restricted extras may not be as up to date as the distro you are putting them on (more interdependencies).

Look at some of the latest distros out now, why keep putting old versions of software when the latest versions are already out?

After months of real-world trial with Ubuntu and various other Linuxes, it was just hair-pulling dependency-nightmare hell.
Even the quality of the Linux freeware was not up to par with its Windows comparables.

Again, if your needs are simple, Linux is fine. But then so is Solaris, FreeBSD, ReactOS, Amiga.
An OS is only useful if its media and applications are ones that truly make people productive and happy.
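For what it's worth, the backports workflow being argued about here looks like this. A minimal sketch, assuming a 2009-era Debian "lenny" box; the repository host, suite name, and package are illustrative:

```shell
# sources.list fragment enabling backports (2009-era names):
#   deb http://www.backports.org/debian lenny-backports main
#
# apt keeps preferring the stable build unless the backports suite is
# requested explicitly with -t, so the install step is:
suite="lenny-backports"
pkg="pidgin"
cmd="apt-get -t $suite install $pkg"
echo "$cmd"   # run this (as root, after apt-get update)
```

If the backported package drags in newer libraries than backports carries, you hit exactly the dependency dead-end described above; `-t` only helps when the whole dependency chain was backported.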

Anonymous said...

Those of you linux users who do not need that nice vista/xp license on your laptop, please do send it to me, I will make sure it's put to good use! :)

Anonymous said...

You can't legally transfer OEM licenses of Windows.

Anonymous said...

GDebi can handle dependencies for *.deb files. If the dependency cannot be met automatically, it's very unlikely getdeb would offer a single .deb for the package.
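A sketch of the difference at issue (the .deb filename is hypothetical): plain dpkg installs exactly one file and fails on missing dependencies, while the gdebi command-line tool asks apt to resolve them first.

```shell
deb="pidgin_2.5.5-1_i386.deb"   # hypothetical downloaded package

# Plain dpkg: errors out if dependencies are absent...
#   dpkg -i "$deb"
#   apt-get -f install          # ...then the classic fix-up pulls them in
#
# GDebi: reads the Depends: field and fetches what's needed from the
# configured repositories in one step:
#   gdebi "$deb"
echo "gdebi $deb"
```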

Anonymous said...

I miss oiaohm's freetardism and loooong posts...

... I only read these when im taking a dump

Anonymous said...

I would rather be somebody's bitch in prison than go another day running LinuxOS.

Anonymous said...

I only read these when im taking a dump

To ensure you have enough paper to wipe yourself?

Anonymous said...

I'd like to see the actual professional who needs, in no particular order, these programs to work, and I mean especially in the case of the architect/CAD programs, to display true 3D models and viewpoints for fly-by movies and client presentations:
AutoCAD (all versions, especially Revit Architecture),
Chief Architect, and 20/20 Design.
Guess what: Linux can't. Ubuntu definitely can't, even under Wine or any other emulation software. The same goddamn system would lock up so fast it became a game for me to guess how long each running instance of the program would last. Then, when I finally got it stable, the system locked out with a "memory parity issue" blue screen of death, in Linux, yes it happened. Basically I just said fuck this and installed Win 7 Beta; after the first auto update the drivers were automatically applied and there were no more issues.

Anonymous said...

I wonder if Stallman thinks the software for guided missiles should be open source.

Anonymous said...

@March 1, 2009 7:29 PM

You have got some strange fucking priorities man.

Anonymous said...

To be in compliance with the GPL all you have to do is ship the source code with the cruise missile.

Anonymous said...

You have talked without saying anything useful, just troll.

That's what you do best, you're a fucking troll, go get a fucking life, or kill yourself.

Stupid idiot.

Anonymous said...

Drivers in linux are amazing...
I have an X-Fi XtremeGamer (xtreme gamer? under linux? lol)

Now, what is the first thing lusers yell when it comes to supporting hardware? "Just give us the specs and we will write the drivers ourselves."
ALSA received specs, but not much happened (someone was able to produce a semi-working driver, but it never got into the main tree).

Creative released a binary-only driver which worked at the time. With the next Ubuntu release it would not work, since someone just had to poke around in the kernel and change something.

Some time later, Creative finally released the driver under the GPL. What happens? Development of the alternate driver in ALSA stopped, nothing more.
Creative's driver does not get into the ALSA tree either, since it is not "good enough". Nobody wants to improve it, though.

So here we have what every luser could ever ask for: specs released and a GPL reference implementation. What happens? Nothing.

Yes, it is possible now to compile the Creative driver yourself to get the soundcard working, but that is thanks to Creative, not the lusers.

http://mailman.alsa-project.org/pipermail/alsa-devel/2008-November/012380.html

Now this is why some companies dont want to start dealing with making linux drivers...

luser: AAh, release a driver for the 0.000001 percent of your customers who use linux.
manufacturer:ok
luser:aah, your driver sucks, release specs and we will write drivers ourselves.
manufacturer:ok
luser:aah, writing drivers from scratch is not funny, we want gpl drivers.
manufacturer:ok
luser:aah, your gpl drivers are buggy. Since we cannot improve them ourselves, we want to cooperate with you so you write the driver and the open source community can take the credit for it.
manufacturer:no, we dont want lusers yelling at our developers, they have work to do.
luser:aah, we wont ever buy your products again.
manufacturer:screw you and your hobby os

Anonymous said...

But the problem with Linux is that applications and OS components do not have clear separation or protection from each other.

LOL. Dependency hell occurs when you're trying to do something stupid. If there were no warning, you'd have a broken system, so I'd say that OS component dependencies are well defined.

yOSHi314 said...

"Creative released a binary-only driver which worked at the time. With the next Ubuntu release it would not work, since someone just had to poke around in the kernel and change something."

a driver never has priority over the kernel. if something in the kernel has to change and break the drivers - the drivers have to be fixed up. it's always like this - the driver maintainers rebase their drivers when some change in the kernel breaks something.

it's creative's fault for not following up with a new driver, not the other way around. besides, that driver was crap anyway.

Anonymous said...

yoshi and oiaohm,

I think you should avoid using even a nickname, or you'll be lynched by the mob here for no reason. Being anonymous is an advantage in cases like this, people (usually) focus more on what you say.

Anonymous said...

a driver never has priority over the kernel. if something in the kernel has to change and break the drivers - the drivers have to be fixed up.

Sure. That way, kernel developers can follow the spirit of "release early, release broken, and we'll fix it when someone complains loud enough" to the letter. That's why they'll never provide stable interfaces: they're too much work, and they'd have to support their mistakes for an extended period of time... meaning that they'd have to work even harder to get it right the first time. That's not their style, of course, and they don't give a fuck about desktop Linux anyway, so why should they?

Just keep using Windows or buy a Mac. Problem solved.
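The mechanics behind the breakage both sides are describing: an out-of-tree module is compiled against the headers of one exact kernel release, so every kernel update invalidates the binary. A sketch of the standard kbuild invocation (paths follow the usual distro layout; the module source directory is assumed to be the current one):

```shell
# Each installed kernel carries its own header/build tree:
kdir="/lib/modules/$(uname -r)/build"
echo "$kdir"

# Typical out-of-tree module build, run from the module's source dir:
#   make -C "$kdir" M=$(pwd) modules
#
# After a kernel upgrade, $(uname -r) changes, the old .ko no longer
# loads, and the build must be repeated -- the step DKMS automates and
# the step a stable kernel ABI would make unnecessary.
```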

oiaohm said...

I wish people would just forget about binary kernel drivers. Ever wonder why you cannot install particular programs alongside each other in Windows without killing the system?

Some copy-protection drivers conflict with some anti-virus software.

For all the gain Windows gets from closed-source drivers, it also gains a lot of pain.

OK, open source is not great at everything.

Also you are behind the game. These days incomplete drivers are allowed in the main kernel tree, avoiding the broken-driver issue.

Also you are idiots. The people you call lusers are not the ones that drive the need for open-source drivers. Hardware makers release open-source drivers for the simple reason that some markets demand the right to audit every section of code they use. Even Microsoft has to obey those restrictions. Even worse, MS has had to grant some groups the right to alter their OS for their own internal use.

Now, if for one minute you think MS is pure closed source, you are also an idiot. The difference here is that governments and big companies get the right to see and alter MS source code. Home users miss out.

AMD, VIA and Intel are all fighting in the secure computing market. Nvidia is fighting in the gaming market.

Anonymous said...

Some copy-protection drivers conflict with some anti-virus software.

If that's the only thing you can come up with, I think things are looking pretty good on the Windows side.

For all the gain Windows gets from closed-source drivers, it also gains a lot of pain.

90% of the market says the gains outweigh the pains.

Now, if for one minute you think MS is pure closed source, you are also an idiot. The difference here is that governments and big companies get the right to see and alter MS source code. Home users miss out.

Because home users don't give a fuck, maybe?

Get your head out of your ass and wake the fuck up.

Anonymous said...

"LOL. Dependency hell occurs when you're trying to do something stupid. If there were no warning, you'd have a broken system, so I'd say that OS component dependencies are well defined."

Oh really? OK, then why is it that when you download the latest Ubuntu distro, add your latest Adobe flash firefox plugin / enable restricted extras, and then do the fix to watch dvd movies, 10 out of 10 times your flash player plugin most likely does not like your new dvd fix, because it requires a certain version of vlcplayer/alsa/pulseaudio? And so you must now go to sound control and change emulation from pulse to alsa, and pray it works again. So you can either watch dvd movies and have no flash sound, or have flash sound and no dvd movies. What kind of choice is that?

Anonymous said...

90% of the market says the gains outweigh the pains.

Copying the way Windows does things because of market share is stupid. People have been saying for decades that a stable kernel ABI like the one in Windows is the wrong solution to the problem. The market has adjusted to this (AtomBIOS, most kernel drivers are open-source), why do you insist you're right?

Anonymous said...

And so you must now go to sound control and change emulation from pulse to alsa, and pray it works again. So you can either watch dvd movies and have no flash sound, or have flash sound and no dvd movies. What kind of choice is that?

You're essentially beta-testing Ubuntu there, so you hit a problem. You are right here: in this case the package dependencies aren't set correctly; this is the packager's fault. On my system pulseaudio is disabled (I have no need for it) and sound works fine. If you use the latest linux distributions, be prepared to do beta-testing and QA.
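For what it's worth, the usual fix for exactly this Flash-vs-DVD fight is not toggling "emulation" back and forth but pointing ALSA's default device at PulseAudio, so ALSA-only apps and pulse clients share the card. A minimal ~/.asoundrc sketch, assuming the alsa-plugins pulse module is installed:

```
# Route every ALSA client through the PulseAudio plugin
pcm.!default {
    type pulse
}
ctl.!default {
    type pulse
}
```

The mirror-image move, when PulseAudio itself is the problem, is to delete this fragment and stop the daemon so applications get direct ALSA access again.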

Anonymous said...

If you use any desktop linux distribution, be prepared to do beta-testing and QA.

There, FTFY.

Anonymous said...

Copying the way Windows does things because of market share is stupid.

If you don't care that people can actually use the system, I guess you're right.

Anonymous said...

I like the conversation in the X-Fi thread. In response to Creative releasing specs and GPLed driver:

"that would be a straight hit in the face of every Linux X-Fi user and the OpenSource community."

"Then show your appreciation to Creative by never ever buying any of their products again. That's what I'm going to do."

"Before this disaster I recommended Creative to everyone... but nope, no longer..."

No wonder Linux drivers suck. Even when companies try to do the "right" thing they still get crucified. And that's what you idiots don't get: companies don't want to spend money on a continuing basis to become a part of your "community", hang out on IRC and mailing lists, and babysit the driver for life because Linux can't decide how to play back audio after 20 years of development.

This is Linux's problem, not Creative's. I can't believe how blind you people are.

Anonymous said...

you should avoid using even a nickname, or you'll be lynched by the mob here for no reason.

blackbelt_jones? Is that you? Seriously, what do you people expect when coming to a place called "Linux Hater" and aggressively extolling the virtues of your broken OS, laced with insults and expletives? Cookies and sunshine? At least we're honest and not pretending to be impartial, unlike Slashdot and Digg.

Anonymous said...

If you attempt to use any old-hat functionality like sound, video, drivers, or keyboards in a Linux distribution, be prepared to do beta-testing and QA.

Modified your statement since it was under the GPL. Wait, it wasn't? Well, I don't care since all software should be GPL and you're an evil bigot if you think otherwise.

Anonymous said...

[Stable interfaces are] too much work, and they'd have to support their mistakes for an extended period of time... meaning that they'd have to work even harder to get it right the first time.

Furthermore, exactly what kind of devastating mistakes would they make? It's not like OS interfaces are new, uncharted territory. Most of this stuff was done in the 80s.

Yfrwlf said...

Patents can fuck off, especially software patents. Everyone should "violate" them as much as possible, until countries get rid of them. Just had to get that out of my system.

Anyway, yes, now that Linux has somewhat decent driver support (usually, of course, more support out of the box than Windows, but whatever), hopefully things will start to focus on the more important levels, as in, all of the rest, so that users have a *completed* desktop experience. Of course, this won't be *really* good until there are more Linux programs for more choice in that department (games!!!!), which is slowly happening, but just... slowly.

You said it right when you spoke about APIs and interoperability. KDE, fucking play nice with Gnome, and Gnome devs, fucking play nice with KDE. Interoperability is easy: wherever you have a differing opinion, you make it an *option* and let the *user* be the one to decide.

For example, let's say that Gnome devs want program configuration and log files and such placed in ~/.gnome, and that KDE devs want them placed in ~/.configure or in ~/. How about you LET THE FUCKING USER DECIDE. It's called you make a STANDARD, where you have an OPTION in a file that allows the definition of WHERE they are to be placed. Then, if a program wanted to read or configure or change one of those configuration files or whatnot, it could simply look in this standardized configuration/API/system file and bingo! It knows where to look and how to do things!

It's called interoperability and standards! FUCKING IMPLEMENT THEM.
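The standard being demanded here actually exists: the freedesktop.org XDG Base Directory specification puts per-user config under $XDG_CONFIG_HOME, falling back to ~/.config, and the user can point it wherever they like. A sketch of how a cooperating app resolves it ("myapp" is a hypothetical application name):

```shell
# Honor the user's override if set, else use the spec's default:
config_dir="${XDG_CONFIG_HOME:-$HOME/.config}/myapp"
echo "$config_dir"
```

Adoption is the hard part, of course: this only works if both desktops' apps actually read it.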

Yfrwlf said...

BTW thank you for this website. Hopefully it'll help complaints reach those who have the knowledge to change things for the better. ^^

Anonymous said...

When will the microkernel vs monolithic kernel debate ever end?

Anonymous said...

Once Duke Nukem Forever gets a Hurd port.

Anonymous said...

Speaking of Hurd, there's yet another stillborn FOSS project that was supposed to rock our world and supplant Windows 95. Turns out they did the microkernel thing so well that they shrank it out of existence.

Anonymous said...

Patents can fuck off, especially software patents. Everyone should "violate" them as much as possible...

Oh yeah?

Tell that to Pamela Jones of Groklaw and she'll tell you to fuck off!

Substance abuse management system and method


That's the patent registered by PJ's buddies from MedAbiliti.

More info

Have you wondered why Pamela Jones began Groklaw?

She began Groklaw as a public-relations apparatus to indirectly protect XM Network from FUD generated by SCO v. IBM.

Anonymous said...

why you complain about tv tuner not working. you dont need to watch tv on a computer LOL silly wintard buy a tv!!! and if you want play games buy a Wii or playstation, a computer is not for games as playstaion is!! a computer is a tool not for entertainment!!

i can do everything i want using linux and i dont have to worry about virus or spywares or bill gates controlling my computer because i have FREEDOM do you understand.

linux and freedom is the future, microsoft is collapsing in the economy and cannot survive much longer.

Freddy said...

@Exactly!:
P.S. A topic for another post: How come Gtk+ (paired with e.g. pygtk or something) calls itself multiplatform, when it looks like... shit on Windows?


It is multiplatform because it looks like the same shit on Linux.

And by the way, YES LINUX HAS A GUI and YES IT SUCKS ON THE USABILITY AND COHERENCE SIDE.

Want to copy from OSX? Well, copy the right things, not the fancy shit.

wbk said...

a computer is a tool not for entertainment!!
Wow. Just WoW.

Anonymous said...

"... because i have FREEDOM."

No point on discussing 'freedom' according to freetards.

By looking at Linux market share on the desktop, it seems computer users aren't buying into that 'freedom' bullshit.

HA HA HA!

Anonymous said...

you dont need to watch tv on a computer LOL silly wintard buy a tv!!! and if you want play games buy a Wii or playstation, a computer is not for games as playstaion is!! a computer is a tool not for entertainment!!

YouDontReallyNeedThatAnyway(TM)

Linux, freedom to do everything its coders want you to. For everything else, there's Windows.

Anonymous said...

The Linux commandments to Desktop Development

* Program a new minesweeper
* Have a wank
* Focus on gloss, not on debugging
* Focus on more gloss, not on documenting
* Copy, don't innovate
* Ignore feedback
* Steer clear from real apps
* Destroy anything which resembles a workflow

Anonymous said...

"linux and freedom is the future, microsoft is collapsing in the economy and cannot survive much longer." By ignorant freetard

Similar idiotic comments were being made by Amigoids in the late 80s, early 90s.

Long live the curse of the Amiga syndrome!

Anonymous said...

Oh God, it was even worse because the Amiga actually did things other systems didn't. Imagine the freetards' egos if only Linux had video editing capability but the interfacing still sucked. It'd be the Amigidiots all over again, plus millions more.

Freetards are closer to OS/2 wankers. It, too, could run Windows 3.1 apps perfectly, changed the interface in pointless ways between versions, and was difficult to develop for (and pointless, since it ran all the 16-bit Windows and DOS apps). However, it had no software and nothing to show except ATMs and a weirdly fanatical fanbase, especially considering they had all rooted for Microsoft over IBM five years prior.

Anonymous said...

"... and a weirdly fanatical fanbase..."

Mostly IBMers:

IBM Is Offering Workers Prizes to Hawk OS/2

Paul B. Carroll, Staff Reporter
The Wall Street Journal

March 27, 1992

International Business Machines Corp.'s sales force is already bigger than many armies, but as IBM prepares to start shipping its much-maligned OS/2 operating system, it has decided it needs reinforcements.

So IBM is about to launch a program that will attempt to turn all its 344,000 employees into salesmen for the personal-computer software, which is in a fight for its life against Microsoft Corp.'s Windows juggernaut.

IBM will offer employees incentives ranging from medals to IBM software or hardware to cash, depending on how much effort they put into OS/2. In return, says Lucy Baney, an IBM marketing executive, the company will ask employees to approach their neighbors, their dentists, their schoolboards. Armed with brochures and talking points, the IBMers will sing the praises of OS/2 as the solution to people's personal-computing needs.

IBM is pulling out most of the stops in advertising and pricing, too, as it prepares for one of the stiffest marketing battles the personal-computer industry has seen. IBM must not only overcome Microsoft's considerable momentum but must also face a Microsoft marketing effort that, while very different and more low-key, is just as intense in support of Microsoft's latest version of Windows. In fact, the situation here is the reverse of the one IBM normally sees; Microsoft is the entrenched power and IBM the struggling competitor attempting to dislodge it.

"There's a very serious commitment to energize the whole company behind the product," says Fernand Sarrat, the top OS/2 marketing executive.

Although he declines to be specific on IBM's advertising plans, he says that "there isn't an IBM U.S. ad campaign that will receive anywhere near the dollars that OS/2 does" this year. That easily puts OS/2 advertising spending into the tens of millions of dollars, not counting the money IBM will spend on extensive international ads.

Mr. Sarrat says the campaign will be informational rather than the sort of macho advertising that has been rumored in the trade press; one slogan that was reportedly considered was "Curtains for Windows." But Mr. Sarrat adds: "It's not that we'll be namby-pamby. That's for damn sure."

The campaign will really start rolling next month. IBM's new version of OS/2 will be available to corporate customers next week, meeting IBM's commitment to deliver it by the end of March, but it won't be widely available in retail outlets until the second half of April. So even though Microsoft has already begun its campaign, in advance of its April 6 introduction of Windows 3.1, Mr. Sarrat says IBM has decided to wait a bit.

On pricing, he says that people who have the latest version of OS/2 will get the new version free. Many users of Microsoft's Windows and DOS will also get huge discounts off the list price of $195, but Mr. Sarrat declines to be specific, lest he tip his hand to Microsoft. (Windows 3.1 will have a list price of $150.)

IBM's pricing plan means it will be taking losses on many of its initial OS/2 sales. Software securities analysts have estimated that IBM must pay Microsoft a royalty of about $20 on each copy of OS/2, because it contains Windows software. They have said it also costs $30 or more to produce each copy of OS/2. And those figures don't include any of IBM's marketing expenses, any of the corporate overhead that eats up more than 30% of IBM's revenue or any of the OS/2 development expenses that have totaled hundreds of millions of dollars.

"This is a long-term war," Mr. Sarrat says.

He predicts that IBM will sell millions of copies of OS/2 this year, even though it has sold something less than one million copies in the five years since OS/2's introduction. Mr. Sarrat even goes so far as to predict that within a few years OS/2 will be outselling Windows, which Rick Sherlund, a securities analyst at Goldman Sachs, predicts will sell 11 million to 12 million copies in the fiscal year ending June 30 and 17 million copies the following year.

"It won't happen this year or next year," Mr. Sarrat says, "but after next year it's fair game."

In contrast to IBM, Microsoft will spend most of its effort "making sure people have a good experience" with its new version of Windows, says Steve Ballmer, a Microsoft executive vice president.

Microsoft will spend plenty of money on advertising, including its first television campaign. Mr. Ballmer says a published estimate of a $31 million marketing effort "is probably low even as a U.S. number." Microsoft will also be aggressive on pricing, offering upgrades to the new version for $50 initially.

But Mr. Ballmer says most of Microsoft's effort will go into a huge program to train computer dealers, to offer workshops to heavy Windows users and to help people get information on how to use the product. Patty Stonesifer, a Microsoft vice president in charge of customer support, says that Microsoft has 500 people available to answer telephone callers' questions on Windows, up from 70 when the prior version of Windows came out in May 1990. She says Microsoft has also invested heavily in an electronic bulletin board to keep users up to date on problems that surface with the software and to provide the latest tips on how to use Windows better.

"Making Windows easier to use will be a demand generator in itself," she says.

Microsoft has also mounted an aggressive public-relations campaign in recent weeks, having executives do waves of interviews to address OS/2. The executives have argued, in particular, that while OS/2 may make sense for IBM's traditional corporate users, it is too complex and requires too much memory to attract the broad mass of users who have been drawn to Windows.

Still, Mr. Ballmer acknowledges that many people in the computer industry and many users "are rooting for some healthy competition. People want a healthy, knockdown, drag-out fight. But we haven't shipped, and IBM hasn't shipped. In the next few weeks, we'll see what happens."

Copyright Dow Jones & Company Inc

Anonymous said...

Yeah the Microsoft of the 90s was a machine. But the Microsoft of today is a dead shark, similar to IBM of the 90s.

Anonymous said...

Interesting article; it shows how out of touch IBM was. But what about the wackos with active ports of Firefox, Virtualbox, and Scribus? That's way, way beyond IBM sponsored marketing.

Anonymous said...

That article helps understand the PR apparatus behind freetards.

Follow the money and you'll reach IBM.

Anonymous said...

Wolvix GNU/Linux receives US$200.00

We are pleased to announce that the recipient of the February 2009 DistroWatch.com donation is Wolvix GNU/Linux, a Slackware-based desktop distribution and live CD.

Cool, IBM stopped being generous to freetards.

Apparently, IBM isn't throwing any more shit against the wall. Hopefully this will help kill new distros.

LOL

Anonymous said...

Oh yeah?

Tell that to Pamela Jones of Groklaw and she'll tell you to fuck off!


This goes straight against those that claim MS astroturfers don't bother with blog posts. It also gives hints about the motive of LH and LHR. MS is getting desperate: they have no way to fight linux, dividing the community seems to be the only way to at least delay the inevitable.

Anonymous said...

That article helps understand the PR apparatus behind freetards.

Follow the money and you'll reach IBM.

This doesn't stick, buddy. Better find a new one.

Anonymous said...

It also gives hints about the motive of LH and LHR. MS is getting desperate: they have no way to fight linux, dividing the community seems to be the only way to at least delay the inevitable.

That's quite hilarious. The so-called community excels at that on their own. In fact, it's the only thing they truly excel at.

Anonymous said...

Guys... please don't feed the trolls in here. The more attention you give them, the more they grow... Leave them alone. Their time is precious for their everyday malware hunt.

If LH was serious, or at least had serious arguments, he would be right here to discuss them. But he isn't (like any good troll that respects itself).


Greetings from Greece

Anonymous said...

If LH was serious, or at least had serious arguments, he would be right here to discuss them. But he isn't (like any good troll that respects itself).

There are no good arguments when it comes to criticizing Linux. Since it's a gift from the Gods to us, mere poor mortals, it has no possible fault.

And if you say otherwise, you are a Troll. I mean, a heretic.

Anonymous said...

Linux is the new OS/2!

Dear LinuxOS developers, please abandon the ship called Linux and take BSD or Solaris, pretty it up like Apple did, and make some hard-core choices without Linus or Stallman or the other beholders of the Linux throne in your way.

Make a concerted effort and create a real alternative for us.

Linux has had almost 20 years and FAILED, and it WILL FAIL as a desktop OS for years to come.

Anonymous said...

@March 3, 2009 6:24 AM

Actually, in large part they already have. Just ignore the Linux OS for a second.

I'll take the notable example of an application that got its start on Linux but has moved to being primarily a Windows application: X-Chat. How ironic is that?

But in the end, the same thing has already happened with every notable application. They envision a Linux killer app, find out that the market is on Windows, and the Linux version languishes after that.

I'm sure the people on this blog can fill in some more examples. Also, I would love to hear the counter-examples: applications that got their start on Windows, then migrated to Linux. I can't think of any of those. There has to be at least one.

Anonymous said...

I have examples:

Firefox: back in the Mozilla Seamonkey Milestone days, most of the testers were Linux users. Windows already had IE 5, but Linux had nothing other than the rotting corpse of Netscape 4. So even crashy alpha builds were often preferable. I think we all know who Mozilla cares about now.

ffmpeg/mencoder: Quietly, these have become some of the most successful open-source projects to date. I say quietly because many people use them and aren't aware of it. ffmpeg now forms the nucleus of a powerful Windows media framework, and both ffmpeg and mencoder are used extensively in video encoding, amateur and professional. We all know who has video editing and who doesn't.

Apache: While not really a Windows stronghold, the Apache team devoted a ridiculous amount of resources to make Apache competitive on NT, which led to Apache 2.0. Early on, there was no performance benefit at all to fork() based systems, all the work of Apache 2.0 was in threading, which is what Windows uses.

Anonymous said...

Actually, a program forked from Nvu called KompoZer is out for Windows and Linux (version 0.8a1). The Linux version has a lot of bugs just running, and it crashes often when it does run. I've been testing the Windows counterpart of the same build and it works fine.

Again, development on Linux is a conundrum. There is just no way it is a good, stable API platform for freeware or commercial application development. The LinuxOS is an impediment to itself. As a simple server that's fine, but even that can be done with Solaris/BSD.

As an application developer, you're better off creating for Windows; if your program is good enough, you have a larger base for donations or the possibility of another company buying the rights to your program. Plus it will flat out look way better than it ever will under LinuxOS.

Now if only the BSD community could pull together and create a unified DE that is NOT KDE (maybe GNOME is OK if it were altered a little more), then it may be the next best alternative platform to Linux.

Anonymous said...

"MS is getting desperate: they have no way to fight linux, dividing the community seems to be the only way to at least delay the inevitable."

Sorry buddy, it's been divided ever since Stallman preached for free documentation and free manuals. Remember O'Reilly being pissed off at Stallman?

Tell me the real motive for the rise of the open source movement v the free software movement:

From: r...@gnu.ai.mit.edu (Richard Stallman)
Subject: Re: "Programming with GNU Software", 35% off at Readme .Doc!
Date: 1997/03/07
X-Deja-AN: 223707344
Sender: gnu-emacs-sources-requ...@prep.ai.mit.edu
References: <331e2970.45156041@news.digex.net>
x-gateway: relay6.UU.NET from gnu-emacs-sources to gnu.emacs.sources; Fri, 7 Mar 1997 01:14:25 EST
Reply-To: r...@gnu.ai.mit.edu
Newsgroups: gnu.emacs.sources


Please do not use GNU newsgroups to advertise books (or software)
that users are not free to copy and redistribute.

A good operating system includes documentation. If the operating
system is free, this documentation needs to be free as well. The GNU
system and other free operating systems do not have all the
documentation that they need; there are major gaps.

One of the reasons these gaps persist is that many people think that
non-free documentation is sufficient--people do not recognize the gaps
as a real problem, because they think that non-free documentation is
sufficient. When users buy non-free manuals about GNU software, this
just encourages writing more of them--instead of the free manuals
(free in the sense of freedom) that we need.

So the GNU project hopes that you will not buy such books if you can
possibly help it.

The decision of what you buy is up to you. But whatever you decide,
please don't advertise these books on our newsgroups and mailing lists.

**********

From: Richard Stallman r...@gnu.org
Subject: Free Software Needs Free Documentation
Date: 1998/08/05
X-Deja-AN: 378396727
Distribution: world
Approved: info-...@gnu.org
Followup-To: gnu.utils.bug
X-Complaints-To: usenet@entertainment-tonight.ai.mit.edu
X-Trace: entertainment-tonight.ai.mit.edu 902374873 21285 18.43.0.47 (6 Aug 1998 03:41:13 GMT)
Organization: MIT Artificial Intelligence Lab
Reply-To: r...@gnu.org
NNTP-Posting-Date: 6 Aug 1998 03:41:13 GMT

[Please repost this wherever it is appropriate.]

A couple of weeks ago we invited people to send in nominations for the
Free Software Award (see http://www.gnu.org/gnu/award.html). Some of
the many replies we've received nominate a person for publishing
manuals that *are not free*.

This highlights the fact that many in the free software community
don't realize that there is an issue about whether manuals are
free--don't realize that a non-free manual, like a non-free program,
fails to contribute to our community.

Here's an article which explains the issue. If you know someone who
is writing a manual for a free program, please forward it to him or
her.


Free Software and Free Manuals
-- Richard Stallman

The biggest deficiency in our free operating systems is not in the
software--it is the lack of good free manuals that we can include in
our systems. Many of our most important programs do not come with
manuals. Documentation is an essential part of any software package;
when an important free software package does not come with a free
manual, that is a major gap. We have many such gaps today.

Once upon a time, many years ago, I thought I would learn Perl. I got
a copy of a free manual, but I found it hard to read. When I asked
Perl users about alternatives, they told me that there were better
introductory manuals--but those were not free.

Why was this? The authors of the good manuals had written them for
O'Reilly Associates, which published them with restrictive terms--no
copying, no modification, source files not available--which exclude
them from the free software community.

That wasn't the first time this sort of thing has happened, and (to
our community's great loss) it was far from the last. Proprietary
manual publishers have enticed a great many authors to restrict their
manuals since then. Many times I have heard a GNU user eagerly tell
me about a manual that he is writing, with which he expects to help
the GNU project--and then had my hopes dashed, as he proceeded to
explain that he had signed a contract with a publisher that would
restrict it so that we cannot use it.

Given that writing good English is a rare skill among programmers, we
can ill afford to lose manuals this way.

Free documentation, like free software, is a matter of freedom, not
price. The problem with these manuals was not that O'Reilly
Associates charged a price for printed copies--that in itself is fine.
(The Free Software Foundation sells printed copies of free GNU
manuals, too.) But GNU manuals are available in source code form,
while these manuals are available only on paper. GNU manuals come
with permission to copy and modify; the Perl manuals do not. These
restrictions are the problem.

The criterion for a free manual is pretty much the same as for free
software: it is a matter of giving all users certain freedoms.
Redistribution (including commercial redistribution) must be
permitted, so that the manual can accompany every copy of the program,
on-line or on paper. Permission for modification is crucial too.

As a general rule, I don't believe that it is essential for people to
have permission to modify all sorts of articles and books. The issues
for writings are not necessarily the same as those for software. For
example, I don't think you or I are obliged to give permission to
modify articles like this one, which describe our actions and our
views.

But there is a particular reason why the freedom to modify is crucial
for documentation for free software. When people exercise their right
to modify the software, and add or change its features, if they are
conscientious they will change the manual too--so they can provide
accurate and usable documentation with the modified program. A manual
which forbids programmers to be conscientious and finish the job, or
more precisely requires them to write a new manual from scratch if
they change the program, does not fill our community's needs.

While a blanket prohibition on modification is unacceptable, some
kinds of limits on the method of modification pose no problem. For
example, requirements to preserve the original author's copyright
notice, the distribution terms, or the list of authors, are ok. It is
also no problem to require modified versions to include notice that
they were modified, even to have entire sections that may not be
deleted or changed, as long as these sections deal with nontechnical
topics. (Some GNU manuals have them.)

These kinds of restrictions are not a problem because, as a practical
matter, they don't stop the conscientious programmer from adapting the
manual to fit the modified program. In other words, they don't block
the free software community from doing its thing with the program and
the manual together.

However, it must be possible to modify all the *technical* content of
the manual; otherwise, the restrictions do block the community, the
manual is not free, and so we need another manual.

Unfortunately, it is often hard to find someone to write another
manual when a proprietary manual exists. The obstacle is that many
users think that a proprietary manual is good enough--so they don't
see the need to write a free manual. They do not see that the free
operating system has a gap that needs filling.

Why do users think that proprietary manuals are good enough? Some
have not considered the issue. I hope this article will do something
to change that.

Other users consider proprietary manuals acceptable for the same
reason so many people consider proprietary software acceptable: they
judge in purely practical terms, not using freedom as a criterion.
These people are entitled to their opinions, but since those opinions
spring from values which do not include freedom, they are no guide for
those of us who do value freedom.

Please spread the word about this issue. We continue to lose manuals
to proprietary publishing. If we spread the word that proprietary
manuals are not sufficient, perhaps the next person who wants to help
GNU by writing documentation will realize, before it is too late, that
he must above all make it free.

We can also encourage commercial publishers to sell free, copylefted
manuals instead of proprietary ones. One way you can help this is to
check the distribution terms of a manual before you buy it, and
prefer copylefted manuals to non-copylefted ones.


Copyright 1998 Richard Stallman
Verbatim copying and distribution is permitted in any medium
provided this notice is preserved.

**********

From: David Downie
To: ask_tim@oreilly.com
I have a question for Tim O'Reilly.

Apparently Stallman attacked you at one of your conferences, and pointed out that ALL of the arguments for open source SOFTWARE can be applied equally to DOCUMENTATION ie BOOKS. ie YOUR BOOKS.

How do you respond to the argument that all of your support is just to line your own pockets through your refusal to release your books under the GPL? You have the best unix books, Open Source is unix, it is in your best interests for the whole world to learn unix and hence buy your books.

It is an interesting argument, and I haven't heard what your response was (although I did hear it was quite lengthy). I would appreciate it if you could outline your rebuttal either personally or on your web page (and point me to it).

Regards

David Downie

**********

David--

>I have a question for Tim O'Reilly.

>Apparently Stallman attacked you at one of your conferences, and pointed out that ALL of the arguments for open source SOFTWARE can be applied equally to DOCUMENTATION ie BOOKS. ie YOUR BOOKS.

This isn't quite right. He argued that to the extent that documentation can be considered a necessary part of a product, that it ought to be free, so people can modify it when they modify the software. I agree with him about basic documentation for a product--and, for example, if you look at the Perl distribution, there is a lot of good free documentation that is part of it. All free software authors need to take seriously the job of documenting their work, and users of a product can helpfully contribute documentation in the same way as they contribute bug fixes or improvements.

When he takes the next step and says that authors of value-added books have an *obligation* to make them free as well--there we part company.

>How do you respond to the argument that all of your support is just to line your own pockets through your refusal to release your books under the GPL? You have the best unix books, Open Source is unix, it is in your best interests for the whole world to learn unix and hence buy your books.

Richard thinks there is a moral imperative underlying the free redistribution of software, and now, by extension, other information. Richard feels that since there isn't any physical cost associated with copying software, limiting free redistribution is a form of extortion. I on the other hand feel that it's immoral to try to compel someone else to give you something they've created without compensating them in some way. That is, when software is freed, it is a gift, not the result of an obligation. I found Richard's comments at the Open Source Developer's Day, where he called John Ousterhout a parasite because he now wants to build proprietary tools on top of tcl, a defining moment. This is akin to children feeling that their parents owe them an inheritance, or people on welfare feeling that the government owes them a handout. Richard should be grateful for what John has already given, not castigating him because he doesn't want to give even more.

I believe that the imperative underlying Open Source is pragmatic rather than moral. That is, it's a great development methodology that leads to better software and leads to the formation of communities around that software. Eric Raymond's paper, The Cathedral and the Bazaar, sets out a lot of these pragmatic arguments for Open Source software.

The pragmatic arguments for open source software include:

1. You've developed the product to "scratch your own itch," and because bits are easily copyable, there is no incremental cost, and possible incremental benefit, from giving other people access to your work.

2. By releasing as Open Source a tool that you originally developed for your own use, you gain a cadre of co-developers who will give you back bug fixes, additional functionality, and the gifts of their own work on the same terms. This is a value-exchange, just not a monetary one. And yes, there are "free riders" on the system, but even these free riders, who contribute no code, contribute their attention and the good word, thus spreading the use of the package and finding you more potential contributors. (Effectively, they are carriers of your meme.) They also give you their esteem. As Raymond points out, in a "gift culture", you get status as a result of what you give away. Status has always been important in the hacker community.

3. By releasing Open Source software, you contribute to the overall health of the programming community, who will give you products of their own work. You benefit from the entire ecosystem, not just from the input of the community into your own specific project.

Now, let's look at those arguments as they apply to books.

First, very few people write a book to "scratch their own itch." Instead, people generally write books for two reasons, to serve other people (a noble goal), or to earn something (either money or esteem, or both) for themselves. So one of the fundamental motives underlying free software is often missing. If someone wants to write a book because they believe in the Open Source or free software movement, more power to them. I'll be happy to publish that book if I think it's a good book. But I'm not going to tell my authors that they have an obligation to do so!

In fact, we have published several books under copyleft (most notably the Linux Network Administrator's Guide) at the request of the authors, and are prepared to do so for other books, provided that the authors are aware of the negative impact that we believe this will have on their sales (and consequent royalties.) For instance, when we published the Linux NAG, it was republished by both SSC (as a standalone book) and Macmillan (as part of a "Linux Bible" or some such), neither of whom paid royalties to the author. Their costs were therefore lower--but more to the point, they took away some of the market for the book. The Linux NAG has always done less well for us than other Linux books for which we'd expect comparable sales, and perhaps partly as a result, the author was demotivated to continue updating the book. (We've now got someone else working on a new edition.)

This experience seems to undercut the argument that just like free software, freely available documentation will "spread the word" and result in wider use of the product, thus greater book sales. This does not seem to be the case in practice. Apart from the fact that there are significant costs in book publishing and distribution that are not necessarily there in software, you have to take into account the differences in distribution methods for the two products. While most free software is downloaded directly by the user, or put on a machine by a vendor, books are typically purchased through an extremely inefficient system. Only about 10% of our books are sold directly to the end customer. The greatest majority are sold through indirect channels like bookstores, libraries, and corporations. And the buying patterns of these middlemen are not under the control of the publisher. There is far more product than available shelf space, so books that sell below a certain threshold are returned and not reordered. So, for example, if there are three versions of the same book, they may well divide the market when they are first released, leading the middlemen to decide that there's not enough demand to keep any of the titles in stock.

In order to combat this problem, we're working (with a number of other publishers) on a license that will allow binary redistribution of a book's contents with the software, but will reserve the right to print, publish and sell printed copies to the original publisher. But mind you, even with that license, we're going to be experimenting, not going at it wholesale. The point is that if Open Source is really a better model, it ought to lead to better results. If it doesn't, then we need to learn from experience where it fits and where it doesn't.

Giving away books as open source would certainly seem to meet the third benefit I listed above, of contributing to the overall health of the community, but in fact, I'm not sure that even that is true. I believe that without the economic incentive that a publisher offers, far fewer books would be written, and that the net gain for the community would be less.

At bottom, this is true even of software, I think. There are value-added features that the Open Source community doesn't seem to be good at providing, and that commercial companies can do a better job at. New companies like Sendmail, Inc., Scriptics, and ActiveState are exploring a hybrid business model, which relies on Open Source as an important technique for spreading the word, fostering innovation, improving product quality, and giving back to the community, but reserves certain products as value added so that they can make money.

At the end of the day, I see Open Source as a free market economy of ideas. If you look at free markets, there are all kinds of value exchanges, not just exchanges of money. But if you take away value exchange altogether, and mandate forced sharing, you vitiate the very thing that you hope to create.

I believe that the generosity that is at the heart of the Open Source community is a kind of enlightened self-interest, not a moral imperative. It behooves all of us to find the right balance between what we give away and what we keep for ourselves. An awful lot depends on your goals. You pick the hat to fit the head.

I give great kudos to Richard and to anyone else who gives away their software without any thought of personal gain. I also give great kudos to people who figure out how to build something valuable enough that other people want to pay for it. And I see no reason to tell people they should move to one side of the balance or the other.

Well, that's not quite true. I do believe very strongly that there are some things that are common goods, that belong to everyone by right. A good analogy is the environmental movement. Even if we don't own the environment, we have some rights in it. And someone who clear cuts the forest and doesn't replant it, or pollutes and doesn't clean up after themselves, is wronging the rest of us. In a similar way, someone who takes something that was created by others and tries to take away the freedom that was granted by the original creator is the equivalent of an environmental criminal.

But here, once again, I part company with Richard. I far prefer the BSD style licenses to the GPL. If I build proprietary added-value on top of someone else's work, I don't take anything away from them or their users. They have no obligation to use my added value. Where the "environmental crime" comes in is when you try to actually deny people the right to use the original package. This is the kind of thing we saw talked about in the Halloween Document, where Microsoft was talking about subverting open standards.

In short, I feel that undercutting an open standard so that other people's products don't work is wrong. But simply adding value, and offering a derivative product is fine--as long as that was the intent of the creator of that product.

A really good example of this is the X Window System. It was designed with the express intent that people would build proprietary products on top of it. So when GPL fans deride the license because it allowed people to build things that weren't contributed back to the community, they are deriding the intent of the creators. It's a little bit like Puritans complaining because other people are having fun.

That being said, there are definitely issues with the BSD style license that the GPL addresses. In particular, it is possible for the market to fragment more easily. But like software, licenses need to evolve to more perfectly meet our needs. A lot of good work and thinking is going into this area, and I'm confident that we'll eventually find even better ways to encourage both the common good and the economic incentives that can be an important stimulus to the creation of value.

And once again, I think this is in the end a pragmatic or scientific issue, not a religious one. Open Source is like gravity. Our goal is to understand the laws that make it work, not to decree how we'd like it to work.

My goal is to encourage people to write great books that really help people, and I'm charting the best course I know how towards that goal. So far, I think the results speak pretty well for themselves.

--Tim O'Reilly

Anonymous said...

"Dear LinuxOS developers, please abandon ship called Linux..."

"Actually, in large part they already have. Just ignore the linux os for a
second.

"I take the notable example of an application that got it start on linux, but has moved to primarly a windows application."

GIMP for Windows

Anonymous said...

Freetards' advocacy tactic number 1:

"There are no good arguments when it comes to criticize Linux. Since its a gift from the Gods to us, mere poor mortals, it has no possible fault.

"And if you say otherwise, you are a Troll. I mean, a heretic."

Anonymous said...

A 'community' divided

Torvalds v Stallman - Fast forward to minute 11:20 in the following video:
Revolution OS: Free Software goes Free Enterprise  [52.8MB WMV]

Stallman v Torvalds - Fast forward to minute 5:36 in the following video:
Revolution OS: The Revolution goes Prime Time  [47.7MB WMV]

Notice how Torvalds ignores Stallman during Stallman's presentation

Revolution OS

Revolution OS: What is Linux? What is Open Source  [49MB WMV]

Revolution OS: Free Software goes Free Enterprise  [52.8MB WMV]

Revolution OS: Free Software and Netscape's big gamble  [40.9MB WMV]

Revolution OS: SVLUG, BALUG, LALUG, Linux User Groups  [25.7MB WMV]

Revolution OS: The Revolution goes Prime Time  [47.7MB WMV]

Anonymous said...

Conclusion -- Computers (in general) suck. All of them, without exception. Wintel sucks for money, Apple for a bit more money, OSS sucks free of charge (virtually). Unnecessary escalation of HW/SW power is the reason why we've got crap like DRM and why media companies are trying to control every single step of the real or even potential watcher or listener. My mobile has the power of a 1980 supercomputer... what for?... I want to go back to vinyl times ;)

Anonymous said...

Nah, it's just Linux that sucks.

Anonymous said...

As I stated in the previous discussion, my favorite moment of Revolution OS is 48 minutes in when Stallman rants about how software quality is unimportant and implores FOSS developers to concentrate on evangelism instead.

So don't listen to the people attributing this kind of behavior to a minority or concentrated sect. These orders come straight from the top: FOSS harasses by design.

Anonymous said...

Yeah, Linux sucks. It sucks as much as Windows. Windows XP becomes slower and slower over time, despite my efforts to clean the registry manually. Linux is useful for programming stuff and (fast load time) using the internet. Linux (a few errors which took about a few days to repair) is as important to me as Windows (shutdown takes 30 sec.). If both OSes are used as they should be, both perform well (my friend always has some issues with Windows; he reinstalls Windows almost every three months, but for me it's a usable OS; Linux has some annoying features, but I easily overcome them). I think all of the bloggers writing here are fools. If you are a Linux fan, work for its success; if you are a Windows user, don't blame the fool users, they will come to a dead end without critics (or maybe not). But remember, if you have not tried another OS, you can't criticize it. Still you have to choose: 1. Be a simple, foolish consumer like a normal guy. 2. Be a zealous creator. If you are unreasonably attacking Linux, you are no better than a FOSS zealot, you are just on the other side of the barricades...

Anonymous said...

"Yeah, Linux sucks. It sucks as much as Windows. Windows XP becomes slower and slower over time, despite my efforts to clean the registry manually. Linux is useful for programming stuff and (fast load time) using the internet. Linux (a few errors which took about a few days to repair) is as important to me as Windows (shutdown takes 30 sec.). If both OSes are used as they should be, both perform well (my friend always has some issues with Windows; he reinstalls Windows almost every three months, but for me it's a usable OS; Linux has some annoying features, but I easily overcome them)"

This idea that Windows and Linux are equally valid desktop systems (let alone the superiority proclaimed by deluded Lintards) is a fallacy that has to die.

Anonymous said...

Poor Stallman, unable to take his own advice regarding free software and recognition. He probably cries himself to sleep every night mumbling "GNU/Linux... it's GNU/Linux..."

Kharkhalash said...

"I don't know, you mean like an API?

"This post has some potential. Come on LH, please go into detail. Thanks."


The topic has been beaten to death over and over again here.

There's no stable, unifying ABI/API in Linux. Chances are there never will be if things keep up as they are; the kernel devs consider it a feature.

Look up the problems with things like VMware breaking between point releases (since hooks and whatnot are removed between versions for no apparent reason). This is especially true in the realm of audio, where there are several competing, ever-changing subsystems: OSS, ALSA, SDL, PulseAudio, ESD (EsounD), JACK, etc.

Compare that to, say, OS X, where there's Cocoa and its core frameworks: CoreAudio, CoreImage, etc. Or Windows, which has DirectX and .NET, as well as the legacy Win32 API. Or FreeBSD and Solaris, which have their stable core systems. Hell, FreeBSD even manages to retain better ABI compatibility with Linux (via the Linuxulator) than Linux manages to keep with itself, which refutes Greg KH's insistence that retaining a stable ABI in Linux is impossible: clearly it isn't. The truth, as suggested in Greg KH's readme on the issue, is that it's intentional: a means to try to coax manufacturers into releasing driver code to the kernel devs. They're sacrificing usability to push personal politics.

KDE does this to a certain degree with KParts, but it's on too high a level to be useful, given the competing toolkits (Tcl/Tk, GTK, EFL, Motif, FLTK, ncurses, etc.). Combine that with the mess that is the audio subsystem, and the graphics subsystem (why does X11 have its own set of drivers, in addition to the kernel drivers? Plus the various graphics APIs: OpenGL, SDL, etc.). Then there's the added clusterfuck of doing away with the devel branch of the kernel: in the old days, even numbers (2.2, 2.4, etc.) were stable releases, and development was done in the odd-numbered branches (2.3, 2.5, etc.). Since 2.6, all development happens live in the "stable" branch; subsystems are changed completely, others are purged, replaced, etc. between point releases!

So a third party is trying to write an app. For OS X they'd write it in Cocoa and be done with it. It'll even continue to run, with little effort, on later releases of the OS. Hell, on Snow Leopard the developer doesn't even need to worry about parallelization or tapping into the GPU, since Cocoa will handle both on its own.

On Windows, they'd write it in .NET or Win32 and be done with it, and it'll continue to run via backward-compatibility subsystems.

On Linux, they'd have to cope with the complete and utter clusterfuck that is deciding to go with Qt, GTK, Tk, EFL, Motif or FLTK as a toolkit; then they'd have to decide between any one of a dozen audio subsystems, whether to use the X11 drivers or the kernel drivers, and whether to go with SDL or GL for video; then cross their fingers and hope that the kernel devs don't decide to randomly remove a hook they may have been using for no particular reason, or that something they depend on isn't drastically changed between releases.
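The audio-subsystem lottery alone forces every portable app to carry probing code like the following. This is a toy Python sketch; the backend names and the probing function are illustrative assumptions, not any real library's API:

```python
# Toy sketch of the backend-probing dance a portable Linux app ends up
# writing. The names here are illustrative, not a real sound API.

PREFERENCE = ["pulseaudio", "alsa", "oss"]  # try the fanciest first

def pick_audio_backend(available):
    """Return the first preferred backend present on this box, or None."""
    for name in PREFERENCE:
        if name in available:
            return name
    return None
```

On a desktop distro with PulseAudio layered over ALSA this picks "pulseaudio"; on a bare server install it falls through to "alsa"; and because no single backend can be assumed, every app ships its own variant of this loop.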

Furthermore, there's the testing clusterfuck: the LSB is a joke that nobody takes seriously, and each distribution has its own set of core libraries and its own versions of said libraries. One may have Pulse while another uses bare ALSA, and one may be using a different version of Pulse, etc. Multiply that for every layer the would-be app requires access to. Then there's the issue of different versions of the same distro, etc.

Take a look at Opera's download page if you want to see just how ridiculous trying to write, test and support a third-party application on Linux is: 44 versions of the same package for x86 Linux, another 11 for x86_64 Linux, plus another 24 for PPC Linux. That's 79 versions of the same package per supported release of their product, per product, and there are still dozens of distributions which aren't directly supported.
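The arithmetic behind that package explosion is easy to reproduce. A toy sketch; the distro and release lists below are hypothetical placeholders, not Opera's actual support matrix:

```python
# Toy illustration of why one release balloons into dozens of packages:
# every (distro, release, arch) combination needs its own artifact.
# These lists are hypothetical, not any vendor's real matrix.

distros = {
    "debian": ["etch", "lenny"],
    "ubuntu": ["hardy", "intrepid"],
    "fedora": ["9", "10"],
    "suse":   ["10.3", "11.1"],
}
archs = ["i386", "x86_64"]
formats = {"debian": "deb", "ubuntu": "deb", "fedora": "rpm", "suse": "rpm"}

def build_matrix():
    """Enumerate one package filename per supported combination."""
    return [
        f"opera_9.64_{d}-{rel}_{arch}.{formats[d]}"
        for d, rels in distros.items()
        for rel in rels
        for arch in archs
    ]

print(len(build_matrix()))  # 4 distros x 2 releases x 2 archs = 16 artifacts
```

Even this tiny matrix yields 16 artifacts; add more distros, more releases per distro, and PPC, and you get to Opera's 79 without trying.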

It's a fucking nightmare.

Anonymous said...

"... my favorite moment of Revolution OS is 48 minutes in when Stallman rants about how software quality is unimportant and implores FOSS developers to concentrate on evangelism instead."

Fast forward to minute 8 in the following video to view and listen to Stallman's rant:
Revolution OS: Free Software and Netscape's big gamble  [40.9MB WMV]

Stallman knows that if free software developers focus on quality this will lead to infringement of patents and copyrights.

Anonymous said...

"Yeah, Linux sucks. It sucks as much as windows. Windows xp after time of usage is becoming slower and slower, despite my efforts to clean registry manually."

Bloated registry affecting performance is a non-issue with Vista, since it switched to registry virtualization. There are a lot of improvements Vista has over XP, many behind the scenes and under the hood. Also, you never said what your computer specs are. If you only have 1 gig, consider adding another gig or two. DDR2 RAM is very affordable; you can get 4 gigs for under 50 bucks easily.

Speed is only a portion of the "experience". What good is an OS if it boots fast but has no applications? No stable platform to develop for? No APIs like DirectX? Remember BeOS? It booted very fast, but there were few to no apps. We all know what happened to that OS.

Anonymous said...

Kharkhalash, that was one of the best posts I've read that showed what the purpose of this site is all about.

It's not about brainless bashing without purpose; it's about trying to force the issue and hoping for some change so we can find a true alternative.

Anonymous said...

"Take a look at Opera's download page, if you want to see just how ridiculous trying to write, test and support a third party application on Linux is: 44 versions of the same package for x86 Linux, another 11 for x86_64 linux, plus another 24 for PPC Linux, 79 versions of the same packages PER supported version of their product, per product..."

Thank you for showing us a great example of one of the major big, big problems with Linux: FRAGMENTATION.

Opera 9.64 for Linux i386 versions

And yes, open source developers call this feature!

Anonymous said...

@March 3, 2009 12:59 PM

The irony is that were Hurd ever to become a viable system, its name would need to be Debian/Hurd since the former has done most of the work on it by now. Imagine Stallman's reaction at being at the other end of that argument.

Anonymous said...

Opera is a joke in and of itself.

Anonymous said...

Opera is a joke in and of itself.

BlameLiesOnOthers(TM)

Anonymous said...

It's not about brainless bashing without purpose; it's about trying to force the issue and hoping for some change so we can find a true alternative.

The bashing that goes on here is far from without purpose. Where you hold out hope, I use this place for catharsis. Not only do I think Linux is hopeless, I think it reached that point 5+ years ago.

At this point, Linux is an impediment to progress. On the desktop, it will never ever rival Windows or Mac OS (not even retired versions). FreeBSD and Solaris do the whole UNIX thing way better, and now Microsoft and Apple eat more and more into the embedded market along with long time favorites QNX and VxWorks.

FUD? I guess. I'd rather all the evangelism be spent pumping FreeBSD and the actually interesting Sun technologies and finding a fresher, more credible desktop threat to back. Linux is old news, and the whole thing is just a repeat of 70s-90s UNIX with the same inevitable conclusion.

oiaohm said...

" ME
For all the gain Window gets from closed source drivers it also gains a lot of pain.
PERSON WHO WILL NOT GIVE THERE NAME.
90% of the market says the gains outweigh the pains."

This just shows the class of fools I am dealing with. Why has MS invested large amounts of money building a .NET-based OS?

The simple reason: there is no way to cure the pain of the NT binary-driver issue short of killing the whole OS. AtomBIOS-style designs are the way forward: drivers written as platform-neutral bytecode so they can be kept isolated from harming the rest of the system. ATI video cards are the testing ground for it.

Call for the right things, not the wrong ones. Any OS stupid enough to follow the path of NT is walking straight into the same defects.

I don't argue with Linux Hater when he is right. Linux Hater at least manages to be in the ballpark; lots of people posting here, from both sides, are not even in the same city as the problem.

Anonymous said...

There's no stable, unifying ABI/API in Linux. Chances are there never will be if things keep up as they are; the kernel devs consider it a feature.

Read what IBM said about Linux development in 2004:

"Historically, there never was a formal source code management or revision control system for the Linux kernel... This lack of revision control often left gaping holes between releases, where nobody really knew which changes were in, whether they were merged properly, or what new things to expect in the upcoming release."

Link

Anonymous said...

I will agree with you here, I no longer find Linux as "hopeful" as I did before.

Now it seems that on BSD/Solaris there is still the issue of GNOME vs KDE, and already there are distros of each OS with different desktop environments. Maybe we will see the same Linux desktop mistakes repeated?

"Linux is old news, and the whole thing is just a repeat of 70s-90s UNIX with the same inevitable conclusion."

Anonymous said...

Forget GNOME. Forget KDE. Forget the entire desktop. The only chance of survival for UNIX-like systems is to concentrate on what they do best: provide hard to use software that does powerful things in places where Windows doesn't fit.

Anonymous said...

This just shows the class of fools I am dealing with.

Yeah. You enlightened lintards know a lot about desktop operating systems. Your huge success is a testament to that *grin*

Anonymous said...

"This just shows the class of fools I am dealing with. Why has MS invested large amounts of money building a .net based OS."

Um, maybe because .NET is an application programming platform and not just a driver-level platform?

AtomBIOS, no matter how complete or whatever, will never EVER be the future of OS design. It will be beneficial to Linux for sure, but that alone is not going to make it compete with something as old as XP, let alone Vista or OSX.

oiaohm said...

People, wake up. All market types are merging.

Embedded is where Linux had a reasonable home. These days even printers and modems are coming out with graphical interfaces.

There is no niche left for Linux to retreat to where the desktop does not apply any more. Likewise, there is nowhere for MS to retreat to where embedded is not coming at them.

The market differences are ending. As a result, OSes have to change. The question is which one ends up fitting best in the future market.

Anonymous said...

Forget GNOME. Forget KDE. Forget the entire desktop. The only chance of survival for UNIX-like systems is to concentrate on what they do best: provide hard to use software that does powerful things in places where Windows doesn't fit.

I think you are right. Let me write again here what I posted not long ago:

The mere premise is flawed: a free desktop operating system cannot be competitive.

It cannot be free as in freedom: a competitive OS needs vision, a direction. You can call it diversity and say it's a strength, but sorry, that's just bullshit. And chaos is just what you get when it's free and no one has the authority to call the shots and make hard decisions.

It cannot be free as in price: not only because massive amounts of money are needed for its development and you gotta sustain that, but also because the functionality and quality that users expect these days do not come for free. Patents for algorithms that make your fonts not suck and legal codecs out of the box have to be paid for.

Open source definitely has its place, but it's not desktop operating systems. It's just not gonna happen. Not now, not ever.

Anonymous said...

Embedded is where Linux had a reasonable home. These days even printers and modems are coming out with graphical interfaces.

There is no niche left for Linux to retreat to where the desktop does not apply any more. Likewise, there is nowhere for MS to retreat to where embedded is not coming at them.


DesktopsAreGoingAway(TM)

oiaohm said...

"Um maybe because .NET is a programming application platform and not just a driver level platform?"

.NET was specifically altered to be able to create drivers. The issues with Windows drivers and the security nightmares they cause are the complete reason MS spent time developing a .NET-based OS.

XP and Vista may have the advantage, but it's not much of one. On overall hardware support Linux wins, no question: XP and Vista don't run on ARM, MIPS ... basically anything other than x86.

The market is in flux. Wake up to it.

Anonymous said...

"The market is in flux" = "The goal posts are being moved"

Anonymous said...

x86IsGoingAway(TM)

Anonymous said...

Actually you should be pissing your pants; Intel is making the Atom platform smaller and smaller, able to run XP/Vista/OSX in a smaller, affordable package for devices previously reserved for ARM/MIPS.

The Atom CPU is much easier to develop applications for, since it's straight x86, as more and more devices need smarter (not dumber) processors and faster development and debugging times. In time, it's very possible we'll see Atom-powered iPods/phones/GPS/multifunction devices. AtomBIOS is a dead-end idea, just like the Linux desktop.



"Overall hardware support Linux wins no question. XP and Vista don't operate on ARM, MIPS ... basically anything other than x86.The market is in flux wake up to it."

Anonymous said...

x86IsGoingAway(TM)

ROFL good trademark! I knew you'd pull out a good response to "OiDunnohm".

Anonymous said...

Thank goodness for that, I was getting worried that my vibrator wasn't going to be used for much else. Now thanks to Linux, I can surf and fuck myself at the same time!

"Overall hardware support Linux wins no question"

Anonymous said...

fixed it for ya:

"The Linux market is in flux wake up to it! And BTW the sky is falling! The Cubs win world series! Aliens attack from Mars!"

Anonymous said...

oiaohm, you have a habit of (inadvertently?) making Linux look even worse than what detractors posit, with your "separate platforms" bit as the most recent example. If what you say is true, don't you think those who have the most success in multiple markets (Apple, Microsoft) stand to succeed even further? Doesn't it seem unlikely that Linux would suddenly and magically become the most suitable system were every market to merge?

And Mr. "I don't spread myths" claims XP and Vista don't run on ARM when he knows very well that Windows Mobile commands an impressive marketshare on ARM devices.

Anonymous said...

I remember previous instances where the market was in flux.

1995: Everyone hates Windows 95! Year of the Linux Desktop!

1998: Everyone hates Windows 98! Year of the Linux Desktop!

2000: Everyone hates Windows ME! Year of the Linux Desktop!

2001: Everyone hates Windows XP! Year of the Linux Desktop!

2007: Everyone hates Windows Vista! Year of the Linux Desktop!

Anonymous said...

Dumb marketing Ideas 2009:
==========================

Hey guys! Let's create a netbook, but instead of using an Intel Atom, let's use an ARM CPU! That way we can force people to use our version of crippled Linux, and they can't call us to ask how to install XP! Yes! This will cut down support calls!

Kharkhalash said...

Now it seems that on BSD/Solaris there is still the issue of GNOME vs KDE, and already there are distros of each OS with different desktop environments. Maybe we will see the same Linux desktop mistakes repeated?

Doubtful, since neither Solaris nor the BSDs (this excludes the "desktop oriented" offshoots of FreeBSD, which are doomed to fail, mind you) are pitched as desktop operating systems, and more importantly, neither is designed to be a desktop OS.

Both Solaris and FreeBSD are designed primarily for server use, where the DE is irrelevant, and to a lesser degree they're _pitched_ as workstation OSes as a _secondary_ function, where the DE is just left up to the preference of the administrator/user.

Same goes for OpenBSD, which is designed for and pitched as a solution for mission-critical, secure systems, not as desktops. Do you really need, or even care about, a DE on a router or switch?

Then there's NetBSD which is mostly designed for the sake of consistency and (by resulting necessity) simplicity. Something that runs on everything under the sun, something that'll run on your random, exotic/obsolete hardware. It's not designed for desktop use, it's just designed to run. The DE is less important here.

Now, someone will think they're clever and point out the obvious: there are people who run these OSes on desktop systems. But that doesn't change the fact that none of these systems are designed to be desktop systems. And contrary to the Linux mentality, the distinction between a server OS and a desktop OS isn't in the applications; installing Apache on Windows 98 doesn't make Win98 into a server _OS_, any more than installing KDE on FreeBSD or Solaris magically makes it into a desktop OS (which is why things like DesktopBSD are destined for failure). A server kernel is designed and optimized differently than a desktop kernel, because they have different priorities and serve different purposes.

The BSD world, and the rest of the Unix world, has by and large decided, very wisely, to continue doing what it's good at, and leave desktop Unix/BSD to Apple, which is what Apple is good at.

The mistakes of Linux won't be repeated in BSD and Solaris simply on the basis that unlike Linux, these systems aren't being pitched as a desktop, server and workstation OS all in one, and there's no attempt to design these systems to do all three, or really, when you get down to it, they aren't designed to do even two of the three.

Solaris may be capable of running desktop applications, but it's still designed (especially on Sparc), exclusively for the performance, reliability, security, stability and scalability demanded by its target market: the ultra high end server space and HPC, where the DE is irrelevant. The DE is just there in case you want to use it as a workstation or desktop, but is still ultimately irrelevant. It's the same with AIX: it runs CDE, but that doesn't magically make it a desktop OS.

Hell, KDE runs on QNX, too, and you certainly can use QNX as a desktop. It's just a waste of both a perfectly good desktop machine and a perfectly good QNX install.

Besides, the whole Gnome vs KDE desktop war ended a long time ago, and both lost.

The desktop Unix "war", as well as the desktop Unix DE "war", was over before it began; both were always one-horse races. Apple and Aqua won them, respectively.

Anonymous said...

Kharkhalash,

Prepare yourself for a long winded rant about how desktop, server, and embedded are all the same and the only reason we think differently is the Microsoft DRM conspiracy blah blah blah.

Anonymous said...

The Linux failure on the desktop is indeed an interesting item to analyze.

I think the desktop failure had to do with a clash between two important groups in the Linux camp.

One group was made up of armchair programmers who were happy with their 'ugly' window managers.

The other group was made up of new-age hackers who wanted a Windows/Mac-like environment.

Red Hat wasn't able to play to the interests of both groups.

Then Ubuntu showed up while Red Hat was trying to decide what to do with its desktop dilemma.

Red Hat's window of opportunity on the desktop disappeared when freetards embraced Ubuntu as their 'freedom'.

Red Hat was and is the biggest loser.

As the Linux server king today, Red Hat faces a pretty much fucked-up future, as it doesn't control the Linux desktop.

The second biggest loser is the freetard community, as it doesn't have a winning desktop; the thing is 100% fragmented.

Poor Red Hat and poor freetard community.

alylu said...

Are you as sexy as you write?

In advance, thanks.

Anonymous said...

The problem with Linux zealots is that they don't know what they're talking about.

Anonymous said...

http://www.linfo.org/linux_myths.html

Anonymous said...

"Linux is useful for programing stuff"

Not really. The platform is a mess, the IDEs are a joke compared to Visual Studio, and the best thing to happen to Linux development in a long-ass time is a .NET clone! Programmers want a stable API and ABI that they don't have to worry about, and something they can make money on (we don't want to work for free, idiots). I don't know where people get the idea that Linux is good for programming.

Anonymous said...

"The market is in flux" = "The goal posts are being moved"
Haven't you heard about the netbook market? About Linux helping expand that market? About new netbooks on ARM? MS claiming that Win7 will run on netbooks? Do you live in a cave?

Anonymous said...

"Haven't you heard about the netbook market?"
Yes, we have.

oiaohm said...

x86 chips are flawed in their power usage compared to ARM and MIPS. The problem is that the power-management flaws in x86 come purely from the instruction set.

MS is not making a profit in the netbook market. The question is how long they can maintain that state before having to either retreat or die.

Be aware that Dell is embedding ARM processors in laptops running Linux to give far longer runtimes than any x86 chip can manage, including Atom.

Windows Mobile does not scale to desktop usage. This fake split is dying. So far MS has not adjusted for it.

That Computerworld study is a little confusing when netbook makers are reporting between 30 and 60 percent Linux sales. Someone has to be wrong here; if I had to guess, I'd say Computerworld.

I don't know where this stupid idea that an OS has to be better to win comes from. If that were true, no one would have used 95; everyone would have used Mac OS or OS/2, and the computing world would be way different.

The good-enough factor: is Linux good enough to force MS out of making a profit in the netbook market? Answer: yes. Same effect as DOS vs OS/2.

History is repeating itself. Linux's year of the desktop will most likely come after MS is starved to death.

Anonymous said...

Anonymous suggested:

http://www.linfo.org/linux_myths.html

Great Page. You just have to change "The Myth" for "The Truth" and vice versa.

Could be easily done with sed, should be a no-brainer for lintards.
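Speaking of sed: the swap really is a one-liner, though a naive pair of substitutions would clobber itself. Here's a sketch, assuming the page has been saved locally as linux_myths.html (the filename and placeholder string are arbitrary):

```shell
# Sketch only: swap the strings "The Myth" and "The Truth" in a saved
# copy of the page. A temporary placeholder keeps the two substitutions
# from overwriting each other's results.
sed -e 's/The Myth/@@SWAP@@/g' \
    -e 's/The Truth/The Myth/g' \
    -e 's/@@SWAP@@/The Truth/g' \
    linux_myths.html > linux_truths.html
```

Without the placeholder, the second substitution would also rewrite everything the first one just produced.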

Anonymous said...

"x86 chips are flaws on there power usage compared to arm and mips. Problem is the flaws in power management in x86 come purely from the instruction set."

I really doubt it has to do with the instruction set, since x86 CPUs use a microarchitecture design; the instruction set is really just an interface onto whatever Intel wants to put underneath it. The real reason Atom uses so much more power than ARM is because it's a faster CPU.

Recall that Intel is starting at the top and looking down, asking "how can we take our higher-end Intel chips and make them smaller and more energy efficient?" ARM chip manufacturers, on the other hand, are working from the bottom up, asking "how can we take this low-energy, small CPU and make it faster?"

"MS is not making a profit in the netbook market. Question is how long they can maintain that state before either having to retreat or die."

They are making a profit, just not as big a one as they do on ghigher end computers. Why do all freetards think Microsoft is on the verge of death? Wtf is wrong with you morons?

"That computer world study is a little confusing when netbook makers are reporting between 30 to 60 percent Linux sales. Someone has to be wrong here. Somehow I would say computerworld."

I doubt it; your numbers don't really agree with Linux usage share still being at sub-1%.

Anonymous said...

Really Microsoft's biggest competitor, as said recently by Steve Ballmer, is pirated copies of Windows. Usage of Linux might increase in the future, but even then it doesn't come close to the number of people running illegal copies of Windows.

Anonymous said...

MicrosoftIsDying(TM)

Anonymous said...

Intel is pushing Atom beyond PCs and will compete with ARM soon:

http://tinyurl.com/cez2wk

Also from slashdot:

"Not only has Intel recruited another company to produce Atom CPUs, as covered earlier on Slashdot, the chipmaker also unveiled four Atom chips that will go into devices such as entertainment systems for cars, videoconferencing devices, robots and interactive kiosks. The Z500-series Atom processors are integrated chips the size of a penny that draw little power and do not require fans to operate. The chips draw 2.5 watts of power or less and run at speeds of between 1.10GHz and 1.6GHz. The chips offer integrated 2D and 3D graphics and will be manufactured using Intel's existing 45-nanometer process."

ARM is pissing its poopypants. Combine Atom with CE/XP/OSX at a power draw of 2.5 watts and lower in the future. Nobody wants Linux on a smart device that could just as easily run OSX or XP or CE as a better programming environment. Applications make the device complete, not the OS, and Linux has none of it. Look at iPod Touch/iPhone and WinCE phone devices: there is an active third-party programming community, and there is tons more useful software on the iPod than on all the Linux distros combined.

Anonymous said...

"Linux is good for learning BASH"

Like when you bash your head on the keyboard because your wife bought a new PC with Linux, when you could have gotten Vista for 20 more and been happy, watching DVDs and Flash porn clips that have sound.

oiaohm said...

I doubt it; your numbers don't really agree with Linux usage share still being at sub-1%.

Guess what: funnily enough, over 99 percent of netbooks never get connected to the internet. So there's no issue with them not showing up in web numbers at all.

This is the problem: there is no clean way to say how many Linux machines are out there, just like you cannot cleanly count the number of machines out there running pirated software.

I am getting sick of the fools around here.

I really doubt it has to do with the instruction set, since x86 CPUs use a microarchitecture design; the instruction set is really just an interface onto whatever Intel wants to put underneath it. The real reason Atom uses so much more power than ARM is because it's a faster CPU.

The power problem is that the x86 instruction set has to be translated to get performance. ARM and MIPS don't have that problem; they execute natively.

The fastest Atom is slower than the fastest ARM, and the fastest ARM chip still pulls half the power of the fastest Atom. So yep, it sucks completely. Pull up their power charts some time.

Microsoft is shedding staff. That is not a sign of a company doing well. Their own internal reports blame the netbook market. The simple reason: 20 dollars per machine is not enough to maintain their bloated system.

MS was forced to sell at 20 dollars per machine to stay in the netbook game. Having dropped their price that low, they now think they are going to be able to lift it back up.

Crushing a competitor by cutting prices really does not work against Linux. It gains market share in the short term; long term, it kills your bottom line.

The problem MS has is that they have given more companies more reasons to release computers with Linux on them, just to force MS's price down.

At a low enough price, MS is basically a dead man walking. It costs large amounts of money to maintain an OS.

Anonymous said...

"Guess what netbooks funny enough over 99 percent of them never get connected to the internet."

Then why are they called NETbooks?

Anonymous said...

YourStatsAreWrong(TM)

Kharkhalash said...

Windows Mobile does not scale to desktop usage. This fake split is dying. So far MS has not adjusted for it.

Windows Mobile was never intended nor designed to run as a desktop system; it's not an issue of "scaling up to desktop usage". It's an issue of being designed for a specific use (mobile devices) and, as such, on one hand sacrificing the flexibility to fill multiple roles, but in doing so gaining performance advantages in the role it focuses on.

It's the same predicament with Sparc/Solaris and Power/AIX being designed for ultra high end server space, where they have unrivaled performance, reliability, scalability and power, at the expense of being piss poor at everything else.

There's no "fake split" the distinctions between server/desktop/workstation/embedded were always there and always very real, very much in OS design, and more and more in CPU design. Compare a Sun Rock II or Niagara II to a standard embedded ARM chip, for example. The only thing they have in common is that they're both RISC-derived architectures. They're designed very differently, because they focus on to very different uses.

That Computerworld study is a little confusing when netbook makers are reporting between 30 and 60 percent Linux sales.

MSI reports something like a 75% return rate on Linux-powered netbooks. Furthermore, sales figures don't show how many such systems were wiped and replaced with Windows.

Someone has to be wrong here; if I had to guess, I'd say Computerworld.

Not necessarily. Someone may simply fail at reading between the lines or seeing the forest for the trees. I say it's you.

I don't know where this stupid idea that an OS has to be better to win comes from. If that were true, no one would have used 95; everyone would have used Mac OS or OS/2, and the computing world would be way different.

The game has changed. There was not as much entrenchment then as there is now, and the cost margin was much more drastic. Windows 3.x/x86 was competing head-on with corporate Unix, and Microsoft (with Xenix) was already the dominant corporate Unix vendor.

That was before Microsoft put a PC in every other home. Windows 3.x/9x was largely a reason for doing that, plus they scored that deal with IBM (note that their previous partnership with Intel during the Xenix days helped with getting that).

Now, by the time OS/2 came around and Mac OS was gaining traction, Windows was already entrenched and was on every PC in every other house. Dominating the market changed the game. It was no longer enough to be just good enough, or as good as Windows. When something is as entrenched as Windows was then, and still is now, you have to be vastly better, and vastly more convenient.

In the days of Windows '95, Windows had the edge on cost and convenience, and they also had the benefit of being synonymous with "personal computer" thanks to Win3.x/DOS resulting in the PC becoming a household item. AND they scored that OEM deal with IBM. Being first to the market gives you ridiculous leverage.

There was nothing to be better than when DOS/Windows first hit the market. There was no PC market; it was all corporate UNIX and mainframes. The Windows/Intel package made computing accessible to the masses; there was nothing to compete against, and there was nothing to be better than. They weren't competing with the mainframes. That's what people don't understand.

By the time OS/2 and '95 came along, Microsoft already had the PC market by the balls. '95 had the upper hand in pure application support, and Windows was already at the point where PC meant Windows. Apple shot themselves in the foot by going after better technology and switching to the PowerPC architecture. By this point, Windows had already reached near-monopoly status, and the new, wildly different architecture meant that a lot of applications just couldn't run on PPC machines. Furthermore, PPC cost more to manufacture due to not being mass produced on the scale at which IBM was pumping out pentiums; in fact, PPC failed everywhere for those reasons, except on the Macintosh.

Think of it: a 486 PC cost something in the area of $3,000. A Mac cost even more, and the same for the Amiga. Apple caught a lucky break with things like Illustrator, Corel Draw and Photoshop being originally developed for Mac OS.

The good enough factor. Is Linux good enough so to force MS out of making profit in the netbook market.

That's not a byproduct of Linux being good enough; it's a byproduct of Linux costing nothing. Don't confuse the two.

Answer yes. Same effect of dos vs OS/2.

There was no question to answer o_O. Furthermore, OS/2 didn't compete with DOS; it competed with Windows 3.1. And 3.x was just a stopgap made for the sake of having something on the market while they developed NT. By the time 9x came around, NT was already taking over the corporate and workstation markets. Microsoft played it smart here and had NT running on x86, Sparc, MIPS, PPC and the Alpha. They achieved market dominance, then killed support for everything except x86 (thus forcing Sun, DEC, SGI, Apple and IBM out of the mainstream workstation market). By the time NT rolled out into workstation (Win2k) and desktop (XP) modes, it was over for OS/2, which was still trying to compete with Windows 9x's inertia and market penetration.

By this point MS had both the market dominance, and better, newer tech.

History is repeating itself. Linux's year of the desktop will most likely come after MS is starved to death.

Your analogy is seriously flawed: OS/2 wasn't the entrenched, market dominating tech. And Microsoft didn't starve IBM to death.

I'll agree though, history will repeat itself, the entrenched, dominant tech will endure yet another challenge simply because "good enough" just isn't good enough anymore.

You're operating under the false pretense that the netbook alone is going to destroy the desktop and notebook markets. It isn't.

Furthermore, you're operating under the misguided delusion that there is no distinction between an embedded device and any other device. Slapping a processor designed for embedded devices into a laptop doesn't make it an embedded device. Tacking on an OS that is designed to be neither a desktop OS nor an embedded OS just adds to the magnitude of this clusterfuck. Porting an OS to run on a chip designed for use in embedded devices doesn't magically mean that the OS is now designed to run on embedded devices, just like stuffing an embedded chip into a laptop doesn't magically make ARM suited for a use it wasn't designed for.

You're competing on the pretense that architecture exclusivity is enough to gain a stranglehold in a market, when it isn't. Embedded x86 is hot on the heels of ARM, and keep in mind that history is on the Atom's side here: x86 is what forced RISC architectures out of the workstation and desktop markets.

The only markets RISC has ever beaten x86 in are markets in which there is no x86 designed to compete (gaming consoles: all three current-gen consoles run PPC or PPC-derived (i.e. the Cell) archs; and the ultra-high-end server space, where Power and Sparc dominate).

The bottom line, again, is that ARM is designed for embedded devices, and the netbook isn't an embedded device. This combination alone is a flaming clusterfuck waiting to fail.

Throw in the fact that Linux isn't expressly designed as an embedded OS (so why bother running it not only on a CPU designed for embedded devices, but on one that isn't even being used for what it was designed for?!).

Linux isn't expressly designed as a desktop/laptop (given that the desktop is essentially supposed to be a portable desktop) either.

It's an uber-mega-ultra-turbo fuckup of epic proportions just waiting to happen.

oiaohm said...

Netbooks is basically a name for a particular size of machine. When they were invented, they were thought to be for internet access.

Wireless support is mostly used to sync with a person's home machine. The reason: screens on them are normally too small to surf the net well.

Guess what: plans of mice and men. They are not being used for that. Most are being used as small note takers: larger than a phone, but small to carry compared to a full-blown notebook. Yet notebooks are mostly not used for note taking and are more likely to be on the internet than netbooks. Really, if you could swap the two names it would make more sense.

Kharkhalash said...

A few corrections, due to typing before having my coffee:

furthermore, PPC cost more to manufacture due to not being mass produced on the scale at which IBM was pumping out pentiums,

Should be "Intel was pumping out pentiums"

---

Also, OS/2 never actually competed with Windows 9x. It was a joint MS/IBM project, and IBM always marketed it as the eventual replacement for Windows 9x. Except MS releasing NT made OS/2 obsolete.

---

Linux isn't expressly designed as a desktop/laptop (given that the desktop is essentially supposed to be a portable desktop) either.

This should be "given that the LAPTOP is essentially supposed to be a portable desktop, and the Netbook, in turn, a more portable laptop (and not an embedded device)).

oiaohm said...

MSI reports something like a 75% return rate on Linux-powered netbooks. Furthermore, sales figures don't show how many such systems were wiped and replaced with Windows.

MSI's return rates were that high at first; compared to other netbook makers, they are an abnormality. There are odd cases from time to time where particular makers see 75 percent returns with Windows as well, normally due to some bit of hardware in the device not having the right drivers.

Guess what MSI's problem was: same old, same old. They released hardware not matched to the OS it was running, and did not clearly mark the models as different. Basically, you can stuff up building anything and have high return rates. It's not an OS-dependent problem.

Also, to those thinking the Windows machines have higher specs than the Linux machines: the numbers don't show how many were wiped and converted to Linux either.

server/desktop/workstation/embedded: that is a fake split.

Server and desktop versions of Windows are only marginally different. Linux workstations are really no different to what people call Linux servers or Linux desktops.

Embedded realtime support is needed in desktops and workstations so audio applications and the like run well.

Basically it's a complete fake split; there is no real reason why they cannot use the same core OS. A good example here is the iPhone. It's using the exact same core OS as its bigger desktop relation.

There is simply no need for multiple OS kernels.

Sorry, PPC workstations are still popular with Linux systems. What forced x86 into dominance is nothing more than Windows.

OS/2 didn't compete with DOS

That is my pure point. DOS became the dominant OS without ever once having to defeat OS/2. Simple: it was cheaper and would do most of the jobs the people at the time needed. So no one was buying OS/2, so it died a slow death. OS/2 was dying before Windows 3.11 was released.

Kharkhalash said...

Guess what plans of mice and men. They are not being used for that.

And that's the whole point, isn't it? Trying to pitch a system designed for one thing as a replacement for something completely different, while competing with devices designed for that use, won't work.


Most are being used as small note takers. Larger than a phone but small to carry compared to a full blown notebook.

I'll agree with that, aside from the reservation that my older G4 iBook is no hassle to carry in either a carrying case or my backpack, and allows me the screen size and power to do things other than take notes when I'm not at home (since the iBook/OS X package was designed to be a portable iMac, as opposed to the PowerBook, which was a portable PowerMac). A netbook, by design, won't afford me that extra functionality.

Yet notebooks are mostly not used for notetaking and are more likely to be on the internet than netbooks. Really if you could swap the two names it would make more logic.

Agreed. However, I feel the need to point out that this concession ultimately refutes the proposition that the netbook spells the end of Microsoft. It's a novelty device with limited uses. The whole idea of trying to make a hybrid, Frankenstein embedded/portable device, consisting of a chip designed for embedded devices and an OS which is designed for neither component of the hybrid, is doomed to fail.

If you want a hybrid embedded/portable device, you do it properly: You design a hybrid embedded/portable system and you market it as such. Not as an embedded device (it isn't one) not as a portable desktop (it isn't one either) but as a hybrid.

And you accept that you resign yourself to carving out a niche market, because it won't be able to compete with embedded devices in the embedded market (it isn't designed for it), nor will it compete in the portable desktop market (it isn't designed for it, either).

That's why it's tricky: you can't just tailor the ARM and try to make a bigger embedded device. Embedded devices have limitations that are incompatible with portable desktop use.

And you can't just take a desktop chip, stuff it in an embedded device and call it such. The limitations of a desktop design are incompatible with embedded use.

This is why the Atom will potentially make or break the netbook market: it seems to be developed expressly as a hybrid. If it succeeds, it will eat ARM alive in the hybrid/netbook market.

BestToolForTheJob(TM) and JackOfAlltradesMasterOfNone(TM) aren't just catchy politically correct slogans. They're realities of system design, and they absolutely must be taken into account when designing a system.

Kharkhalash said...

Also, thinking Windows machines have higher specs than the Linux machines: the numbers don't show how many were wiped and converted to Linux either.

Except we can use general usage figures to extrapolate that the number isn't that great. Linux only holds 0.8% of the market, and Windows holds something in the neighborhood of 88%. In this case the NetApplications market share figures are absolutely relevant, since, as was stated, 99% of netbooks are connected to the internet.

server/desktop/workstation/embedded: that is a fake split.

You haven't been paying attention. Look more closely at the way Solaris/SPARC and IBM/AIX are designed, then compare it to the way general purpose packages like x86/Linux are designed.

Server and desktop versions of Windows are only marginally different.

They share a kernel and a core system, yes, but there are a few caveats:

a) The server kernel and desktop kernel are optimized and set up differently, with different priorities.

b) The NT kernel is of a hybrid/microkernel design (meaning it's a monolithic kernel designed as though it were a microkernel; OS X uses the same design approach, btw), which offers it much more versatility in being built as a largely general purpose system. Jack of all trades, master of none applies here, with a subcaveat: the hybrid design makes it better suited for multiple purposes than the pure monolithic design of, say, Linux.

c) Windows is designed to be mostly a general purpose OS, with the added caveat that the hybrid kernel design offers much more versatility, given that it has most of the benefits, but none of the limitations of a true microkernel design.

Again, compare the design of Windows/x86 (general purpose), to Solaris/Sparc and AIX/Power (ultra high end server space), to QNX on ARM/PPC/MIPS (embedded, realtime).

They're designed and optimized very, very differently. As a result, they don't compete in the same markets.

You won't see QNX/ARM competing with Solaris/Sparc on the ultra high end, nor will you see the inverse in the embedded market.

You'll see standard x86 in the low-end, mid-range, and lower high-end server markets, in the desktop market, and on small devices, but you will not see it make any serious inroads into the embedded/realtime or ultra high end server spaces. It just isn't designed for them.

Linux workstations are really no different to what people call Linux servers or Linux desktops.

I'll concede that there is no distinction in the Linux universe. But that's limited to Linux and is an anomaly. You also won't see Linux offer any serious competition to QNX/ARM or Solaris-Sparc/AIX-Power in their respective markets. It just isn't designed for it.

A multipurpose design will never, EVER be able to compete with a system designed for a specific task.

Linux/x86 (and don't kid yourself here: it may run on multiple archs, but its native arch has always been x86, just like Solaris ran on x86 for a decade while its native platform is Sparc) may mop the floor with Solaris/Sparc in the mid-range server space, but that's to be expected; Solaris/Sparc isn't designed for it.

Now, compare Linux to a high performance, high reliability enterprise Solaris on a SunBlade, and laugh as Linux dies a quick death on loads so negligible by Solaris standards that Solaris won't even bother laughing at them.

Compare Linux on a single multicore x86 (or even on the same chip!) to a single-CPU Solaris Rock II (or Niagara II) UltraSPARC in terms of a high end virtualization server or database server, and see how embarrassing the result is for Linux.

General purpose can't compete in a specific market with something like Sun/Sparc or AIX/Power, where the hardware itself is not only designed and tuned to perform that specific task, but is designed specifically to run that OS, and in turn the OS is both designed and tuned expressly to perform such tasks and run on that specific hardware.

It's a no-brainer. The distinction exists, it always has, and it always will. Don't fool yourself into believing that Linux is anything other than a general purpose OS that isn't designed with general purpose concepts (otherwise it would be using a hybrid kernel), and spreads itself too thin by trying to do everything, and ends up doing nothing particularly well.

The Linux world may wish to pretend that this distinction is artificial, and can continue to fight it, but that doesn't make it true. The fact is that it even falls behind Windows as a general purpose, jack-of-all-trades OS, because although it tries to be one, it isn't even designed to be that!

Embedded realtime support is needed in desktops and workstations so audio applications and the like run well.

No. Embedded realtime support is required in... embedded, realtime devices. Desktops require pre-emption and schedulers tuned to give higher priority to things like audio and video, but not at the expense of everything else, to run well.

Workstations are not the same as desktops; they have different, more extreme requirements. A video/3d/graphics workstation (think the old high end SGI workstations) requires both the pre-emption and scheduler requirements of the desktop, but gives almost exclusive priority, and balls-to-the-wall optimization, to that given task, plus IO fine-tuned for video and 3d, plus RT filesystems fine-tuned for excessively large multimedia files (see XFS).

A Tezro for example was largely useless for anything but high end 3d/graphics, but nothing else came even remotely close to it for what it was designed for.

Similarly, an audio workstation (see the ProTools embedded devices, for example) requires low latency, realtime support, full priority to sound, and IO optimized for sound recording. Timing is EVERYTHING. Here, even an extra 0.1 ms of latency will ruin a recording.
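The latency point is easy to put rough numbers on: one audio buffer's worth of latency is simply frames divided by sample rate. A quick back-of-envelope sketch (the buffer sizes and the 44.1 kHz rate are illustrative choices, not figures from the comment):

```python
# Back-of-envelope latency arithmetic for the audio-workstation point above.
# Latency contributed by one audio buffer = frames_per_buffer / sample_rate.
def buffer_latency_ms(frames, sample_rate=44100):
    return 1000.0 * frames / sample_rate

# A desktop-typical 1024-frame buffer at 44.1 kHz:
print(round(buffer_latency_ms(1024), 1))   # prints 23.2
# A pro-audio 64-frame buffer:
print(round(buffer_latency_ms(64), 2))     # prints 1.45
```

Which is why audio workstations push buffer sizes down and rely on realtime scheduling to guarantee the tiny buffers actually get serviced in time.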

Still insist that there's no distinction?

Basically it's a complete fake split; there is no real reason why they cannot use the same core OS.

Yes, there is. Different uses have different priorities. I'm talking about specific systems designed for a given task, and you're citing general purpose systems on general purpose hardware that don't even compete in those specific markets.

A good example here is the iPhone. It's using the exact same core OS as its bigger desktop relation.

OS X is a horrible example to defend Linux with. OS X can do this:

a) because of the hybrid design.
b) because of the central, core frameworks. All you really need is to port Cocoa, and you're good to go.
c) because the iPhone OS is actually a stripped down OS X, but it doesn't matter, because all you need is Cocoa and its core frameworks.
d) Linux does NOT have these facilities (see one of my earlier posts in this thread)

There is simply no need for multiple OS kernels.

That's exactly what I've been trying to tell you! There's no reason to use a general purpose OS for a non-general-purpose purpose!

Sorry, PPC workstations are still popular with Linux systems.

Sorry, that's bullshit. Nothing in the Linux world is truly popular. PPC Linux is a niche within a 0.8% niche.

Had you said PPC is still popular amongst Apple systems, I'd agree (I'm typing this from my G4 iBook, and have 2 G3 iMacs, and a G4 iLamp in the other room). Honestly though, your argument is stupid and shortsighted.

Think of it, all of these PPC desktop systems are hardware that was purchased back before PPC was killed off, and are still in use. Nobody goes out and buys a brand new PPC desktop system, because they're out of production, they aren't made anymore. PPC on the desktop and workstation is dead.

This is absolutely obvious. It's not as if people threw out all the PPC Macs or BeBoxen when Apple announced they were abandoning PPC, and Be Inc went bankrupt.

Just like people who have them, still make use of their SGI Tezro and Octane workstations. You don't decommission a working investment for no reason. But that doesn't make these archs any less dead.

What forced x86 into dominance is nothing more than Windows.

Wrong. XENIX and x86 helped each other bring x86 to dominate the microprocessor market and XENIX to dominate the UNIX market. There were more XENIX systems in circulation than every other Unix combined (this includes the big guys like Sun, IBM, HP, DEC, etc) solely because only XENIX ran on x86.

x86 was already the dominant architecture before there was DOS or Windows. DOS is simply what took it from the enterprise into the household. Learn to accept history as it was, not how it must be to support your argument.

OS/2 didn't compete with DOS

That's exactly what I said, even before my corrections post. OS/2 came after DOS.

That is my pure point. DOS became the dominant OS without ever once having to defeat OS/2.

No, that was MY point. Specifically the point I used to refute yours.

DOS created a new market.

Simple: it was cheaper and would do most of the jobs the people at the time needed. So no one was buying OS/2, so it died a slow death. OS/2 was dying before Windows 3.11 was released.

OS/2 was released post-DOS. It was a joint MS/IBM effort. It was marketed as the next evolutionary step from Windows, and never competed against DOS or Windows. Do you not understand that MICROSOFT HELPED DEVELOP OS/2?

OS/2 was killed when MS released Windows 9x, and even then it was still pitched as "the next Windows". Literally, not figuratively. Microsoft releasing NT made OS/2 obsolete, redundant and pointless.

Keep in mind that NT is older than people give it credit for. It didn't start at version 4, and it didn't start with Win2k.

NT was released alongside Windows 9x. 9x had little to do with OS/2's demise; IBM saw pre-NT Windows as the precursor to OS/2.

Anonymous said...

This is the problem: there is no clean way to say how many Linux machines are out there. Just like you cannot cleanly count the number of pirated-software machines out there.

How far off can the numbers be? Even at an order of magnitude, which, statistically, is a colossal error, Linux still fares under 10%.
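The order-of-magnitude claim works out as simple arithmetic; a throwaway sketch (the 0.8% figure is the NetApplications number cited earlier in the thread):

```python
# The "order of magnitude" claim above, worked out: even if the 0.8%
# measured figure undercounted Linux by a factor of ten, the corrected
# share would still land under 10%.
measured_share = 0.008
for error_factor in (1, 2, 5, 10):
    corrected = measured_share * error_factor
    print(f"x{error_factor}: {corrected:.1%}")
# x10 gives 8.0%, still below the 10% mark.
```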

I am getting sick of the fools around here.

Then leave. Why you people continually torture yourselves trying to convince us of something we already know is wrong is beyond me. Go back to calling bug reporters and kernel developers morons on mailing lists.

Anonymous said...

oiaohm,

So you're basically arguing that netbooks, of which "99%" are not Internet connected, currently represent a credible threat to the desktop model? And that Microsoft is so stressed from this tiny market that they near bankruptcy? C'mon, dude, this is just absurd, even for you. Microsoft's shedding employees because everybody is shedding employees. Maybe if you came out of your basement once in a while you'd see the global economy isn't so hot right now.

Anonymous said...

I wouldn't say Windows was fully entrenched by the time OS/2 became available. No one really used the first two Windows versions, and nobody liked Windows 3 anyway. This was back when every computer user bitched about General Protection Faults and the Mac bomb, so we were perfectly willing to ditch what we had for anything better.

OS/2 failed because it was repeatedly late to the party with incremental upgrades, did not entice developers, and, due to lack of applications, didn't serve much of a role beyond running virtualized DOS and Win 3 sessions for stability purposes. And even that was largely wiped off the map by Windows NT.

Anonymous said...

Windows 95 taught us what a desktop consumer system needs to succeed, no matter how much it sucks technically:

- ease of use
- plenty of applications, stupid
- hardware support

14 years later, lintards have learnt nothing from Windows 95, failing miserably at each of those points.

But they still don't get it and they never will, and that's why I bet they're working on yet another package manager, a new file system, and a shiny compiz plugin which will SolveAllProblems(TM) once more.

Anonymous said...

@March 4, 2009 8:40 AM

You forgot: affordable

It's easy to forget that 3 GHz processors with 4 GB RAM were not always the case. The IBM PC platform offered barely enough to get work done and personal computers still cost three to five thousand dollars.

What many call "technological suck" I call "getting the job done on suboptimal hardware". The revisionists pretend NT or OS/2 were accessible to 1995 computing laymen, but at $100/MB RAM and needing 12 of them just to load the installer, it wasn't remotely feasible.

Anonymous said...

A lot of histrionics about OS/2. IBM at the time envisioned OS/2 as a supplement to their minicomputer line, while Microsoft was developing Windows NT as a replacement for minicomputers. And they did: by the late 90s, minicomputers were gone, replaced by client-server solutions.

Then we took a disastrous turn for 10 years back to thin clients (the web), and that resulted in the current morass of trying to make AJAX into a workable client-server solution... again. LOL.

Thankfully, it appears that application providers are finally abandoning AJAX as a client-server solution and proper internet-enabled client-server applications are making a huge comeback.

Google better get that g-drive thing working or they are gonna be left in the dust.

oiaohm said...

No. Embedded realtime support is required in... embedded, realtime devices. Desktops require pre-emption and schedulers tuned to give higher priority to things like audio and video, but not at the expense of everything else, to run well.

Problem here, Kharkhalash: you are trying to say specialisation is required. I have a very big technical problem with that.

There is a huge mother of an overlap that for years the same stupid logic kept apart. Linux servers plus the alterations required to support realtime saw 400 percent boosts in throughput, plus increases in stability.

The scheduler and locking across server, desktop, workstation, embedded and realtime all have to be designed basically the same. If you don't, you end up with crappy performance or instability.

Even on realtime, items have to be queued or some devices end up in a bad state.

Why do desktop, workstation and server need realtime? Simple: CPU-less devices that are dumb as dishwater and, if not answered in time, will damage themselves. There is no technical difference if you want everything to work.

None of them can use spinlocks if you want performance. Instead they must use queues. Solaris and QNX have been ahead in that section.
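The spinlock-versus-queue point can be sketched in miniature. This is a toy illustration in Python, nothing from any actual kernel: `SpinChannel` and `drain` are invented names for this sketch, and real kernels use far more sophisticated primitives. The point is only that a spinning waiter burns CPU on every pass through its loop, while a blocking queue (`queue.Queue` here) sleeps until work arrives:

```python
# Toy contrast between a spinlock-style busy wait and a blocking queue
# for handing work between threads.
import threading
import queue

class SpinChannel:
    """Single-slot channel whose producer and consumer both busy-wait."""
    def __init__(self):
        self._item = None
        self._ready = False
        self._lock = threading.Lock()

    def put(self, item):
        while True:                      # spin until the slot is free
            with self._lock:
                if not self._ready:
                    self._item, self._ready = item, True
                    return

    def get(self):
        while True:                      # spin until an item appears
            with self._lock:
                if self._ready:
                    self._ready = False
                    return self._item
            # every pass through this loop is wasted CPU time

def drain(channel, n):
    """Produce n items on a background thread, consume them here."""
    out = []
    t = threading.Thread(target=lambda: [channel.put(i) for i in range(n)])
    t.start()
    for _ in range(n):
        out.append(channel.get())
    t.join()
    return out

# queue.Queue has the same put/get shape, but get() blocks in the
# runtime instead of spinning: the waiter consumes no CPU while idle.
print(drain(SpinChannel(), 5))   # prints [0, 1, 2, 3, 4]
print(drain(queue.Queue(), 5))   # prints [0, 1, 2, 3, 4]
```

Both deliver the same items in the same order; the difference is that the queue-based waiter sleeps, so under load the producer (or anything else runnable) gets the cycles the spinner would have wasted.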

Please go back and read your history again. XENIX did not make x86 dominant. It did not produce enough production volume to give it a major price advantage. The desktop market, meaning DOS and Windows, gave it the production volume. All the big Unixes were doing quite well into 1990 with their own unique processor chips, long after XENIX was gone. Even when XENIX was around it did nothing to create x86 dominance. Basically XENIX did nothing bar become a dot in history. Linux, on the other hand, with x86-built clusters, really stuffed their CPU market. The Windows desktop made x86 volume-produced. Without Windows, Linux would never have got the supercomputer market.

Volume is key. x86 + Windows locked together created volume.

Anonymous said...

The .NET framework is M$'s attempt to corner the market, but it fails! C# is a Java clone that is jittery as hell. ASP is a PHP clone that sucks cock. Also the Windows kernel is still written in C/C++ just like Unix and Linux, but with a massive POS inserted in the code. What websites do you surf? EBay, Facebook, Amazon? All Linux/Unix clouds, because M$ can't support that much data even if Ballmer and Gates sacrificed a virgin to Satan every hour of every day. You want to talk development? Linux/Unix were designed, built, and implemented by academics for academics. Linux/Unix is still the primary development platform for code around the world because it has faster and better compilers like GCC. Face it, Windows compilers are like Ford Mustangs and GCC is like a Corvette. One works, the other just looks pretty. I've done coding for years and never had a problem on a *nix box. US Government regs require code to be written and compiled on *nix boxes because it works better (plus security issues with Windows). On to Bash/shell scripting. Not as powerful as Perl or Ruby, but light years ahead of .BAT. It's closer to an actual programming language and has the power. Windows is always telling you what you can and cannot do with it, even on the servers. Windows and M$ can go fuck themselves if they think they can tell me what to do on my fucking servers. LINUX FTW!

Anonymous said...

IDon'tKnowWhatTheFuckI'mTalkingAbout(TM)

Anonymous said...

Seriously, anyone who thinks Windows scripting ends at DOS batch files is an idiot.
