Linux - the easy and functional way.
November 02, 2010

So here's a small screen capture of Linux with Compiz Fusion running on my Eee.

http://www.youtube.com/...

It's a basic Linux Mint 9 distro with a gnome desktop environment.

The menu bar with the icons, similar to what you get in the Ubuntu Netbook Remix, was done by setting the "maximus" window manager to start at boot, and by adding the "window-picker-applet" to the gnome panel.
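As a sketch, that whole setup can be done from a terminal on a Debian/Ubuntu-based system such as Mint 9. The package names below are as they appeared in the Ubuntu 10.04-era repositories, and the autostart file is just one way to launch maximus with the session - the Startup Applications dialog does the same thing graphically:

```shell
# Install maximus (auto-maximizes and undecorates windows) and the
# window-picker applet (package names from the Ubuntu 10.04 / Mint 9 era)
sudo apt-get install maximus window-picker-applet

# Start maximus with the session by dropping a .desktop file in autostart
# (equivalent to adding it via System > Preferences > Startup Applications)
mkdir -p ~/.config/autostart
cat > ~/.config/autostart/maximus.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=Maximus
Exec=maximus
EOF

# The window-picker-applet is then added by right-clicking the
# gnome panel and choosing "Add to Panel".
```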

Presto - the result is an efficient and good-looking desktop.

And none of this requires you to do anything magical, and all of it is available through the package manager or a graphical front-end.

(In the example, I'm running OpenOffice, opening an Office 2007 document and then exporting it to a pdf. Opera runs back there. Spotify (the Windows version) is running in the background through wine. And I'm using smplayer to play Gintama. The top left corner brings up the wall. The right corner picks up all the active and open windows - minimized windows only exist on the bar with the icons. But you can change that to whatever you want. Note that the capture is at 15 fps, and running the encode (along with everything else) made the 3d effects take a hit.

I just managed to get Silent Storm running in 3d at about 4 frames per second.. that's what the icon is doing on the desktop :/ This is a problem when gaming on Linux - graphics drivers for embedded graphics cards won't work well with direct addressing, or with the typical solutions that are reasonably quick in a Windows environment. In the same way, I can make h.264 codecs work fairly well in Linux, but not nearly as well as they would "hardware-accelerated" under Windows. This does depend on the particular graphics card, though, and on how far along the graphics driver development is.)

Anyway, see the bottom of the post for directions on how to install a live-image if you would want to do that.


------


So a while ago, I bought an Eee PC with an ssd disk. I bought it to write on. A laptop in the backpack sounds light, but even a MacBook weighs a ton if you're lugging it around all day. Besides, the battery life relative to performance isn't much better until you buy something in the €1000 class. Nvidia's and AMD's integrated bus solutions are not turning up any time soon either, thanks to how the industry works - so this was, and unfortunately still is, the best solution for a lightweight "netbook".

Anyway. So I paid my Windows tax, and booted the thing up. Nice. Runs some games, and a psx-emulator.

While the Windows install worked, it ate up my hard drive. I don't know why. It tended to hang. It wanted updates, which I installed - and which broke programs I used. Eventually it started to grate on my nerves just to think about booting it up. Not a success.

Alternatives? Buy Leopard? Maybe - MacBooks use the same integrated chipset as the later Eee's, so why not..? ..but then on second thought...

So. Linux. I had a Gentoo build on my other computer. But that's not really very useful for a laptop, since you compile from source, and need more processing power (and battery) if you want to install programs. I could compile on my other computer and set up my own repository, but... no, that's too much work.

So I started looking around for a Linux build. The first attempt was elive. It's the e17 window manager running with Compiz Fusion (3d effects, and so on), and it's based on Debian.

(Explanation at this point: Debian maintains the package repositories that several large distributions, such as Ubuntu and Mint, build on. They're well maintained, and active. The distribution, in this case "elive", then consists of a configuration of packages from Debian, along with a desktop environment - e17 - running beside a composite window manager called Compiz Fusion. All of these parts exist in the Debian package pool. In Windows terms: the menu navigation and the maximize/minimize window scheme would be the window manager; the desktop with its buttons and integrated file presentation would be the desktop environment; and the operating system would be the functions underneath that you don't see. Windows does not have a package repository - instead you install separate programs that may or may not adhere to a platform standard.)

Elive was an interesting experiment, and I learned a bit from it - such as how dumb it is to create a new branch with distro-specific implementations just to get the build to work. That breaks against updated packages, and guarantees that the distro becomes useless if you update it later.

The second attempt was Fedora 10. This was supposed to have the Moblin core included (an Intel-developed concept that demonstrated a 10 second boot on an sse2 compatible system - a very interesting use of parallel processing schemes, and... right). It actually did work - but Fedora never maintained their builds for stability - and certainly not with netbooks in mind.

There was an extra branch of kernels for netbooks of various kinds that did support the wifi driver and graphics card. But the package repository would be invalidated very quickly, and it was often impossible to build new packages for the old kernels because of requirements in the newer builds - so this just became too much work.

I wondered about making something out of the Gentoo build I had - preconfigure the most useful programs, and then compile new programs overnight. That, too, would be overkill, though. No need to compile programs for a kernel that isn't actually patched specifically for my netbook anyway.

Ok. So what about Ubuntu? I've never liked the Ubuntu folks. They advertise their build very effectively, I guess. And they enforce "open source" at the cost of functionality for some reason, until it's just open source for the sake of open source.

Open source never was about that - it was about maintaining a good platform with open standards. Forcing people to watch and listen to ogg (and preventing them from watching divx) won't accomplish that.

It's also a horrible mess of bloat upon bloat. Like someone said, wistfully: if Ubuntu were called Windows, it would be exactly as popular.

But let's not be close-minded - try it out. And it turns out that Ubuntu had made some strides since last time. In fact, if you download a "live image" (just a build you can run without installing it to disk) and open it in Windows, you'll get instructions on how to take a look at their new netbook-specific frontend, and things of that sort. Many useful things - some not so useful. But not bad.

I spent about a week trying to pare down the install, until I broke some resource the boot manager depended on. Sad, sad. Still, it does work, and it contains what you need after an install.

Last attempt was Linux Mint. http://www.linuxmint.com/

Turns out that someone else had had the exact same impressions before me, and had made a pared-down build of the Ubuntu package base.

So that's that. A very light and specific build, suited to a laptop, with a desktop environment of your choice (kde, gnome, fluxbox) and enough pre-installed packages that you don't need to download masses of them to get started.

It is served, in other words.

-----

Recipe for installing a linux live-image:

You need:
1. USB-stick, 1-2GB (will be formatted),
(or a cd/dvd + burner)
2. Computer.
3. 30 minutes.

The process is reasonably simple. Go to, say:
http://www.linuxmint.com/

Download the live-image of your choice.

If you're in Windows or Linux, you can for example download a program called "Unetbootin" (search for it on the net). This program places a live image on a USB stick and makes it bootable. Useful for netbooks. The process is: pick the image file locally, choose the USB stick, done.

Or simply burn the live-image to a cd with your cd-authoring tool of choice.
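On a Linux machine, another option is to write the image straight to the stick with dd. This is a sketch: /dev/sdX is a placeholder for your USB stick's device node, the iso filename is just an example, and raw writing only boots if the image is a hybrid iso (Unetbootin works either way):

```shell
# Find out which device node is the USB stick first -
# dd will overwrite the target without asking
sudo fdisk -l

# Write the live image to the stick (wipes everything on it);
# /dev/sdX and the iso name are placeholders
sudo dd if=linuxmint.iso of=/dev/sdX bs=4M

# Make sure all blocks are flushed before unplugging
sync
```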

Once you're done, boot the computer and hit "esc" during the first BIOS prompt. Most computers will then let you choose which of the available media you wish to boot from. If not, you need to open the BIOS configuration (F1 at boot, or something similar), and pick the cd-rom or the usb-port as the first boot device. (But don't forget to turn it back later.)

You can then boot up the build, have a look around, and see what it's like. If nothing scared you too badly at this point, installing it on your hard-drive shouldn't give you any trouble either.

Most recent blog posts from Jostein Johnsen...

Feedback
zippdementia - November 02, 2010 (11:01 AM)
Linux is great. But I don't have the time right now to figure it out. Maybe next summer.
fleinn - November 02, 2010 (11:34 AM)
:) ..honestly.. it takes less time to get the system up to speed than a win-boot. That desktop isn't very far away from what it looks like when you boot it up. I had to download Opera from opera.com, and do the maximus and window-picker thing. Install smplayer/mplayer from the repository. Set the wall and the scale plugin in compiz to trigger on corner events (through the graphical interface). Other than that, this is a standard setup.

Took me about 30 minutes to download the iso-image, run the install, and boot, connect to the internet.. Firefox and Thunderbird already installed, that kind of thing.

Don't think gnome or kde will be much trouble, either. It's mostly the same you have in Windows or Mac.

There actually aren't many crazy things that don't work now, either. It used to be that way, but not any more - whether it's power saving, routing sound through bluetooth, making some mic work on skype, etc. Gnome has an indicator bar for running programs, wine does well on most desktop applications.. So basically: get the program. Install. It works.

I had a hang in February. :)
Leroux - November 02, 2010 (03:52 PM)
Unless you're taking advantage of the open source code, there's not much reason to prefer it over OSX or Solaris besides cost.
fleinn - November 02, 2010 (10:51 PM)
..choice, I guess.

And.. you have the problem with the branching, like I had with elive. Some software you have running in OSX could become incompatible after an upgrade. But other programs that depend on the update force you to upgrade. So programs would simply be left behind. ..and there's no compiler with OSX as far as I know, so.. No configuration files you can mess with either..

Most Mac-users won't worry about that, though. They use one music-player, they have iTunes, they have an iPod, they use the calendar, the mail-app and Safari - they use Adobe software, etc. Don't think about formats, alternative programs.. And that's really all there is to it.

But if you would like something else, you're not going to get it. For example - if I wanted the window-manager to behave differently in Mac, I can't expect programs to change appearance and behaviour even if I manage to install a new layer. Because the applications are keyed to Quartz, and that's that. You can't change your desktop environment and still use the same foundation, for example.

That's the same in Windows - you strip away some user interface, and suddenly you also lose program functionality. It's not simply hidden from you; the programs actually lose the function. It's not that it's a pain to configure, or something like that - it's just not there. You install an x-server (extremely unofficially) on Mac, and then you run that separately from the native NeXT-derived layer, hopefully without issues. ..You don't have that problem when you build on a purposely designed standard that allows all the parts to be switched out without much breakage.

So - I like using fluxbox, because it's extremely lightweight and doesn't do anything you don't ask it to do. But on the other hand, gnome is very practical. So I can install both, and choose the environment I want at bootup. Meanwhile, I can still install programs when using fluxbox, and get those programs registered with the right filetypes and events - and have them carry over to the other desktop environment. Because the environment is still just a graphical layer on top of the OS, instead of having some functionality keyed to a particular layout or configuration.
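That shared registration goes through the freedesktop.org standards, which any compliant desktop environment reads. As a small sketch (the application and mime type here are just examples), the xdg-utils commands store file associations per user rather than per desktop environment:

```shell
# Ask which application is registered for a filetype -
# the answer is the same whether you ask from gnome or fluxbox
xdg-mime query default video/x-msvideo

# Register smplayer as the default handler for that type;
# the setting lives in the user's mimeapps list, not in the DE
xdg-mime default smplayer.desktop video/x-msvideo

# Open a file with whatever handler is registered, from any environment
xdg-open episode01.avi
```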

What does that mean for the user, though..? It means that you're not going to have to wait until the vendor finds the time to change something to your liking. And when something neat turns up, it will be available to you, and possible to integrate into your environment of choice. This also goes for both open and closed source programs, as long as they adhere to the interfaces well enough. Opera, for example, is closed source. The program is protected. But the graphical interface is based on Qt, and Qt integrates reasonably well with both Windows and typical Linux desktop environments. So it can float into the environment, and let composite window managers take care of tooltips, menu bars, and so on, while the elements in web pages can follow a style familiar to the rest of the desktop. And you get programs of different levels of openness that way - that still follow the rest of the standards.

It also means that you just use the UI as a tool to perform certain tasks. Instead of "learning" "the OS".

And.. that's what we really want - good programs that really are multiplatform, and run equally well on all platforms there's a compiler for.. right?

There's also the.. stability thing. I thought, until I got a Mac myself, that a Mac would run without issues for about a decade at a time. That was wrong. The spinning wheel wasn't really that rare, even when running only Mac programs. That annoyed me a bit. Then there's the times when you insert an external device and it somehow fails to register properly, say with iTunes - there's nothing you can do to check what's wrong. You can't check the modules, or find out what the program is depending on. It's just hidden from you.. and if it doesn't work, you're kind of stuck. My Zen player is also invisible to a Mac - it's certainly possible to make it discover the device. But if you want to transfer something to it, then you will be programming something yourself, and using the low-level file-access libraries to do it. There's no "let's just use the usb-device file transfer, and borrow something from the desktop environment" here.

Another nasty example is the iTunes storage format. You buy the songs. And.. well.. it's put, drm-free, in numbered catalogues somewhere down in the player's memory. It's not a problem if you only use iTunes. But if you occasionally play something else, or transfer files back and forth, it just stops you in your tracks. And that's it - there's nothing you can do. No hack, no nothing is going to fix this. It's just programmed this way, and if you want something else, you have to make it from the bottom up (or hope Apple makes an interface for it in their sdk).

So while Mac does have its charm, it's not without drawbacks. ..And I mean, I'm not much of a fanboy in the first place, and made in China is made in China. Doesn't matter, it's just business. But.. I admit I have a problem with buying Apple products for.. you know.. ten times the price.. when I know I'm actually not paying for quality, but for licensing (take the "dsp" disaster that went on a while back.. Or the entire high quality soldered pre-amplification on the iPod.. pick it open and look for yourself..). There are a good couple of examples of how that works in MacWorld. So when you know what the actual components are, and how much it really costs to make the thing - then it starts becoming difficult to buy into it.

Or: the OS has to be extremely slick for me to buy it. ..Of course - it is very slick. And it is preconfigured. So I can easily understand why people choose it. But it's not "basically a unix system, like Linux", so that there's no difference.. you know..

Sorry, long-winded again..
Leroux - November 03, 2010 (05:50 AM)
I'm not going to get into a pissing contest with someone that thinks there's no compilers for the OS X platform. Long live Linux!
fleinn - November 03, 2010 (06:24 AM)
I meant, you can't recompile source for old kernels or versions of libraries inside, for OSX. For programs supposed to run inside the environment. You're stuck with making sure those programs don't rely on core libraries.
Leroux - November 03, 2010 (06:47 AM)
Now that it's clearer, that's definitely right. But that goes along with the point I conceded: Linux is great if you're taking advantage of its open source. I still don't foresee most users doing that, which is why I think the differences aren't significant - or just swap one issue for another - since they're all built off of Unix.
fleinn - November 03, 2010 (08:24 AM)
Well.. no. That was what the rant was about :D

It's the difference between having transparent layers up to the actual program code, and between specialising code on the low level. Both can make sense, it's not that.

Since a lot of the programs on Mac were fairly lonesome in 64 bit land for a long time as well, what they did was kind of efficient.

And, you're right up to a point - the users don't see much difference, because people don't read source code and make their own programs. But when you do program something, the applications have to be built more consistently in order to work.

That's something the users benefit from. With consistent gui, as well as reliable, maintained and stable function calls.. ..And this is how it works to have so many different people making programs independent of each other.

© 1998-2024 HonestGamers