
View Full Version : Linux Corner



bnt
18th Nov 2009, 22:30
Sorry - that was from memory. The "file name to write" step is there in case you want to write to another file name. In this case you don't, so you just hit Enter to save to the same file name. Then it's Ctrl-X to exit to the terminal before you enter any more commands.

rgbrock1
19th Nov 2009, 19:09
In addition to using the 'man' command at the terminal for help with commands, syntax, qualifiers etc., I offer this other, hopefully helpful, qualifier to 'man'.

#man -k

The -k qualifier searches all the short descriptions and manual page names for a keyword, then prints out any matches.

Example. Say you're looking for help on something to do with rpm but you're not sure exactly which 'man' page would be applicable.

You would enter #man -k rpm

Which would then return all the 'man' pages having to do with the rpm package manager.

I find the -k qualifier handy when looking for a man page that I'm sure exists but whose exact name I'm not sure of.

Hope this is helpful. To someone anyway!!

four_two
20th Nov 2009, 20:19
Sorry - that was from memory. The "file name to write" step is there in case you want to write to another file name. In this case you don't, so you just hit Enter to save to the same file name. Then it's Ctrl-X to exit to the terminal before you enter any more commands.

Thanks for your help, everything is back to normal now. So nice to see updates happening once again.

bnt
23rd Nov 2009, 13:23
So, do we think this thread is worth a Sticky? :8

mixture
23rd Nov 2009, 18:01
So, do we think this thread is worth a Sticky?

I do believe the mods made it clear that like the ex-Mac sticky, the thread should live and die based upon its own popularity.

I agree with that policy. Therefore no sticky would be my vote.

x213a
28th Nov 2009, 02:45
A while ago I posted a thread about being unable to install Windows 7 on a laptop HDD that had a corrupted installation of Ubuntu on it. Nothing would let me wipe it or delete partitions. Everybody agreed it was buggered etc.
I have solved it. Here's how I did it:

I ran TestDisk off the Ultimate Boot CD and analysed the disk. It showed various errors. One of the options presented to me was "recover superblocks". I didn't, and still don't, really know what superblocks are, but as I had nothing to lose I clicked it and away it went...
I was then able to scan the disk again and recover bad sectors. After that I was able to format the disk and install operating systems on it again.

Hope that helps somebody:ok:

Incidentally, just discovered a new bug in Ubuntu 9.10... it would not let me add application launchers to the desktop. The problem went away after a restart, and I had about 20 OpenOffice icons on my desktop from my previous attempts. Anybody else encountered this?

mad_jock
28th Nov 2009, 09:24
Sounds like your desktop process hung - not a problem with the OS as such, more the desktop. Which one was it?

As for the superblocks, did you try beating the disk to death with fdisk and its advanced options?

bnt
28th Nov 2009, 10:55
Those errors from TestDisk about superblocks - they only refer to the partition already on the disk, which is probably ext3 since Linux was on it. However, since the plan was to install Windows on that disk, there's zero point in worrying about that ext3 partition, since Windows can't use it and will expect you to delete it during installation. It's toast.

The only sensible option is to use fdisk on it, like mad_jock says, or a low-level format option. The Windows 7 installation is going to do this anyway, if it can, but I remember when I tried Vista a few years ago it couldn't recognise what was on a disk before and refused to wipe it - probably as a safety feature. I had to boot from a Linux "live" CD and wipe the disk. I call that the Aliens strategy: take off and nuke the thing from orbit. It's the only way to be sure. :cool:

x213a
28th Nov 2009, 15:54
Fdisk did not work and the live CD didn't either. It loaded but "hung" when re-partitioning. It would not let me delete partitions, change filesystem type or format either. I/O errors were the reason.

mad_jock
10th Dec 2009, 16:34
Right, here is one for you all. A situation that someone wants me to define a solution for. And mods, don't worry, it's for an old-age pensioner to get an emergency call system working. They only have access to an anal-retentive ISP which has the IP address down as a VoIP site, and a 3G dongle isn't cutting the mustard. It's based in the UK and to my knowledge completely legal to do.

Situation

A certain box needs to get access to an IP address which is blocked by the ISP.

What I propose is connect said box via a cross over cable to a linux box via the eth0.

Then route that over to the wireless wlan0 which has a vpn connection on it thus bypassing the ISP blocked sites.

The DHCP on the eth0 is easy to take care of, with the gateway defined as the IP address bound to the eth0 interface.

Now, does anyone know enough about OpenVPN on Linux to say whether you can force it to only use the wlan0, or will it just try both interfaces and use the one it can reach the server on?

After that I presume you would only need to start the routing using ip_forward and add the route in between the 2 networks.

Any comments on my cunning plan?

I know you could just buy a wireless VPN router but this way we should just be able to use an old laptop and a 3 quid cross over cable.
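For what it's worth, the forwarding side of the plan can be sketched as below. This is a dry-run sketch under assumptions: tun0 as the OpenVPN tunnel device is a guess, and the iptables lines are the standard NAT recipe rather than anything tested on this setup. On the interface question, OpenVPN's --local option can bind the client to wlan0's address so it shouldn't try eth0, but check the man page for the version in use.

```shell
#!/bin/sh
# Dry-run sketch of the routing glue: eth0 = wired box, wlan0 = wireless,
# tun0 = assumed OpenVPN tunnel device. Every command is echoed rather than
# executed; swap the apply function for one that runs them for real as root.
apply() { echo "$@"; }

apply sysctl -w net.ipv4.ip_forward=1
apply iptables -t nat -A POSTROUTING -o tun0 -j MASQUERADE
apply iptables -A FORWARD -i eth0 -o tun0 -j ACCEPT
apply iptables -A FORWARD -i tun0 -o eth0 -m state --state ESTABLISHED,RELATED -j ACCEPT
```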

bnt
10th Dec 2009, 18:06
One thing missing from your description is where the VPN goes to on the Internet. When you say the other box "has a VPN", can I take it that means it does so in conjunction with another system, out there on the Internet, beyond the ISP? If so, and you know how to set up the routing etc. (installing some router package), you ought to be able to make it work - hard to tell from here.

However, I can think of other ways of doing it on the first system:
- install a VPN client on that system
- if the emergency alarm system uses HTTP (WWW) and supports a Proxy connection, then you could tell it to use one of the free proxies (http://www.xroxy.com/proxylist.htm) out there. The outgoing IP address, visible to the ISP, would be that of the proxy.
- Or, also if HTTP-based, install a Tor client (https://www.torproject.org/). This is also a proxy solution, but the proxy runs on your computer and can route traffic across multiple external proxies (so it's less likely to fail).

I'd be concerned about the amount of maintenance & monitoring any complicated solution might require. The more complex, the greater the chance it all goes wrong at precisely the wrong time - especially if you have an ISP that thinks it's acceptable to decide what you can and can't do on the Internet. :mad:

mad_jock
10th Dec 2009, 19:25
The other end of the VPN is sorted - 60 quid a year, and there are about 20 worldwide servers. The openvpn client is pretty robust in my experience, and if one route fails it will quite happily go looking for another server.

The box is just that, with an Ethernet port, a webserver to do the setup on, a speaker, a microphone and I presume a DEC interface to the triggers. I have a sneaky feeling it does actually use VoIP when it's triggered, and it's pretty easy for the ISP to sniff that and block it. So the proxy option is out.

Once I have the guts of it sorted out it's not actually very complicated. I will have basically created a VPN router, plus a couple of cron jobs to do the housekeeping and restart the VPN connection every 24 hours so we don't get any security certificate errors. Barring hardware failure it should be pretty robust; it's not as if the coffin dodger is going to be using the laptop.

You don't need to install any routing software on Unix boxes to route; it's all under /proc/sys/net/ipv4/

7AC
20th Feb 2010, 15:07
I have an Acer Aspire running Linux.
On my last visit to Youtube it announced that I must upgrade Firefox to 3.6.
Is there anyone out there game enough to show me how to do this?
I've downloaded the update but I just can't get any farther.

call100
20th Feb 2010, 15:53
I know nothing about Linux.... These guys might. They seem to be discussing the very issue: What is the "Correct" way to upgrade to Firefox 3.6? - Linux Forums (http://www.linuxforums.org/forum/ubuntu-help/159187-what-correct-way-upgrade-firefox-3-6-a.html)

hellsbrink
20th Feb 2010, 20:23
Which version of Linux?

Guest 112233
21st Feb 2010, 19:51
I initially had some probs, again on an Acer One running Ubuntu 9.10. The screen kept darkening, as if the application was "busy" and not under control of the O/S. There then followed an update from Canonical, about three days ago, and all's well since. Use your Linux system update options to see if there's an operating system update waiting in the wings. I'm sorry that I do not know about anything other than Ubuntu.

CAT III

7AC
22nd Feb 2010, 08:21
This is starting to intrigue.
The computer is running Linpus Linux Lite version 1.0.3.E
I hope this makes sense.

le Pingouin
22nd Feb 2010, 10:32
Sorry, don't know Linpus but the generic instructions are:
Download Firefox from the Firefox download page (http://www.getfirefox.com/) to your home directory.
Open a Terminal and go to your home directory: cd ~
Extract the contents of the downloaded file: tar xjf firefox-*.tar.bz2
Close Firefox if it's open.
To start Firefox, run the firefox script in the firefox folder: ~/firefox/firefox
Firefox should now start. You can then create an icon on your desktop to run this command. Copied from:
Installing Firefox on Linux (http://support.mozilla.com/en-US/kb/installing+Firefox+on+Linux)

A "global" installation (available to all users) is possible by installing as root user (a.k.a. super user) - the traditional location would be in /usr/local.

The disadvantage in a manual installation like this is you have to keep it updated manually - Linpus won't do it for you.
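If you want to see the unpack-and-link pattern without touching anything real, here is a self-contained sketch: the archive is a dummy created on the spot (gzip rather than the real bzip2 download, purely so nothing is fetched) and every path stays inside a throwaway directory.

```shell
#!/bin/sh
# Demonstrates the manual-install pattern: unpack a tarball, then point a
# stable symlink at the binary inside it. Dummy archive, temp dir only.
set -e
work=$(mktemp -d)
cd "$work"

# Stand-in for the downloaded firefox tarball.
mkdir -p firefox
printf '#!/bin/sh\necho firefox 3.6\n' > firefox/firefox
chmod +x firefox/firefox
tar czf firefox-3.6.tar.gz firefox
rm -r firefox

# The actual install steps: extract, then link a fixed path to the binary.
tar xzf firefox-3.6.tar.gz
mkdir -p bin
ln -fs "$work/firefox/firefox" bin/firefox

bin/firefox    # prints: firefox 3.6
```

A real install would use /opt and /usr/bin/firefox in place of the temp paths, as in the posts above.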

mad_jock
22nd Feb 2010, 11:58
First of all you need to open up the original Aspire Linux OS.

To do this go to Files > My Documents to open the File Manager. Then go to File > Terminal.

type

xfce-setting-show

Click on Desktop to get to the Desktop Preferences and choose the Behavior tab.

Now mark under Menus the Show desktop menu on right click option.

Then type

sudo su -
passwd (enter what you want as the admin password)
yum upgrade

Then leave it alone; it might take ages.

Once it comes back it will come up with a list of upgrade packages and ask you if you want to upgrade. Say yes and leave it alone. It will eventually come back to a #, then kill the window with the cross.

If that doesn't sort your problem come back and I will tell you how to add in software repositories.

Now your desktop will be slightly different when you are finished: if you right-click anywhere you will get a drop-down Windows-style menu with all your programs etc, which I found better than the simple interface.

I went for about a month using the Aspire cut-down Linux, then installed the full Fedora distribution. If you want Skype etc we can get that sorted for you as well.

7AC
22nd Feb 2010, 12:21
Thanks folks, I'll give it a go tonight.

mad_jock
22nd Feb 2010, 13:21
O aye, while we're about it we might as well get you set up for a black screen of death.

Occasionally the BIOS goes tits up on Aspire Ones.

Aspire One Black Screen of Death | Eric.Chromick.com (http://eric.chromick.com/aa1/aspire-one-black-screen-of-death/)

That tells you how to deal with it.

Mac the Knife
22nd Feb 2010, 17:06
Why do you HAVE to upgrade?

On my main Linux install, Mepis (Debian-based), Firefox is 3.5.6 (works fine) and I'll just wait until the FF upgrade hits the Mepis repositories, when it'll upgrade itself automagically.

But, as the others hint above, you CAN upgrade quite easily if you don't mind jumping thru a couple of hoops.

Mac

mad_jock
22nd Feb 2010, 17:12
Linpus Lite is slightly different to other Linux distros.

Unless you go in and use yum, the updates are only triggered by the Aspire site.

I can't remember if I had to point it at the Fedora main repositories before I could update or load GIMP and other such things.

7AC
22nd Feb 2010, 19:33
Mad Jock,
Where do I type sudo su -

Twitcher
22nd Feb 2010, 21:33
Did mine exactly as follows, worked first time... and I'm a total Linux newbie:


Press Alt+F2 to show the Run program window. Check Run in terminal and click Run, which opens a terminal. If you're not familiar with the Linux command line just follow the instructions step by step. The easiest way is to simply paste the commands into the terminal with Ctrl+Shift+V. Most of them will only give feedback if an error occurs.

The first command uses wget to download Firefox 3.6 from an official mirror. You can edit the lang variable at the end, in this case en-US, if you'd like another language. All available languages are listed here, just hover over the download link to get the language code from the status bar.

wget -N "http://download.mozilla.org/?product=firefox-3.6&os=linux&lang=en-US"

The next step extracts the just downloaded file and modifies a link to point to it. In a few cases a connection refused error message may be triggered by sudo, which is not an error but a bug in sudo and can be safely ignored.

sudo tar -jxf firefox-3.6.tar.bz2 --directory /opt
sudo chown user -R /opt/firefox
sudo ln -fs /opt/firefox/firefox /usr/bin/firefox

The next step links all plug-ins (not to be confused with extensions) to it.

sudo ln -s /usr/lib/mozilla/plugins/* /opt/firefox/plugins

Launch the profile manager using the command below. Create a new profile, name it anything you like and select it. If you want to keep your bookmarks export them via the bookmark manager first. You can then delete the old default profile.

firefox -profilemanager -no-remote

As a bonus you can also change the icon to the official Firefox icon.

sudo sed '/Icon/ s/acs_//' -i /usr/share/applications/linpus-web.desktop

Finally reboot the AA1 to make the desktop aware of the new icon and browser location.

mad_jock
23rd Feb 2010, 01:39
Files > My Documents to open the File Manager. Then go to File > Terminal

In the window that comes up. It's like the DOS box that you get with Windows when you run cmd.

Or, if it's still running, the one where you typed xfce-setting-show after you unlocked the Xfce menu system.

Edited to add: if you use yum it will be automatic, and for ever more you will be able to use yum upgrade and it will sort all your software out.

Personally I would have a think about installing the full Fedora OS on it.

It's a good wee box just now, but after you install the full OS it turns into a helluva powerful wee PC. Personally I have added an LG burner and USB speakers, and now it has about 20-odd DVDs ripped on it and a heap of music. OpenOffice deals with work stuff.

Even if you're not Linux savvy, after the install the OS looks after all the updates.

7AC
23rd Feb 2010, 10:41
Sorry Mad Jock,
but how do I install Fedora and Yum?
Bet you are sorry you said anything.

mad_jock
23rd Feb 2010, 11:43
You already have a cut down version of fedora install on it.

yum is also installed on it

After you manage to find the terminal you will be cooking on gas.

So in the front screen go to Files, then click on My Documents.

Once it's there, there will be a drop-down menu with an option for Terminal.

After you click on that a black-background window will pop up.

In that window type

sudo su -
passwd (enter what you want as the admin password)
yum upgrade

I think it's best we just leave you with the original desktop.

7AC
23rd Feb 2010, 12:15
Sorry 'jock,
I've tried doing as you say but all I get is Connection not found.

Do I type everything in one line and how do I punctuate it?

green granite
23rd Feb 2010, 12:18
And people wonder why Windows is so popular. :E:E

mad_jock
23rd Feb 2010, 13:03
Make sure you're attached to the internet.

Have you used Live Update yet?

7AC
23rd Feb 2010, 13:54
Jock, yes, but there never seems to be anything worth updating.
How do I access and run yum?

Twitcher, I've tried your instructions also to no avail, nothing happens.
Are there any books out here that would teach me the basics of this Linux thing?

mad_jock
23rd Feb 2010, 14:18
Run Live Update and install everything it suggests.

I think you're not connected to the internet properly.

Are you getting to the stage where you have a command line ending in #?

7AC
23rd Feb 2010, 14:44
It has just updated a long list of things and in Terminal there is a $ in front of the cursor.
It still has firefox version 2.

mad_jock
23rd Feb 2010, 14:56
type

sudo su

and it should go to #

mad_jock
23rd Feb 2010, 15:02
then copy and paste the whole of below and hit return.

rpm -Uvh http://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-stable.noarch.rpm http://download1.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-stable.noarch.rpm

when it comes back to #

copy and paste

yum upgrade

7AC
23rd Feb 2010, 20:25
Eureka!
Thanks Mad Jock and Twitcher.
I owe you both a dram or two of the finest malt.
I followed your instructions tonight and it worked.
I have that slightly bewildered feeling one gets when these
computer things do what people say they will.
Thanks again fellas.

mad_jock
23rd Feb 2010, 20:46
Did you do it the manual way or did you manage to get yum to work?

It's been over a year since I had Linpus on mine, and it's just for personal reference if anyone else asks me. I can't really remember what I did to mine before I put the full Fedora distro on it.

7AC
23rd Feb 2010, 21:21
Jock,
I started with your post 7 then did post 24 and then followed
Twitcher's post 13.
I have no idea what effects have taken place elsewhere under the bonnet or quite how to use yum again.
7AC

mad_jock
23rd Feb 2010, 21:51
yum is an installation manager. It goes off and finds out what's available for you to install and then if you install something it sorts out what other bits and bobs you need to run it and loads them as well.

After you have installed a program it will, using the "yum upgrade" command, go off and check to see if there have been any updates, and if so install them for you.

So say you wanted a different movie player: you would get to the stage of having a terminal with the # and type "yum install vlc"; it would bugger off, sort everything out and install it. And for ever more, when you typed "yum upgrade" it would disappear off and get the latest version.

Doing it the manual way will mean that Firefox won't be in the list of applications yum automatically upgrades.

Anyway, glad it's working now.

Twitcher
24th Feb 2010, 19:40
I'm glad you got it sorted :ok:
I found those instructions easy as they were written in a straightforward way, worked first time for me so thought I'd share.

Simonta
25th Feb 2010, 21:21
Folks

I'm reasonably familiar with *Nix, running Linux and Macs alongside Windows at home but tearing my hair out with this one.

I'm running 3 machines on Win7 RC and need to do something :-). Would love to upgrade them all but can't afford 3x licenses right now so decided to try Linux on my media centre. The PC is a few years old but has had various upgrades to keep it fettling along for media centre use. It runs very nicely under Win7.

My other Linux box is an old clunker, 1.8Ghz AMD Athlon, 1 GB RAM and an old integrated Intel GPU - used only for surfing, email and "futzing". Works fine.

The media PC has an AMD (ATI) Radeon 9800 Pro and a Creative Labs X-Fi GamerXtreme sound card. I installed Mythbuntu, as the pre-loaded nature of MythTV on Ubuntu seemed to make sense, especially as this is a "spouse friendly" media PC and, to the very best of my abilities, the OS should be hidden for ever...

The rebuild has been a total pain. It seems that Linux just doesn't get along with ATI cards and getting SPDIF out from the sound card has completely defeated me. I class myself as expert on Windows, competent on *Nix and hardware and have been developing software for years but the archaic 1990's nature of buggering around with ALSA, kernel compiles and similar nonsense has worn me down.

So, the questions I have are:

1. Recognise the need to abandon the ATI but don't want to spend much. It's only a media PC so doesn't need to be a graphics screamer, and any cost, added to the time I've wasted, rapidly adds up to the cost of the Windows license I'm trying to avoid. Seems that nVidia is the way to go. Any recommendations for cheap nVidia cards that will fit the bill, or other GPU families that don't need the personal assistance of Linus Torvalds to get running?

2. Any clues (and yes, I've spent hours in Google) on how to get the SPDIF out running? I don't get any errors and from a UI and configuration perspective, Linux thinks it's enabled but there is nothing coming out of the jack.

3. Is there any advantage to sticking with Linux on my media PC? My original motivation was simply to avoid cost but so far, I have yet to see any advantage, in fact only disadvantages.

4. Given the above, I've had little chance to play with the Myth TV interface. Tried a couple of different skins but it seems to be a long way behind Windows Media Centre. Is this a fair conclusion or should I persevere?

I can get a Windows upgrade license for around £67 and I've already spent many hours trying to get Linux to work but if I have to spend more than about £40 on hardware, I'll just go ahead and buy Windows.

Thanks for any help/clues/pointers....

Cheers

mad_jock
26th Feb 2010, 17:43
I would try other flavours of linux using a live OS stick and see if any of the others work out of the box.

For 67 quid it's a no-brainer..... especially if the mrs uses it...... get the wallet out

Miserlou
27th Feb 2010, 18:45
Background.
I use a Mac, my wife has Windows XP on a Dell Lattitude and my son has LinuxMint 8 on his desk top.
Having been v.impressed by Mint I loaded it onto my neighbour's laptop, taking over the whole disk. Can't remember if she had XP or Vista on it, XP I think.
Anyway, she hates the Linux install and wants XP back.
And I can't get it back on. I have read much about MS not recognizing the hard disk, and I have tried countless times to reinstall, including repartitioning in Mint. The NTFS formatting then fails.

Help!!!

mad_jock
27th Feb 2010, 19:42
low level format using fdisk.

Then try again.

Simonta
28th Feb 2010, 12:13
Hi mad_jock.

I tried SuSE, Fedora and Mandrake. No go. Thanks for the suggestion..

£62.50 shelled out for Win7 :ok:

Cheers

rgbrock1
2nd Mar 2010, 14:56
If anyone is interested in trying out a relative newcomer to the Linux distro scene, I would recommend Igelle 1.0. The distro comes out of Norway, I believe, is really tightly integrated and contains lots of eye candy.
And it boots up in less than 30 seconds (on my desktop, anyway).
It comes as a Live CD so you can try it before installing it.

Check it out!!!

NRU74
7th Mar 2010, 19:57
Not sure if this is strictly a Linux question
I've got a PC with XP using a Netgear DG834PN Router with encrypted access
I've managed to configure the PC settings etc so the PC works OK
I've got a small ASUS EEE 700 loaded with Linux Xandros, but I am not able to 'lock in' the 26-character alphanumeric WPA code so I can log on painlessly without having to punch in all 26 characters each time.
Anyone out there with a similar ASUS able to talk me through how to do it, preferably in the 'noddiest' of terms ?
Thanks
NRU74

DG101
8th Mar 2010, 19:54
Linux newbie here.

Over the weekend I installed Ubuntu on what had recently been a (broken) Win XP system. This was my first foray into the world of *ix, and the learning curve was fairly steep. The results, so far, have been such that I'm now considering dumping the Win OS completely; but there is just one small problem - Adobe Flash doesn't work.

Trying to install "install_flash_player_10_linux.deb" downloaded from get.adobe dot com gives the message "Error: Wrong architecture 'i386'". Admitted, the CPU is AMD64 .. but that hasn't been a problem up to now. Is there a suitable alternative I haven't yet found?

If not, I guess I'll just have to wait for a few days and install Win 7. (Unable to survive without iPlayer etc.).

aerobelly
8th Mar 2010, 22:11
Trying to install "install_flash_player_10_linux.deb" downloaded from get.adobe dot com gives the message "Error: Wrong architecture 'i386'". Admitted, the CPU is AMD64 .. but that hasn't been a problem up to now. Is there a suitable alternative I haven't yet found?

You could install a 32-bit copy of Ubuntu, but that's a massive performance hit. To keep the native 64-bit system you need to do one of two things: install the "32-bit compatibility libraries", or find a software repository that includes Flash and will therefore install whatever libraries it needs automagically through the software installer. On my Ubuntu 9.10 system "medibuntu" supplies the non-free/proprietary software, and they have lots of "flash" packages -- can't guarantee they include what you need, but it's a start. Another possible place to check is the Penguin Liberation Front. Google will find tutorials on adding extra repositories to your software installer.

In the Linux world some distributions make it difficult to install proprietary software, which Flash is, and Ubuntu and all other Debian-derived distros are among them. The most relaxed distribution that I know of for installing proprietary s/w is Mandriva.


'b
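A quick way to see the architecture mismatch DG101 ran into is to compare the machine architecture with the one dpkg expects; uname is standard everywhere, and dpkg is only queried if it's present.

```shell
#!/bin/sh
# Show the hardware architecture and, where dpkg exists, the package
# architecture it accepts. A .deb marked i386 is refused on an amd64 dpkg
# unless 32-bit compatibility support is installed.
uname -m                            # e.g. x86_64 on an AMD64 install
if command -v dpkg >/dev/null 2>&1; then
    dpkg --print-architecture       # e.g. amd64
fi
```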

DG101
8th Mar 2010, 23:02
Thanks Aero,

Having trawled this forum I subsequently downloaded SimplyMepis and I am running that from the CD - with mixed success. Flash is working - sorta. Video images move but the sound is terrible, to the point of being unintelligible. Will continue exploring what's available and make a decision on which flavour of OS will have the honour of driving me crazy. :ugh:

mad_jock
9th Mar 2010, 01:01
I would try installing the Fedora distro.

I started first of all with Fedora and have had so little grief from it I have zero reason to change my flavour of distro.

DG101
11th Mar 2010, 00:29
Thanks, MJ.

Apologies for my prolonged absence, but Win7 arrived and just needed to be installed and tested. After all, I've been using its ancestors all the way back to Win95 (and OS/2 before that).

What a disappointment - not only did it manage to render my newly installed Mepis system unreachable by peremptorily overwriting GRUB in the MBR, but it refuses to recognize my sound card! And I'm less than impressed by the UI. It may stay on the machine as option B - if I get really stuck, but it ain't my OS of choice.

I'll probably give Fedora a try when I stop "playing" with Mepis, which is so far ahead of Ubuntu that I've recycled the live CD as a beer mat.

Anyone want to buy a set of Win7 installation DVDs? One (fairly) careful owner, low mileage (been used once), not even registered with MS. :}

N727NC
12th Mar 2010, 10:48
Yes, I'll buy your Win 7 disks - name a reasonable price.

rans6andrew
12th Mar 2010, 19:35
A friend has passed me a CD of Ubuntu 9.10 which I am trying to run in live mode on a Dell laptop PC. The system boots up OK, but I cannot persuade it to connect to our O2 wireless router. I have been through several Unix/Ubuntu websites and tried to follow the guidance on wireless setup, but I don't really understand what the info presented is supposed to tell me. I am new to Unix as an admin, but used it as a user some 20 years ago. Please explain in words of one syllable what I should be doing.

Any ideas anyone?

So far, I understand that the wireless network hardware is 2200 and it is being seen by the software. It makes no attempt to connect or report any available networks. I can't cut and paste stuff to here as this is a different machine (win vista) so please don't ask me to put reams of stuff from the screen on the BB.

Thanks Andrew.

batninth
12th Mar 2010, 20:09
Rans6Andrew (excellent choice of aircraft if your name is reflective of that by the way :ok:)

I'm assuming you've seen the network manager icon at the top right of the window? If you've clicked on it, it will show you the wireless networks it can access. If network manager is doing its job you shouldn't need to configure the wireless by hand.

If you have clicked on network manager & it tells you it can't find any networks then the wireless drivers aren't working correctly, but that would be unusual as Ubuntu 9.10 has a wide variety of wireless support.

Anyway, first question. Can Network Manager find the wireless connection at all?

rans6andrew
12th Mar 2010, 21:24
Batninth, thank you for your response. I was blinded by the science and failed to notice that the radio in the Dell was OFF when I typed iwconfig. doh!

Luckily the Fn + F2 key combination worked to enable the radio and then I started to see the available network icons......

It works.

Oh, and yes, I do have a Rans S6 503, have had since 2002, been all over in it. Took Er Indoors to France in it last September. Also built a Zodiac 601ul last year, the permit didn't come through in time for the French trip.

What do you fly?

Rans6....

rans6andrew
12th Mar 2010, 21:57
Now I have connectivity, what do I need to add to enable youtube videos to play?

Rans6...

MG23
13th Mar 2010, 02:09
Now I have connectivity, what do I need to add to enable youtube videos to play?

Flash. In Ubuntu you should be able to do something like 'sudo apt-get install flashplugin-installer' to install it and automatically keep it up to date.

batninth
13th Mar 2010, 17:44
Flash. In Ubuntu you should be able to do something like 'sudo apt-get install flashplugin-installer' to install it and automatically keep it up to date

I found the best way to enable flash content was to get the browser to ask & then let the package manager work it out. The package manager comes up with three options for playing flash; I'd suggest the real Adobe one is the better option, although I've used the other open source alternatives & they work OK.

Ubuntu also goes and finds any required codecs, so the first time you go to a media file or media stream it'll take a few moments, then will ask if you want it to go & get the media support. Again, I've not had any problems with this & haven't had to resort to the command line since Ubuntu 7 to get media support working.

BTW rans6andrew:
What do you fly?

Rans S6 582

MG23
13th Mar 2010, 18:58
Yeah, I knew there was some way to install Flash from the GUI but couldn't remember what it was :).

FullOppositeRudder
13th Mar 2010, 20:39
I've got a small ASUS EEE 700 loaded with Linux Xandros, but I am not able to 'lock in' the 26-character alphanumeric WPA code so I can log on painlessly without having to punch in all 26 characters each time.
Anyone out there with a similar ASUS able to talk me through how to do it, preferably in the 'noddiest' of terms ?

Perhaps you have found a fix for this by now, but FWIW I have a somewhat similar combination of equipment here, and my 701 finds and logs on to the home wireless system by itself most of the time. However I use one of the shorter option WEP codes (10 characters) for another reason, and that may be a clue. It could be that trying a shorter code option for your wireless security may be more in line with the tiny machine's capability.

I hope you can find a fix - typing in all those characters each time is not a lot of fun on that smallish keyboard.

Regards,

FO Rudder

The late XV105
16th Mar 2010, 22:59
A question further to my post (http://www.pprune.org/computer-internet-issues-troubleshooting/399771-reading-s-m-r-t-results-over-network.html) about reading SMART temperature values from a remote Linux box (in my case a Western Digital MyBook World Edition II NAS - White lights not Blue light model).

Thanks to some great help I received, I now have a nice script that outputs this to a semicolon-delimited log file along with a timestamp, a disk identifier (there are two in the NAS) and the average load for the past fifteen minutes.

16/03/2010;22:48;A;41;0.13
16/03/2010;22:48;B;41;0.13

Having for reasons of time constraints tried to avoid a science project and wanted a plug and play solution this time round, it's actually been a lot of fun learning some basic Linux.

What I now have is perfect for me to write an Excel macro to chart against, so now I want to automatically run the job every six hours. Unfortunately though WD have stripped the MyBook's BusyBox compilation to the bone and it doesn't have crontab. Of course I can install crontab, but before I do is there another easy way of scheduling my script to run on an un-expiring six hourly cycle, please?

mad_jock
17th Mar 2010, 08:05
You can just have it running permanently, so to speak. But then you get into the realms of counter hang-ups and stalls, and possible issues with resources getting locked.

Crontab is by far the best way of doing it. It's what it was designed to do. And it's been doing it without much change since Unix was invented.
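For reference, the six-hourly schedule asked about above is a single crontab entry; the script path below is a placeholder, not the poster's actual path:

```shell
# min hour day month weekday  command
0 */6 * * * /path/to/hdd_temp_log.sh   # fires at 00:00, 06:00, 12:00 and 18:00
```

The `*/6` step syntax is supported by both Vixie cron and BusyBox crond.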

The late XV105
17th Mar 2010, 09:26
Thanks, mad_jock. Pretty much what I expected, even though I hoped there was a similar function I'd missed. Crontab it needs to be. Cheers.

The late XV105
23rd Mar 2010, 13:29
An update to share how I got on in case any other PPRuNer needs to tread the same path.

First of all I ran the script found here (http://mybookworld.wikidot.com/forum/t-161883/migrating-from-bluering-to-whitelight#post-510951) to install both Optware and a bunch of useful features on my WD "White Lights" NAS. Included in these features wasn't just Cron itself but also a nice folder structure to allow any required executable job to be dropped in the appropriate one for the frequency required (/etc/cron.min, /etc/cron.5mins, /etc/cron.15mins, /etc/cron.30mins, /etc/cron.hourly, and so on)

I consequently have my temperature and load logging script sitting in /etc/cron.hourly and producing a new log entry every four hours thanks to a time-checking "IF" statement that I placed in the script.
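The time gate described above can be sketched like this; `should_log` is my illustrative name, not from the original script, and the real job would append the SMART readings rather than echoing:

```shell
#!/bin/sh
# Hourly cron job that only acts on a four-hour cycle.
should_log() {
    h=${1#0}                  # strip a leading zero: "08" -> "8", avoids octal
    [ $(( h % 4 )) -eq 0 ]    # true at hours 00, 04, 08, 12, 16, 20
}

if should_log "$(date +%H)"; then
    echo "four-hourly slot reached: append a log entry here"
fi
```

Dropping the script in /etc/cron.hourly and gating inside it keeps the cron folder structure untouched.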

Another feature included in the download was mini_sendmail, so I have written another script that runs every 15 minutes and checks whether either of the two HDDs in the NAS is approaching its maximum design temperature of 55 deg C. If one is, the script connects the NAS to the mail.btinternet.com server to send me an e-mail (which I will receive wherever I am on my Blackberry).

If either of the HDD temperatures continues to climb and exceeds 55 deg C I will be sent a second mail, and if no cooling action is then taken within five minutes that results in the temperature dropping back to 55 deg C or below, the NAS will shut down.
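A minimal sketch of that alarm logic, with illustrative names (`MAX_TEMP`, `temp_status`) and a made-up 3-degree "approaching" margin; the real script reads the temperature via SMART and mails via mini_sendmail:

```shell
#!/bin/sh
# Classify a drive temperature against the 55 deg C design limit.
MAX_TEMP=55

temp_status() {
    if [ "$1" -gt "$MAX_TEMP" ]; then
        echo "over-limit"                      # second mail, shutdown timer starts
    elif [ "$1" -ge $(( MAX_TEMP - 3 )) ]; then
        echo "warning"                         # first "approaching the limit" mail
    else
        echo "ok"
    fi
}

temp_status 48    # prints "ok"
```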

On a scale of zero to hero I'm only just off the starting blocks but considering that a week or so ago I knew nothing about Linux other than that the NAS used a stripped down BusyBox flavour of it, I'm pleased with these practical tweaks.

BTW - I highly recommend WinSCP (http://winscp.net/eng/docs/introduction) for connecting to a remote Linux device. It is an open source free SFTP and FTP client for Windows that also supports the legacy SCP protocol and that allows safe and simple copying of files between a local and a remote computer. It saved me MASSES of time compared to navigating around in an SSH command console.

mixture
23rd Mar 2010, 17:47
XV105, you still messing around with that hack box ? :ok:

I'm very impressed by all your progress .....ready for your next project ?

It involves a copy of MRTG (also free & open source) and you having pretty little graphs of your HDD temperatures etc. :E

MRTG - Tobi Oetiker's MRTG - The Multi Router Traffic Grapher (http://oss.oetiker.ch/mrtg/)

NRU74
23rd Mar 2010, 18:38
FORudder
Sorry for belated response, but have cracked it and the laptop connects automatically now. Having wasted much time trawling ASUS sites on the web without success I eventually dug out the manual and followed the instructions. Doh.
Regards NRU

The late XV105
23rd Mar 2010, 20:34
Cool, thanks mixture!
I will take the bait! :)

Before then I have a little niggle to solve though; I run Memeo Backup on each PC and laptop on the home network, each backup being written to its own folder on the NAS. Works a treat.

This evening though I installed vsftpd as a small practical project to allow me to share chosen files in a secure way and noticed that the backups were all visible to the User created for secure ftp login. Of course the reason is that the backups are all marked "public" even though they are not explicitly sub folders of the public folder.

My problem is that as soon as I remove the "public" flag, Memeo Backup of course reports "destination not found". Unfortunately it doesn't have a "username and password" option for access to the now secure folder so I can't use one of the Userids that exists on the NAS.

Suggestions for how to achieve an "internally public" but "externally private" structure on the NAS?

For info /etc/vsftpd.conf already has chroot_local_user=YES which I thought was supposed to restrict the User I have created only to the directory I want them to see.

TVM!

BTW - vsftp installed easily, needed only minor "my NAS specific" setup doing, and works a treat.

The late XV105
23rd Mar 2010, 21:13
Solved! :)

I just learned about chroot jail and now have my FTP user securely locked inside!

/etc/vsftpd.conf now contains chroot_list_enable=YES and I then created a file called /etc/vsftpd.chroot_list in which I put the Userid in question.
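For anyone following along, the relevant vsftpd.conf fragment ends up looking like this; note that with chroot_local_user left at its default of NO, the users listed in the file are the ones who get jailed (this behaviour has varied between vsftpd versions, so check the man page for yours):

```shell
# /etc/vsftpd.conf (fragment)
chroot_local_user=NO
chroot_list_enable=YES
chroot_list_file=/etc/vsftpd.chroot_list   # one userid per line
```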

sea oxen
23rd Mar 2010, 23:20
XV105

Neat piece of software (vsftp), thanks for the tip. I run a slug on Sarge -I'd never heard of the Book World or Optware - the latter because I didn't take the Unslung route, I guess.

SO

rans6andrew
24th Mar 2010, 10:18
I have the Ubuntu live doing what I want on my laptop, created a "live" USB memory stick as this allows me to put the second battery back into the machine (it goes in to the CD slot). It all works just so.

When I tried to run the Ubuntu Live, from the USB memory stick, in my desktop machine it does not see our wireless network. How should I "enable the wireless"? On the laptop there is a simple keyboard shortcut for it, Fn+F2 toggles the radio on-off.

Thanks,

Rans6...

The late XV105
24th Mar 2010, 13:25
A pleasure to offer the tip, sea oxen. Having lifted the lid I've been amazed at what I've found and what started out with simple aims has become an all consuming interest!

Next challenge about to be posted (below) to help nudge me along my way :)

The late XV105
24th Mar 2010, 13:30
vsftpd is working fine insofar as I can happily upload files to my MyBook "White Lights" NAS and the User (validated, not Anonymous) is locked in jail with the folder I have limited them to, but no files are actually visible.

For the record I'm using CuteFTP 8 as the FTP client and after login the folder appears empty. Dragging a file in to it results in FTP transfer though, and dragging it again results in a "duplicate file" message so I know the first one worked even though the folder still appears empty. Connecting to the folder via WinSCP or Windows Explorer (because it has CIFS enabled as well as FTP) confirms that the file exists.

Que?

Google shows that I am not alone and hints at PASV (passive) FTP as the problem but none of the "fixes" I've tried have worked for me.

Suggestions, please?


TVM,
XV

rgbrock1
24th Mar 2010, 13:41
XV:

Instead of using CuteFTP as the client why not try gFTP? I've used the former and encountered the same problem as you did, i.e. files which are there are not visible in the client.

mad_jock
24th Mar 2010, 15:32
XV, stop being a big jessie using clients for FTP; you're a command line chap now.

Either write a script which grabs them all for you.

Or stick a webserver on your linux box and get creative. Either ftp from the server or write an application which displays your results.

Shunter
26th Mar 2010, 19:32
One would suggest that you investigate the default umask set on uploaded files. Sounds like vsftpd is setting 070 which allows you to write files but not read them. Should be in your vsftpd.conf file. This is common for anonymous uploads, but less so for authenticated sessions.
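The umask-to-permissions arithmetic behind that suggestion is easy to demonstrate at a shell:

```shell
#!/bin/sh
# New files are created with mode 0666 & ~umask, so the umask decides
# who can read an upload after the fact.
cd "$(mktemp -d)"

umask 077
touch private.txt      # 0666 & ~077 = 600 (owner read/write only)

umask 022
touch shared.txt       # 0666 & ~022 = 644 (world-readable)

ls -l private.txt shared.txt
```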

rans6andrew
23rd Apr 2010, 16:38
help!

I bought all of the bits n bobs to put together a new PC, chucked them together and got it up and running with Ubuntu live cd. Sort of OK but quite limiting as you can't save any settings or install any drivers. So I made a "ubuntu boot" memory stick which would not actually boot the machine into a stable state. The memory stick works just fine on my laptop, it allows firefox bookmarks and files to be saved, remembers options and setup etc.

If anyone knows why this might be I am interested but only as a secondary option.

So, I installed Ubuntu on the first hard disc and got it up and running, apparently stable, although I can't actually set anything up to leave it doing something. I ran the memory test from the Ubuntu startup screen and it chugged all of the way through the memory with no errors. Then I enabled the networks and wireless driver (all point and click stuff), got it to connect to our wireless router and the WWW, and started to surf, only to find that it crashes often, re-boots, and then offers to send an error log somewhere.

The error appears to be related to a file called oops. Is there anything in the error log, if I can find it, which will help me to find out why the crashes happen?

I have a copy of XP which I could install but am reluctant to do this if there is a hardware fault on the system.

What would be a sensible thing to try next?

Rans6...

rgbrock1
23rd Apr 2010, 16:51
rans6:

Very unusual that Ubuntu, or any Linux distro, would cause a system crash.
Seems as if it could indeed be hardware.

Open a terminal and enter the command dmesg and see if you can spot any errors there.
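dmesg output can be long; piping it through grep narrows it to the lines most likely to matter (the keyword pattern below is just a starting point, widen it as needed):

```shell
# Show the last 20 kernel messages that look like trouble.
dmesg | grep -iE 'error|fail|oops|panic' | tail -n 20
```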

mixture
23rd Apr 2010, 16:58
XV105,

My guess....


# Default umask for local users is 077. You may wish to change this to 022,
# if your users expect that (022 is used by most other ftpd's)
local_umask=022


If you want me to PM you an example working VSFTPD config file lemme know.

aerobelly
23rd Apr 2010, 22:05
I bought all of the bits n bobs to put together a new PC, chucked them together

I'd be happier to read "installed carefully, using anti-static strap and making sure all connectors were cleanly seated and nothing touched things they shouldn't".

Seriously, if you didn't use anti-static precautions while installing CPU and memory in particular I'd worry about b*****ing those. If the CPU fails you probably won't get it to boot at all, but corrupted memory can cause all sorts of weird errors that look like different problems entirely. I'd run Memtest86 for several (say 4) passes before clearing the memory of suspicion. That will take a few hours no matter what the speed of CPU, memory or buses.


So, I installed Ubuntu on the first hard disc and got it up and running, apparently stable although I can't actually set anything up to leave it doing something.

Do you mean it won't stay running with no-one logged in? Or it crashes every 5 minutes, or what?


I ran the memory test from the Ubuntu startup screen and it chugged all of the way through the mmemory with no errors. Then I enabled the networks and wireless driver (all point and click stuff) and got it to connect to our wireless router and the WWW and started to surf and find that it crashes, often, and re-boots and then offers to send an error log to somewhere.

If it crashes in much the same way every time it could be corrupted memory or corrupted disk. I have also had a machine (the one I'm typing on now actually) crash like that thanks to a spider's web and the dust it collected on the back of the motherboard, this shouldn't be your problem today though.

The error appears to related to a file called ooops. Is there anything in the error log, if I can find it, which will help me to find out why the crashes happen?

A crash of this sort doesn't have time to get into the logs. If you want to try, they are stored in /var/log -- you can ignore the ones with names ending in .gz although "zmore <filename>" will show them. The others will have the most recent boot at the bottom, so you may need to flick through the whole thing to find messages timestamped around the time of your crash. If the whole shebang just keels over in a flash don't hold out much hope though.

'b

rans6andrew
24th Apr 2010, 08:16
thanks aerobelly. Right, I did observe good static precautions, being in electronics design and proto building I have the necessary stuff, wrist straps, mains lead with only earth connection etc.

The machine boots up and stays up as long as the wireless network is not enabled. I don't have any applications that I can leave running to test everything stays good. I have run the Ubuntu memory test and it goes through the 6 Gb of ram repeatedly without error. As soon as I enable the wireless network it becomes unstable and reboots at irregular intervals, from a few seconds to 20 minutes. It has connected and downloaded a few (200+ !) updates for Ubuntu and I have got as far as getting Adobe X... mpeg player add-on for Firefox before the system re-booted.

The wireless card in the new system is a known working card pinched from an old, but still working, desktop machine.

This morning I swapped the wireless network cards between the new system and my Dell desktop, which I am using for this browsing session. This was fruitless as neither machine had the correct drivers for the cards so I have swapped them back.

in /var there is a "crash" directory with 3 files in. I'll search through them and dmesg to see what they reveal.

Thanks again,

Rans6... (very new to unix)

aerobelly
24th Apr 2010, 19:17
Right, I did observe good static precautions, being in electronics design and proto building I have the necessary stuff, wrist straps, mains lead with only earth connection etc.

Excellent, if only everyone did....


The machine boots up and stays up as long as the wireless network is not enabled. I don't have any applications that I can leave running to test everything stays good.

A trivial test that never completes is easy, but it won't thrash the memory & disk enough. I kick off graphics tasks that take up to 12 hours, but there's gigabytes of raw files involved. I think the answer is elsewhere though....



I have run the Ubuntu memory test and it goes through the 6 Gb of ram repeatedly without error. As soon as I enable the wireless network it becomes unstable and reboots at irregular intervals, from a few seconds to 20 minutes.

Yeees. The card and its interrupts (ISA card I assume), or its driver come under suspicion now.


It has connected and downloaded a few (200+ !) updates for Ubuntu

That's the way Linux updates & fixes work: little & often. Unlike the Windows way of huge and yearly. (Server-grade Windows is monthly updates -- look up "patch Tuesday".) If your e.g. office, multimedia or graphics applications came through Ubuntu they get updated at the same time automagically, so you don't have to go round all the vendors to find their latest patches.


The wireless card in the new system is a known working card pinched from an old, but still working, desktop machine.

This morning I swapped the wireless network cards between the new system and my Dell desktop, which I am using for this browsing session. This was fruitless as neither machine had the correct drivers for the cards so I have swapped them back.

Linux does suffer a real problem with add-on cards. The developers often refuse to sign their rights away to see the proprietary information needed to handle the card, and the manufacturers can't be bothered to provide proper Linux drivers. So the result is that someone takes a flying guess at how the card works. And that can result in problems. If you absolutely have to have wireless networking I'd research what vendors provide proper Linux drivers and get one of their supported cards. If it's not an absolute necessity then you'll improve your security (or save yourself a lot of work setting it up properly) by reverting to a wired network.

I have no need of wireless at home, and have in the past refused to countenance it for businesses, unless the hours (weeks?) of research and experiment into how to configure it securely could be resourced. I was paid to be paranoid, and I think I did a good job ;-)



in /var there is a "crash" directory with 3 files in. I'll search through them and dmesg to see what they reveal.

/var/crash is empty on this Ubuntu system :-
uptime 20:13:03 up 60 days, 3:07, 4 users, load average: 0.01, 0.03, 0.00

... so hang on in there, the problems are not unsolvable.



'b

rans6andrew
26th Apr 2010, 14:52
further to my previous info, the /var/crash files cannot be viewed as they are locked by the owner process that generated them, I forget the name, begins with app something or other.

The dmesg command gives a long listing and the only thing that caught my eye is a number of timeouts of eth1, right at the end of the file. I don't know if eth1 is the wired network card or the wireless; can't check without interrupting the memory test.

Also, the suggestion that it only crashes when the wireless network is on is a red herring. It can crash before it has even finished booting up. I ran it for about an hour, playing video off USB drive, wireless network disconnected, and it crashed again. When it rebooted it crashed almost straight away, and then after about 15 mins. I tried to send an error report after the last crash and it reported that my copy of Ubuntu is not an original source.

I have, since then, re-installed the Ubuntu to the hard disc, where it failed to boot several times. For over 3 hours now, I have had the machine running the memory test from the first option screen of the Ubuntu Live CD. It has gone through nearly 4 times with no errors.

The machine is an i7 930, with 6Gb ram on a Gigabyte x58 motherboard. Nothing in it seems to be running warm.

Is it possible that Ubuntu 9.10 is just not compatible with the spec of the machine?

rgbrock1
26th Apr 2010, 15:33
eth1 is more than likely the wireless card. eth0 is more than likely the wired NIC card.

mixture
26th Apr 2010, 16:44
eth1 is more than likely the wireless card. eth0 is more than likely the wired NIC card.

eh ? Since when ? he/she may just have more than one NIC ?

Depends whats going on in modprobe.... :cool:
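Rather than guessing from the numbering, you can ask sysfs which kernel driver is bound to each interface; `iface_driver` is my own helper name:

```shell
#!/bin/sh
# Print the kernel driver behind each network interface, so eth0/eth1
# can be mapped to hardware instead of guessed.
iface_driver() {
    d=$(readlink "/sys/class/net/$1/device/driver" 2>/dev/null) || true
    if [ -n "$d" ]; then
        echo "${d##*/}"                 # e.g. e1000e, orinoco_pci
    else
        echo "none (virtual or absent)"
    fi
}

for i in /sys/class/net/*; do
    iface=${i##*/}
    printf '%s: %s\n' "$iface" "$(iface_driver "$iface")"
done
```

The loopback interface has no driver symlink, so it reports "none".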

BOAC
26th Apr 2010, 17:38
Teamviewer now released for Linux (and STILL free for personal use)

aerobelly
26th Apr 2010, 19:07
further to my previous info, the /var/crash files cannot be viewed as they are locked by the owner process that generated them, I forget the name, begins with app something or other.

type "sudo more <filename>" and give your password when asked to. On Ubuntu systems "sudo ..." gets around all restrictions like that, so be careful.



The DMESG command gives a long script and the only thing that caught my eye is a number of timeouts of eth1, right at the end of the file. I don't know if eth1 is the wired network card or the wireless, can't check without interupting the memory test.

The Gigabyte X58 series all seem to have onboard ethernet, so I would expect the BIOS to find that before an add-on card, so the LAN would most likely be eth0 and the wireless eth1.



Also, the suggestion that it only crashes when the wireless network is on is a red herring. It can crash before it has even finished booting up. I ran it for about an hour, playing video off USB drive, wireless network disconnected, and it crashed again.

I would try it with the wireless card removed, in case it's an ISA/EISA problem. You did say that it's an old card.


The machine is an i7 930, with 6Gb ram on a Gigabyte x58 motherboard. Nothing in it seems to be running warm.

Is it possible that Ubuntu 9.1 is just not compatible with the spec of the machine?

X58s have been around for nearly two years, so I'd be surprised. Just one thought, some Linux distributions are built in different versions for up to 1Gb of memory, and more than 1Gb (Fedora perhaps?). Is the Ubuntu you're using a full 64-bit, unlimited memory version? I use magazine-cover copies, but I'm careful to make sure I've got the right one for the architecture, AMD 64 bit, or i386 depending on which of my 4 machines I'm installing on.


'b

Saab Dastard
26th Apr 2010, 20:38
I would try it with the wireless card removed, in case it's an ISA/EISA problem.

I don't think that any commercial ISA wifi cards were ever made.

SD

rans6andrew
26th Apr 2010, 20:43
the wireless card is not ancient, although not brand new. It is a Netgear PCI MA311. A trawl of the net suggests that Orinoco is not the best driver for this card although that is what seems to be referred to in the scripts I saw somewhere.

If I find a better driver, do I need to re-build the whole OS to install it? Sorry I am new to Unix at this level.

The memory test ran through 9 times in about 7 hours before I stopped it. It flagged up that it had found 6 Gb of ram. I don't know if the version of Ubuntu I have is 32 or 64 bit; I guess there is an easy way to find out? I got the CD from a friend who had downloaded it from the Ubuntu free site and created the "Live" CD, I understand.

Will a disconnected wireless card still have any activity in the system? I will try unplugging the card, tomorrow.

rgbrock1
27th Apr 2010, 14:17
to find out if you're running a 32-bit or 64-bit version of Ubuntu enter the
command: uname -a at the shell prompt.

When downloading Ubuntu from their web site you have to specify the 64-bit
version from a drop-down menu. Otherwise you get the 32-bit version by default.
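`uname -m` on its own gives the shortest answer:

```shell
# x86_64 means a 64-bit kernel; i386/i586/i686 mean 32-bit.
uname -m
```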

rans6andrew
28th Apr 2010, 11:54
thanks to everyone for their interest.

I appear to be running only a 32 bit Ubuntu, I can download a 64 bit but doubt that it has any bearing on my system crashes.

On the good side, I have removed the MA311 wireless card and the system has been up for 5 hours now, no sign of any instability.

The question must be - is there a driver more suited to the MA311 card? If there is I can get it onto a memory stick and then onto the machine, how should I install it? Is this a kernel rebuild process? Scary stuff.

Rans6...

rans6andrew
28th Apr 2010, 17:11
after shifting the whole shooting match into the other room I find that the wired ethernet does not work at all. I have tested the wired connection using my laptop, running Ubuntu, and noted all of the info in the network configuration menus. The same info, except for the mac address, is copied to the new machine and..... nothing. At least the wireless did connect for a while, until the machine falls over bigtime.

The laptop auto eth0 has a mac address in the boxes, I think it was picked up automatically, the new machine does not. Is this something I need to find and enter or should it pick up for itself?

Rans6....

mad_jock
28th Apr 2010, 20:42
use a live OS fedora and see if it works

rans6andrew
29th Apr 2010, 06:08
phew! The lack of connectivity through the ethernet wired link was caused by my failure to find all 3 bits of the bios to set up. I had spotted the option for power saving of the network hardware and the option for enabling the hard wired LAN but I had missed the option to turn on the LAN boot rom.

Just need to sort out the driver for the MA311 wireless card and then cart all the stuff back to my play room.

Rans6...

batninth
29th Apr 2010, 19:51
Ok Linux Dudes, is anyone here running any personal productivity software on Linux that they like/recommend?

At the moment I use TiddlyWiki for note taking, but am frustrated by the lack of a good calendar / reminder / task list type function. You can get some "tiddlers" to do this stuff, but I find they take away the flexibility of the note taking.

I also tried OneNote under Windows but it was worse

Anyone got anything they think is good for this stuff?

Thanks

batninth

rans6andrew
30th Apr 2010, 10:34
by totally disabling the wired network in my system I seem to have sorted the wireless connectivity issues. Hurrah!


Now, I am having issues with some video files and video streams. If I get the BBC sport live video up on the screen the computer crashes after a few minutes. Also, I found 2 video files (.mpg format) on my portable hard disc which crash the system as soon as they are loaded into a player. Other files play perfectly happily though.

Ubuntu 9.10 updated, 32 bit OS, i7 system with loads of grunt.

Two build updates have installed themselves since I loaded from a live cd. The latest one doesn't boot properly. When it gets to the point where there is a white Ubuntu emblem on a black screen it freezes. On a press of the reset button it restarts and I am offered normal or safe starts from 3 builds, the second one launches normally. How might I determine what is wrong with the highest numbered build option? And fix it!

Newbee on a steep learning curve,

Rans6....

edited almost immediately to remove those typos that the submit button causes!

rgbrock1
30th Apr 2010, 12:46
Batninth:

Try using the app named Evolution. Not only is it an email client (with connectivity to Exchange servers) but it also has task lists, contacts, calendars and memos. A very nice productivity application. Does everything MS Outlook does, and then some.

For another very fine personal productivity app try OpenOffice: Spreadsheets, Word Processing, Presentation, Database, and Drawing applications all in one. You can open MS Word documents, for example, and save OpenOffice documents in MS Word format.

rgbrock1
30th Apr 2010, 12:58
rans6:

In all my years of working with Linux I don't think I've ever met up with someone having as many problems with a Linux distro as you have!!!!! And it is extremely rare to see a Linux distro crash either itself or the system hardware.

Disabling the wired network card to get the wireless one working seems odd.

Perhaps you might want to try a different distro: one which may not give you all the headaches you've been experiencing? (Mandriva Linux comes to mind. I run it myself and it works "out of the box" as well as supporting most modern hardware)

Mac the Knife
30th Apr 2010, 15:19
"In all my years of working with Linux I don't think I've ever met up with someone having as many problems with a Linux distro as you have!!!!! "

Agree. And the failures seem to occur in a rather un-Linuxy way.

"And it is extremely rare to see a Linux distro crash either itself or the system hardware."

What usually happens when it does is that a process locks up or the X-server crashes. Fix by ending process or restarting x-server. Full blown kernel panics are unusual. Never known a Linux distro spontaneously reboot, that's Windows type behaviour.

I suspect either a troll or a duff motherboard (Memtest only tests memory, not other components).

Mac

:ok:

rans6andrew
30th Apr 2010, 20:38
I too am concerned about the possibility of it being a duff motherboard, as it is such a complex item that convincing the supplier that it is the guilty party might be tricky. On the other hand I am not looking forwards to trying it with windows due to the licence registration process and the need to re-licence if it all goes pear shaped.

The system stayed up for some 12 hours, yesterday, with no network hardware enabled. I used it to display some stuff from .pdfs while I was doing my day job. In the evening I went to the bbc website (wireless enabled) to check on the snooker results, clicked "play in a pop out window" and within a few moments it crashed and re-booted the whole system. This is repeatable.

I am still finding my way around the bios in the Gigabyte GDR3a motherboard, it is the most complex (versatile?) board I have ever had to commission. Perhaps I am still not quite there.

Thanks for your interest.

Rans6...

BOAC
5th May 2010, 13:42
I have downloaded, burnt and multi-boot installed on an XP machine this latest distro with exceptional ease. Appears to have openoffice loaded and gave me instant browsing with FF. Yet to try the rest including printer/email etc.

Most impressed.

rgbrock1
5th May 2010, 14:45
BOAC:

Getting email and printing up and running will be just as simple as the tasks you've already encountered.

Have fun!

BOAC
5th May 2010, 16:37
Yep! The printer was so simple even I managed it. Emails set up but not actually downloading at the mo - something to work on, but again sooo straightforward. A breath of fresh air.

mixture
5th May 2010, 16:46
I too am concerned about the possibility of it being a duff motherboard

PassMark BurnInTest software - PC Reliability and Load Testing (http://www.passmark.com/products/bit.htm)

rgbrock1
5th May 2010, 20:02
BOAC:

Wait until you give OpenOffice a whirl. (If you haven't already.)
Open, read and write documents in MS Word, Excel, PowerPoint, etc. formats, or in OpenOffice's own format, and save documents either way.

All for free! (as in free beer!)

BOAC
5th May 2010, 20:24
Yes - I've used that in Windows. Now sorted the email issue - put a , instead of a . in the server - doh! Set up the video/etc player. Yes, I like it. It's the way free software should be:)

Here is a stoopid question - no sign of Av or firewall that I can find?

le Pingouin
6th May 2010, 06:01
Here is a stoopid question - no sign of Av or firewall that I can find?

Not stupid, just Windows habituated :)

Linux has very little malware so running AV software isn't a necessity to survive. ClamAV is in the software repositories & the likes of AntiVir & AVG do Linux versions - generally run as "on demand" scanners & not running all the time as for Windows.

If you're behind a router then that will generally provide enough basic firewall functionality. Gufw & Firestarter are a couple of graphical firewall config tools if you want to run one.
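If you do decide you want a host firewall without hand-writing iptables rules, ufw (the command-line tool that Gufw fronts, bundled with Ubuntu) needs only a few commands. This is a sketch, not a recommended rule set, and it all needs root:

```shell
# Example only: default-deny inbound, allow outbound, keep SSH open,
# then switch the firewall on.
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow 22/tcp
sudo ufw enable
sudo ufw status verbose
```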

BOAC
6th May 2010, 07:17
Not stupid, just Windows habituated - thanks le P (greetings) - can you recommend a good rehab centre?:)

I'm aware of the 'relaxed' approach to malware in the Linux world, but is it only a matter of time and popularity before it becomes a 'worthwhile' target?

le Pingouin
6th May 2010, 09:12
Linux is used extensively on servers & there doesn't seem to be a huge problem with malware there.

Windows has been a very easy target in the past & is the OS of choice for those most likely to do stupid things. At least Vista & Win7 make a better job of stopping the user running as admin. When there's so much low fruit available on Windows why would the malware makers work harder for little extra return?

I may be wrong but my feeling is the Linux environment is more diverse than Windows making it harder to get malware to run effectively.

Avtrician
6th May 2010, 10:30
Because programs (viruses) can't install and run at user level (not without a lot of effort), AV is not as needed in Linux. There is usually a firewall installed; it's just not as intrusive or resource-hungry as in Windoze.

Helped a workmate install Ubuntu on an old troublesome (under XP) Dell laptop. Everything including the webcam worked without effort.

rgbrock1
6th May 2010, 12:44
BOAC:

As others have so correctly stated: I wouldn't worry about av on linux.
It is a rarity when any Linux distro suffers from virus infection. Don't give it another thought.

As for a firewall: follow this link and behold the joys of configuring iptables, if you so desire, or other methods of installing a firewall on your Ubuntu distro. (Ubuntu is one of the few Linux distros which do not come with a default firewall installation.)

Basic Ubuntu Linux Firewall Configuration - Techotopia (http://www.techotopia.com/index.php/Basic_Ubuntu_Linux_Firewall_Configuration)

rans6andrew
8th May 2010, 09:07
I spent the week stealing components from another, reliable, working machine and proved that the crashing is not power supply, wireless card, graphics card or RAM module related. It is not the software, as it has been known to spontaneously restart before it has completed the POST. I also updated to Ubuntu 10.04 to see if that made any difference. I called Scan to ask about returning the motherboard and the CPU and they suggested I try updating the BIOS. This I have done and, fingers crossed, it seems to be more stable. 22 hours and no restart.

One thing that I have noticed is that the scroll wheel on the mouse doesn't always work. This feature has been hit and miss since the 10.04 upgrade. Any idea why, or how to sort it?

Rans6....

Mac the Knife
8th May 2010, 17:38
"One thing that I have noticed is that the scroll wheel on the mouse doesn't always work."

Firefox does this on my Windows XP machine (yes, I have one...:\) - I think it's a Firefox thing - Opera doesn't.

Mac

:ok:

rans6andrew
8th May 2010, 19:23
'tis not a FF thing in this case, it doesn't work in gedit either.

The Nr Fairy
9th May 2010, 07:39
Sorta Linux question, sorta Mac (VMware Fusion) question.

My VM running Ubuntu 10 LTS worked ok for a few hours. Now I can't click in the VM's window and get it to focus - clicking / cmd-G, nothing.

Googling has been little help, I suspect because I'm not using the right search terms.

Anyone else seen this ?

Hawkeye79
10th May 2010, 16:11
@rgbrock1
Firestarter is a good graphical firewall on ubuntu. It can be installed via Synaptic or apt-get.

rgbrock1
10th May 2010, 18:20
Good point Hawkeye. Firestarter is indeed a nice GUI firewall front-end.:ok:

N727NC
6th Oct 2010, 09:56
The new-to-us family server is a secondhand Dell PowerEdge 64-bit machine, 4 GB, running SuSE 11.2 (64-bit), with a 180 GB drive holding the operating system only and a Linux RAID 1 of two 1 TB drives mounted as /home. The RAID is about half full of family data. You might think that the 180 GB drive should last forever before it filled, but only a month after commissioning the server, the drive is full.

What is likely to have caused this and where do I start to look for junk to delete?

rgbrock1
6th Oct 2010, 12:50
First place to look is in /var/log. Can be lots of useless crap in there.
Second place, /tmp. (Although most Linux distros delete the contents of /tmp on reboot)

And if you really want to see what's chewing up your hard drive space issue the following command:

$ cd /
$ du -h | more

This will give you a rather lengthy listing of the space used by every directory on the entire hard drive.
You'll have to hit the <return> key to scroll through the listing.

Hope this helps.

MacBoero
6th Oct 2010, 17:45
It might be worth looking into setting up some crontab entries to clear out the log directory at regular intervals automatically.
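
A minimal sketch of that idea, run daily from cron (say, a root crontab line like 0 3 * * * /usr/local/sbin/trim-logs.sh). The directory names, the rotated-log naming pattern and the 30-day retention are all assumptions - adjust them to whatever is actually growing on your box:

```shell
#!/bin/sh
# Archive rotated logs, then expire old archives.
# LOGDIR and OLDLOGS are hypothetical paths - point them at your own.
LOGDIR=/var/log
OLDLOGS=/var/oldlogs

mkdir -p "$OLDLOGS"

# Compress rotated logs older than a day and move them into the archive.
# (GNU find substitutes {} inside an argument, so "{}.gz" works here.)
find "$LOGDIR" -name '*.log.[0-9]' -mtime +1 \
    -exec gzip -f {} \; -exec mv {}.gz "$OLDLOGS/" \; 2>/dev/null

# Delete archived logs more than 30 days old
find "$OLDLOGS" -name '*.gz' -mtime +30 -exec rm -f {} \;
```

Note that gzip preserves the original file's mtime, so the 30-day expiry counts from when the log was last written, not from when it was compressed.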

N727NC
6th Oct 2010, 18:43
Thank you for the suggestion RGB1 - I cleared out /var/log and /tmp (the wonders of shift+del!) and now the drive has 3.7 GB on it and the server is happy again. There is no way that there were over 175 GB of messages, so I'm guessing somewhere a file was incorrectly allocating space - perhaps following a less-than-orthodox shut-down.

I'll look into crontab to see if I can set up a routine deletion of old logs.

Gertrude the Wombat
6th Oct 2010, 19:06
"Family" machine I think you said ...

Maybe some of those "log files" were in fact not log files but the kids' cunningly hidden stolen porno movies.

aerobelly
6th Oct 2010, 20:09
Before deleting logs, check just how much space they take. On this 'ere system, Kubuntu 10.04 installed just under 6 months ago, all the logs since then take all of 54 MiB. If there are problems, the logs can help a lot in isolating them to hardware/software/circumstances. Logs are plaintext and very repetitive, so they compress very well (that's the .log.N.gz files). They shouldn't take much space, but can save your ass.

One place I'd look is .xsession-errors in users' home directories. They can get huge, and because they're outside the normal log hierarchy they are frequently overlooked.

If you're going to scan the whole disk for space usage, try:

sudo du / | sort -nr | more

This will list directories in order of the space they take. But it's slooooooow in GiB territory; what it's like in TiB areas I hate to think. (The "sudo" is needed to see into other users' usage, if you haven't used it before.) There are also graphical tools that do much the same job, if looking at columns of numbers doesn't float your boat.

'b

N727NC
6th Oct 2010, 21:27
Gertie - that sort of thing is hidden on pendrives - they know Dad is too much of a geek for them to get away with it on the server.
AB - too late now, all are gone, but you confirm what I saw, which is that logs aren't large files. There must have been something in the /tmp area which either was genuinely huge or was erroneously reporting itself as such.

Grateful to all for the very valuable help.

LH2
7th Oct 2010, 14:22
The quick and dirty way, preferably as root:

du -csh /* 2>/dev/null

Example output:

8.3M /bin
23M /boot
264K /dev
41M /etc
73G /home
156M /lib
23M /lib64
16K /lost+found
4.0K /media
4.0K /mnt
4.0K /opt
0 /proc
1.6M /root
12M /sbin
4.0K /selinux
214G /srv
0 /sys
728M /tftpboot
6.0M /tmp
3.4G /usr
20G /var
311G total


Then repeat as needed on any subdirectories of interest, e.g.,:

du -csh /var/* 2>/dev/null
61M /var/adm
50M /var/cache
4.0K /var/crash
4.0K /var/games
20G /var/lib
32K /var/lock
79M /var/log
0 /var/mail
4.0K /var/opt
372K /var/run
99M /var/spool
17M /var/tmp
4.0K /var/X11R6
12K /var/yp
20G total


An alternative syntax to have the output appear in order of decreasing size:

du -s /var/* 2>/dev/null |sort -nr
20403992 /var/lib
100660 /var/spool
80352 /var/log
61476 /var/adm
51000 /var/cache
17360 /var/tmp
372 /var/run
32 /var/lock
12 /var/yp
4 /var/X11R6
4 /var/opt
4 /var/games
4 /var/crash
0 /var/mail


Various graphical tools are also available, but I always use the above as it's guaranteed to be on any Linux installation (some embedded platforms excepted).

txdmy1
7th Oct 2010, 17:39
Copy them daily and compress to an oldlogs directory with a cron script. Delete any over 30 days old in oldlogs within the same script, simples.
I copied this from what we did at work when I moved onto *nix systems support and set up my own server as a VM image. Don't use it much now though, thinking about getting rid of it.

LH2
7th Oct 2010, 19:43
copy them daily and compress to an oldlogs directory with a cron script. Delete any over 30 days old in oldlogs within same script, simples.

man logrotate (http://linuxcommand.org/man_pages/logrotate8.html)

As the OP is using OpenSUSE, this will do the trick: zypper install logrotate
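
For completeness, a minimal logrotate config dropped into /etc/logrotate.d does the same job as a hand-rolled cron script. The file name and log path below are made up for illustration:

```
# /etc/logrotate.d/myapp  (hypothetical name and path)
/var/log/myapp/*.log {
    weekly            # rotate once a week
    rotate 4          # keep four old generations
    compress          # gzip the rotated copies
    missingok         # no error if the log is absent
    notifempty        # skip rotation when the log is empty
}
```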

BOAC
22nd Oct 2010, 07:47
'Fix' due out shortly to close down a bug in the OS that enables someone to attain superuser rights on a system.

Just when you thought you were safe.....................

mad_jock
22nd Oct 2010, 08:37
Any link to what the bug is?

mad_jock
22nd Oct 2010, 09:07
Found it.

Linked to Oracle client installations. You need to have an account on the machine and access to it.

Fix has already been pushed.

mixture
22nd Oct 2010, 09:28
BREAKING NEWS: The Pope is catholic.

Honestly BOAC.

I've said it once and I'll say it again.

Software is written by human beings.

The more lines of code in a piece of software, the greater the risk of bugs in the code.

More complex pieces of software have a greater number of interdependencies with other software written by other people (crypto libraries etc.).

Mac, Linux, Windows .... even the infamous OpenBSD. Only an idiot would claim their software to be invincible, as time and time again, it's proven that it's not a case of if.... but when.

What differentiates the software developers is not whether there are bugs in their code, but the overall quality of their code......how many bugs are found, the seriousness of the bugs, and how the bugs are dealt with etc. etc.

BOAC
22nd Oct 2010, 10:31
Mixture - may I suggest you stop reading? Your news on the Pope is timely, however.:ugh:

mad_jock
22nd Oct 2010, 11:33
BOAC, the simple difference between this "security breach" and the Microsoft ones is that the Linux/Unix ones are usually found by professional security firms. And it is very rare that they are in the core kernel. They also tend to be package-specific, as in this case. It's not actually the Linux kernel that has the security hole; it's a third-party add-on for Oracle clients.

BOAC
22nd Oct 2010, 13:14
m j - according to my info the flaw IS in the Linux kernel and was introduced in version 2.6.30. The fix is at http://www.vsecurity.com/download/tools/linux-rds-exploit.c. We may be looking at different things? This is the second recent kernel 'error' by the writers following September's.

mad_jock
22nd Oct 2010, 13:58
It's in a protocol layer of RDS, which is a data packet protocol for operating databases, e.g. Oracle. You have to be on the local machine with a local account to be able to use it, i.e. you, the user, have to want to get into your own machine. Any self-respecting Linux user would know that if you want to zero the root passwd and have access, all you have to do is boot via a live OS and zero the root passwd in the passwd file. It's only really an issue if you have a work machine, aka you're a dealer on the stock exchange. Even if you do get the local admin rights you still have no access to the servers.

For this to work the RDS service has to be up; on 99% of Linux machines it won't be turned on.

The second flaw was part of the GNU C libraries. And I might add there is no way I would ever open that link of yours. It's a C source file that will screw every type of OS if it has something nasty inside it.

Which again is the main issue with nasties: users clicking on things that they don't have a clue about. Call it security_update.doc.c, most folk won't spot the .c on the end, click on it and trigger the script.

BOAC
22nd Oct 2010, 15:19
MJ - to ease your concerns over whatever a 'c' is, you can visit VSR Security Advisories (http://www.vsecurity.com/resources/advisory/20101019-1/) (no 'c')

mad_jock
22nd Oct 2010, 16:01
C is a programming language.

And yes that is the "security flaw" which I was on about.

bnt
22nd Oct 2010, 16:53
Of course any computer is vulnerable if you have direct local access to it, or if you're silly about passwords. The book The Cuckoo's Egg (http://en.wikipedia.org/wiki/The_Cuckoo%27s_Egg_%28book%29) documents various hack attacks on UNIX that took place in 1986, in which secure military systems were brought down by "human factors", such as weak passwords and "social engineering" (i.e. call someone up and ask for the password). There were also remote attacks such as the Morris worm (http://en.wikipedia.org/wiki/Morris_worm), which exploited some known bugs in UNIX processes. When we talk about systems being vulnerable these days, we hope that people and designers have learned from these and other past vulnerabilities and closed them off, but of course it's not guaranteed.

One main difference between UNIX and Windows systems, in the past, has been that you had Windows users running with administrative privileges at all times, while the "root" user on a UNIX system was clearly defined as "only when you have to". You could log in as root and do work, but if you read any books or received any training, you were left in no doubt that that was a Bad Thing. If you're running e.g. a web browser, it should be running under your limited permissions, why would it need anything more? Normal UNIX users had no choice in the matter - permissions were enforced by the sysadmin.

Which wasn't a problem at first, since UNIX systems were always designed to be run by a trained sysadmin, but if you're going to roll out UNIX (Linux, Mac OS X, etc.) to users who are not sysadmins, you have to give them a way in, which led to the "su" or "sudo" method. This gives you temporary root permissions using your own password, not a root password. You launch an application with "sudo", do what you have to do, and close the app. If an application tries to elevate itself, you get asked about it - is that necessary? If that sounds like what Microsoft has been doing with Vista and W7, it's not a coincidence. :8

mixture
30th Oct 2010, 21:53
BOAC

Mixture - may I suggest you stop reading?

May I in return suggest you reconsider scaremongering type headlines that a journo would use..... "Linux IS vulnerable" .... :rolleyes:

Anyhow, no I won't "stop reading".... I'm off to read a book. Thank you very much.

mixture
30th Oct 2010, 21:56
bnt,

This gives you temporary root permissions using your own password, not a root password. You launch an application with "sudo" do what you have to do, and close the app

Ah yes... the joys of "sudo su" .... :cool:

Mike-Bracknell
30th Oct 2010, 23:49
Ah yes... the joys of "sudo su" .... :cool:

Linux...the Phil Collins of the IT world
:}

Bushfiva
31st Oct 2010, 01:22
Obligatory sudo reference: xkcd: Sandwich (http://xkcd.com/149/)

Shunter
31st Oct 2010, 20:12
Linux...the Phil Collins of the IT world

So Windows must be Benny Hill, right?

MG23
1st Nov 2010, 04:28
BOAC, the simple difference between this "security breach" and the Microsoft ones is that the Linux/Unix ones are usually found by professional security firms. And it is very rare that they are in the core kernel. They also tend to be package-specific, as in this case.

The other difference is that 'Linux security hole' stories tend to hit the media two days _after_ my Linux machines have downloaded the updates which fix them. Normally when I see one of these stories I think 'oh, so that's why I got a new kernel on Monday', whereas when I used to run Windows I had to think 'Oh God, how long are they going to take to fix this one?'

That is, of course, still the situation with Flash bugs (nearly two weeks before they fix the latest critical exploit), but with AppArmor on Linux at least I can trivially sandbox Flash so that it literally cannot do anything bad to the OS, because the kernel blocks it.

N727NC
15th Nov 2010, 15:35
I can't work out what is causing this problem, but over time the entire data disk fills up. Twice I have recovered the machine to a useable state by locating and deleting large directories, but this time I was too late and the machine has stuffed itself and now refuses to let me login.

I need to establish what the cause is, as there is clearly something amiss - and it isn't log files - they would never fill hundreds of GB in a number of weeks. Most importantly, however, I need to regain control of the server so that we can have our files back.

I can gain access to the machine using a live disk (SuSe, or Knoppix), but they refuse to let me create a RAID using the existing pair of data disks; they want to format the disks, which clearly I don't want to do. A possibility is to disconnect the 2 data disks and reboot onto the OS disk, reset the partition tables, reboot and try to remount the data disks, but I am anxious that I risk losing my data. Thoughts anyone?

And yes, the backup is fairly recent, but not that recent!

Mike-Bracknell
15th Nov 2010, 15:46
Boot with Knoppix or whatever, take the data onto a separate USB drive, then blow away the server and reinstall so that you know how it's configured. Terabyte disks are relatively cheap these days, so ~£100 would get you a pair which you can RAID (assuming it's a SATA array rather than SAS?)

N727NC
15th Nov 2010, 17:31
MB - Thanks for your thoughts, but I wish it were that simple. If I boot off a Live Disk, the partitioner recognizes the 2 disks as part of a RAID 1, but won't let me mount the RAID, saying that it cannot mount an unknown type of disk.

Is there a trick that I have missed?

Mike-Bracknell
15th Nov 2010, 23:11
MB - Thanks for your thoughts, but I wish it were that simple. If I boot off a Live Disk, the partitioner recognizes the 2 disks as part of a RAID 1, but won't let me mount the RAID, saying that it cannot mount an unknown type of disk.

Is there a trick that I have missed?

RAID1 is a mirror of 2 disks with identical information on both.

Break the mirror and mount an individual disk (if you can)?
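
If it comes to breaking the mirror, the usual Linux software-RAID route (from a live CD, mounting read-only so neither member disk gets written to) is sketched below. The device names are assumptions - check cat /proc/mdstat and fdisk -l for the real ones:

```shell
# Confirm the partition really is a software-RAID member and read its metadata
mdadm --examine /dev/sdb1

# Assemble a degraded, single-disk array and mount it read-only
mkdir -p /mnt/recover
mdadm --assemble --run /dev/md0 /dev/sdb1
mount -o ro /dev/md0 /mnt/recover

# With old 0.90/1.0 metadata (superblock at the END of the partition)
# the member can often be mounted directly, bypassing md entirely:
#   mount -o ro /dev/sdb1 /mnt/recover
```

These need root and real block devices, so treat them as a recipe to adapt rather than something to paste in blind.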

MG23
16th Nov 2010, 03:55
What directories did you have to delete? I believe the / partition on my home server is about 16GB and I've never had this kind of problem.

N727NC
16th Nov 2010, 15:31
MB - breaking the mirror might let me get to the data, but might render both disks useless, so I am anxious only to go there when I have exhausted all other options (but I may have already done so).

MG23 - stupidly, I didn't take a record. I used the du -csh /* 2> ~/filelog to find the largest directory and, happy that it was not important to me, deleted it. This last time I deleted one of my own data folders - which I no longer needed - but even that was insufficient to get a proper logon.

Still stuffed ..........

Mike-Bracknell
16th Nov 2010, 17:43
MB - breaking the mirror might let me get to the data, but might render both disks useless, so I am anxious only to go there when I have exhausted all other options (but I may have already done so).

Simply physically remove one of the disks from the server. Store it somewhere, and it can be your array master (for rebuilding the array) if you muck up mounting the remaining disk.

bnt
16th Nov 2010, 20:36
If you get it back, can you install the baobab (http://www.susegeek.com/utility/baobab-disk-analysis-tool-in-opensuse-gnomekde4/) application? This is a friendly GTK+ app for examining disk usage information. I use it on Ubuntu, though the link I have suggests that you'd need to compile it from source. Anything major stands out pretty obviously.

N727NC
16th Nov 2010, 20:40
I've recovered the machine using a 'failsafe' logon, sufficiently that I can take a backup off the data disks. When I've got the data I need, I'll flatten the server and start again.

Thanks to MB and MG23 - and the others before - for your support.

I still have no idea how it is filling several hundred gigabytes of disk in a few weeks. I'm sitting behind a Netgear firewall and the standard SuSe protections are running.

Thank you for that hint bnt - I'll certainly install baobab - it will help me to keep an eye on the disk consumption.

Unixman
17th Nov 2010, 09:53
I would also suggest that a simple "lsof" might well be useful. This command lists all open files on a system and you might well be able to identify which file is causing a problem; alternatively use fuser -c filesystem to list all the processes that have files open on filesystem

For example (from a Solaris box)

# lsof | more

COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
sched 0 root cwd VDIR 85,10 512 2 /
init 1 root cwd VDIR 85,10 512 2 /
init 1 root txt VREG 85,10 48952 1610 /sbin/init
init 1 root txt VREG 85,10 41088 4499 /lib/libgen.so.1
init 1 root txt VREG 85,10 51176 4537 /lib/libuutil.so.1
init 1 root txt VREG 85,10 23276 4494 /lib/libdoor.so.1
init 1 root txt VREG 85,10 143744 4526 /lib/libscf.so.1
init 1 root txt VREG 85,10 870760 4509 /lib/libnsl.so.1
init 1 root txt VREG 85,10 51780 4514 /lib/libnvpair.so.1
init 1 root txt VREG 85,10 37400 4528 /lib/libsecdb.so.1
init 1 root txt VREG 85,10 1640776 4480 /lib/libc.so.1
init 1 root txt VREG 85,10 101036 4510 /lib/libmd.so.1
init 1 root txt VREG 85,10 93924 4530 /lib/libsocket.so.1
init 1 root txt VREG 85,10 27100 4483 /lib/libcmd.so.1
<snip>

Always look for regular files (VREG above on Solaris; Linux lsof shows them as REG)




# fuser -c /var
/var: 965o 651o 603c 602o 588o 580co 520o 509o 478o 476o 472o 462c 303co 7o

Then use ps -ef | grep pid

# ps -ef | grep 580
smmsp 580 1 0 Nov 04 ? 0:04 /usr/lib/sendmail -Ac -q15m

No surprise that sendmail is writing to /var :8


BTW is swapd running?
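
One more Linux-specific check worth making, given the "disk full but the files don't add up" symptom: a process can hold a deleted file open, in which case df sees the space as used but du can't find it anywhere. A small sketch of how to spot this via /proc (lsof +L1 shows the same thing where lsof is installed):

```shell
# Create a file, hold it open on fd 3, then delete it
tmpf=$(mktemp)
exec 3>"$tmpf"
echo "still taking up space" >&3
rm "$tmpf"            # gone from the directory tree, space not yet freed

# The open-but-deleted file shows up under this shell's /proc entry
ls -l /proc/$$/fd | grep '(deleted)'

exec 3>&-             # closing the descriptor finally releases the space
```

Killing or restarting the offending process is what actually frees the space; deleting more files won't help.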

N727NC
17th Nov 2010, 21:56
My thanks to all for their constructive help - especially to UNIXMAN for suggesting lsof, which I'll be using when all is back to normal.

Progress to date is that I have rebuilt the OS disk from scratch, but I was unable to persuade the OS to let me rebuild the RAID. However, I managed to mount one of the RAID disks to recover the information - a full backup is running at the moment. When the backup is secure, I'll try to rebuild the RAID without reformatting both disks, in the hope that it will recover the mirror - any clues as to how to best go about this? At the moment one disk is Ext4 - I was obliged to reformat it after the initial OS rebuild, but the other still thinks it is part of a RAID - albeit I have managed to mount both disks as /tmp and /tmp1.

The current rig has a 3 GB partition mounted as /SWAP - would swapd be more efficient? (I think I see what you are getting at - might the swap file have grown to fill the data space? Answer is, I think, no - as I only have a fixed swap partition).

......."Weird Trip, Man" (Oora - Edgar Broughton Band, circa 1972).

Unixman
18th Nov 2010, 03:57
I would stick with a fixed swap partition unless there is a very good reason not to.

One thing you don't say is whether you are using software or hardware RAID for your mirroring.

N727NC
18th Nov 2010, 12:53
The RAID is a Linux software RAID. I can't find a menu on the Dell to implement a hardware RAID, which would have been my preference.

The OS is now running fine, and I am rebuilding the Samba server so that the various windoze clients can see their data again.

One issue is that because I mounted one of the drives as /tmp, there are lots of processes now using it, so the OS won't let me unmount it! Lesson to be learnt there, methinks. Any ideas on how to get out of this one?

rgbrock1
18th Nov 2010, 13:42
Not a good idea or practice to mount a disk in the /tmp mount-point.

Start over and use some other mount points (/temp and /temp2, say). If you don't start over you're going to have a host of issues.

Unixman
18th Nov 2010, 17:12
Make certain that you haven't got an entry in /etc/fstab for the mounted filesystem; if you have, remove it and revert /tmp to what it should be... then reboot

Mike-Bracknell
18th Nov 2010, 17:53
The RAID is a linux software raid. I can't find a menu on the Dell to implement a hardware raid, which would have been my preference

What PowerEdge is it? and what's the PERC in it? (or just quote the tag number on the reverse and we'll find out)

Usually, Dell provides drivers for use at build time for OSes with the PERCs, and you might find that the PERC BIOS is accessible at POST. CTRL-M?

N727NC
29th Nov 2010, 11:50
Thanks to all for your support - rebuilt machine now running like a dream and no repetition of full disk so far. Also, KDE4 seems to be behaving better than on the earlier build, so I'm guessing something became corrupted along the way. It's always frustrating when you can't identify the problem, but it seems to be behind us now.

Now, about that backup........

vulcanised
25th Feb 2011, 11:44
Or is it just the programmers?

London Stock Exchange halts trading over technical glitch - Telegraph (http://www.telegraph.co.uk/finance/markets/8347013/London-Stock-Exchange-halts-trading-over-technical-glitch.html)

Not the first time since the new system started.

Mike-Bracknell
25th Feb 2011, 15:23
Apparently it's all due to the migration to Millennium IT a few weeks back. Unlikely to be to do with the underlying OS.

mixture
25th Feb 2011, 17:04
vulcanised,

Before you start on your Microsoft fanboy mantras, I suggest you look into the history of Microsoft at the LSE, it's not exactly pretty. :rolleyes:

Look, it's simple. A computer is an idiot. It knows nothing. The number of humans and lines of code involved in getting a system at the LSE up and running is probably beyond your comprehension (we're talking everything from the BIOS up to the applications the LSE run and absolutely everything in between). It is an indisputable fact that, due to the ever-increasing complexity of computing itself, plus the complex environment of the LSE, the s**t will hit the fan. The thing is nobody knows when, how often, or how serious the error will be. In the vast majority of cases it's just little bugs that can be squashed without the journalists wetting themselves, but occasionally something will happen that the external system users will notice.

Sorry for the tone, but I think someone had to tell it like it is!

vulcanised
25th Feb 2011, 19:49
Microsoft fanboy mantras


I have never been accused of that and I'm most certainly not a fan of M$.

If you're not capable of making a point without such unwarranted drivel then you're better off remaining silent.

mixture
25th Feb 2011, 20:05
You're the one who posted a link to overhyped journalistic nonsense.

No further comment.

AnthonyGA
25th Feb 2011, 23:18
From what I've read, it looks like the problem is related to neither Microsoft's operating system nor Linux, but is instead a consequence of extremely poor IT management and design. A poor workman blames his tools.

You can build rock-solid systems on either type of operating system. You can also build garbage on either OS. If the previous system was using C# or .NET, those are already bad signs. A switch to Linux is also a bad sign. Both actions imply that the end user was simply trying to find the absolute cheapest, quickest "solution," without any regard for testing, safety, reliability, recovery, performance, etc.

You get what you pay for, and if you don't know how to write specs and/or don't know anything about IT, you usually get even less than you pay for.

MG23
26th Feb 2011, 00:50
A switch to Linux is also a bad sign. Both actions imply that the end user was simply trying to find the absolute cheapest, quickest "solution," without any regard for testing, safety, reliability, recovery, performance, etc.

Why would a switch to Linux imply that you weren't concerned about safety, reliability, performance, etc? I wouldn't run any important or safety-critical server software on Windows.

AnthonyGA
26th Feb 2011, 05:53
Why would a switch to Linux imply that you weren't concerned about safety, reliability, performance, etc?

Because it's usually motivated by an attempt to save money, since many flavors of Linux are free. This desire to save money betrays an attitude that is more interested in money than in safety, reliability, etc. If you adopt free software in place of payware, you have to assume many of the responsibilities normally taken by the vendor of commercial software. But organizations that switch to Linux to save money typically aren't willing to assume those responsibilities, and so things go wrong.

The total burden of work and responsibility is going to be roughly the same no matter which operating system you use. Many organizations, when they try to switch to free software, are naïvely trying to get something for nothing.

If they simply wanted UNIX, then the obvious choice would be some sort of commercially-supported version of UNIX, which would bring all the technical support and responsibility of a paid vendor with it. The fact that Linux was chosen instead strongly implies that the only motivation was lowering costs, with all other considerations taking a back seat. The catch is that you cannot lower costs that way, all you're really doing is shifting them around (instead of paying money to a third party, you'll be spending it on payroll for your own employees).

I wouldn't run any important or safety-critical server software on Windows.

Neither Windows nor Linux is appropriate for mission-critical or safety-of-life software. For that you either need a mainframe (in the case of business software) or an embedded system (for safety-of-life software). You can use Linux or Windows for the latter, but not just off the shelf. And for all potential Linux applications, I prefer UNIX or a UNIX descendant instead.

On the desktop, only Windows or (in some cases) Mac OSX is appropriate, unless the desktop role is very tightly and deliberately constrained. For servers, in most cases, I'd install UNIX or its immediate relatives. Linux is popular mainly for reasons that are unrelated to technical considerations. I wouldn't put Windows on a server unless it had to support something that runs specifically on Windows, such as Microsoft Exchange Server or Windows domain management.

Mac the Knife
26th Feb 2011, 06:18
"On the desktop, only Windows or (in some cases) Mac OSX is appropriate,......"

???????????????????

I'll inform my successful small business (mostly Linux, 2 Macs) immediately!

:ok:

Mac

Saab Dastard
26th Feb 2011, 11:07
I do find it strange that such an important system was based on Linux and Intel, rather than on 'NIX and high-end mid-range systems from HP, Sun or IBM.

It does suggest a penny-pinching approach or attitude that may have extended into the design, development and testing regimes.

As others have said, it's rarely the OS or hardware that's to blame, usually it's the design and implementation that's at fault.

SD

mad_jock
26th Feb 2011, 13:15
I also find it strange, Saab.

Although I visited an old site I used to work at not so long ago and found things very much changed.

All the Solaris servers were gone, replaced by... Linux, although to be fair it was a commercial flavour.

All the desktop SPARCs were gone, replaced by Linux desktops and MS laptops. There were loads of terminal servers doing database stuff.

The enterprise Exchange system, which had been the bane of my life at times, had been punted and it was back to using sendmail on the *nix systems with IMAP.

It all looked rather shoddy to me and certainly not what you would have expected in a blue-chip company's server room.

AnthonyGA
27th Feb 2011, 10:31
Although I visited an old site I used to work at not so long ago and found things very much changed.

All the Solaris servers were gone, replaced by... Linux, although to be fair it was a commercial flavour.

All the desktop SPARCs were gone, replaced by Linux desktops and MS laptops. There were loads of terminal servers doing database stuff.

The enterprise Exchange system, which had been the bane of my life at times, had been punted and it was back to using sendmail on the *nix systems with IMAP.

It all looked rather shoddy to me and certainly not what you would have expected in a blue-chip company's server room.

The common trait to all of these actions is an obvious, ill-informed, and greedy attempt to save money by any means conceivable. Unfortunately, any apparent savings in the short term will be offset by much higher costs in the long term. However, even accountants sometimes can't see the forest for the trees, and may not realize that higher costs could be the result of penny-pinching decisions made years earlier.

None of the changes you describe has any real technical justification. The use of terminal servers is especially irresponsible, although I've seen it often enough. In the old days, that was called "timesharing," but timesharing worked much better than terminal servers.

I'll inform my successful small business (mostly Linux, 2 Macs) immediately!

You can get away with all sorts of things in a small business, including many unconventional IT policies (except a lack of backups, which can be fatal to any business). When you are managing 60,000 desktops in 100 countries, however, the rules change.

Mac the Knife
27th Feb 2011, 14:25
"When you are managing 60,000 desktops in 100 countries, however, the rules change."

Indeed. With their record at this sort of scale I'd be worried about using Microsoft. And even considering their massive discounts to big business users it would be expensive - though I would write it off as a business expense and get the ordinary tax-payer to subsidise me (and Microsoft).

"You can get away with all sorts of things in a small business, including many unconventional IT policies..."

I hardly think Linux per se qualifies as unconventional nowadays - should we all then be restricted to commercial UNIX or Microsoft? Would it be "better" if I was using FreeBSD (which I considered) or are only commercial offerings acceptable?

The fact is that just about any modern OS is as good (or bad) is its implementation in a business. Crap sysadmins, slack security and lazy policies will make any system liable to instability, corruption and crashes no matter how much money you have paid for it.

Mac

Booglebox
27th Feb 2011, 15:45
No offence chaps but you are all stuck in the stone age. I don't know anybody using anything else apart from regular off the shelf Linux (when not using MS) for anything from small, fairly mission critical projects (radio station) to extremely large-scale web applications that serve tens of thousands of users.

Mike-Bracknell
27th Feb 2011, 16:38
No offence chaps but you are all stuck in the stone age. I don't know anybody using anything else apart from regular off the shelf Linux (when not using MS) for anything from small, fairly mission critical projects (radio station) to extremely large-scale web applications that serve tens of thousands of users.

Then your definition of 'mission critical' is different from AnthonyGA's (which incidentally is a pretty textbook enterprise view of things).
Also, a web app serving 10,000 users is not "extremely large scale".

AnthonyGA
28th Feb 2011, 08:21
Indeed. With their record at this sort of scale I'd be worried about using Microsoft. And even considering their massive discounts to big business users it would be expensive - though I would write it off as a business expense and get the ordinary tax-payer to subsidise me (and Microsoft).

I'm not sure what you mean by record of scale. Workloads are typically distributed over multiple servers in large enterprises. For example, Exchange is usually spread over dozens or hundreds of servers, and the same is true for domain control and validation. There are some enterprises with well over 100,000 Windows desktops in use.

I hardly think Linux per se qualifies as unconventional nowadays …

On the desktop, Linux is extremely unconventional, with only about 0.1% of the market. That percentage has not significantly changed in years, and unless some fundamental changes occur in the Linux world, the percentage will never change.

On servers, Linux is popular, because (1) it's cheap or free; (2) it has been very heavily hyped, especially by people who have never heard of UNIX; and (3) it looks a bit like UNIX (although UNIX fans will want the real thing, and I don't blame them).

Would it be "better" if I was using FreeBSD (which I considered) or are only commercial offerings acceptable?

Use whatever you want. I run FreeBSD on my server, and Windows XP on my desktop.

No offence chaps but you are all stuck in the stone age. I don't know anybody using anything else apart from regular off the shelf Linux (when not using MS) for anything from small, fairly mission critical projects (radio station) to extremely large-scale web applications that serve tens of thousands of users.

I don't know of any large organization successfully using Linux as a plug-in replacement for Windows desktops. Linux is not suitable for the desktop. The only organizations attempting this are those that are hellbent on "saving money," although there are many pitfalls to trying to move to Linux that can wash away savings and cost a fortune, as many organizations have discovered.

Linux is more popular on servers, for reasons already stated (unfortunately these reasons do not include technical superiority). A system serving ten thousand users is not "large-scale" by my definition, which comes from the world of mainframes. Even my own personal Web site serves thousands of unique visitors a day. A fairly good-sized company might have 40,000-80,000 desktops; a large company may have many more.

AnthonyGA
28th Feb 2011, 23:03
The fact that Linux may be cheap or free does not stop large companies having a business model that allows them to make some extremely good revenue out of the applications that run on that O/S. Go ask IBM about Linux on System z running under z/VM.

IBM makes a lot more money on its own proprietary operating systems. So do other companies. When they support or offer Linux, it's mainly to respond to a demand by IT managers who don't know any better than to insist on whatever they last read about in the trade rags.

The proprietary operating systems are often dramatically superior to Linux for a given type of job, too. Mainframe operating systems, for example, are extremely productive for the types of work for which they are designed, far more so than a generic OS like Linux. Even UNIX is a terribly poor choice for mainframes, and if misguided customers didn't insist on it, it wouldn't be in the catalog.

AnthonyGA
1st Mar 2011, 08:22
Been waiting for years for a commercially viable release of OSX that can run legally on any of the stuff that passes for a PC. Now that would be ideal, nice GUI on top, NetBSD underneath. Looks like I will still be waiting for a while though.

Probably. Apple—like Microsoft—has chosen to favor the desktop over the server, and the two goals are in direct conflict. Apple has written mountains of code to convert a UNIX-like command-line timesharing server into a single-user GUI-based desktop. It has succeeded admirably, but in so doing it has made its OS less and less suitable for use as a server. This isn't really a problem for Apple, which wants to sell desktops, anyway. The only reason Apple chose existing software as a basis for its new OS was that it couldn't afford to write a new OS from scratch (at the time). Writing an entirely new OS, as Microsoft did with NT, costs billions of dollars. Apple might have the cash for that today (no thanks to the Mac, but thanks to the iPod and iPhone), but it didn't back then.

Anyway, GUIs soak up much of the horsepower of any system equipped with them, irrespective of the OS beneath. Glistening, dancing, transparent 3-D GUIs may win beauty contests, but they are very expensive in terms of resources. Unfortunately, today's Windows is stuck with a GUI, which is one of the drawbacks that make it less suitable than UNIX and its ilk as a server. In some cases up to 80% of the processing horsepower of a system can be consumed by the GUI, so just having one on a server is a waste of money. Not only that, but many administrative tasks are much faster to carry out with a command-line interface than they are with a point-and-click GUI interface. Windows is very tiring to use as a server because it is impossible to avoid using the GUI for many tasks.

I'm not sure about the extent to which you can strip the GUI out of Mac OSX, but it's pretty much impossible with Windows today. XP, Vista, 7 … they all come from the NT code base for the most part, but over the years the rock-stable and super-secure NT code base (which is very well written) has been contaminated by imports from Windows 95, which was garbage. The original NT GUI was quite distinct from the kernel, and the system was very secure in consequence. Those days are gone. Both were progressively sacrificed for the sake of users who wanted a more "friendly" and "pretty" interface, which required gutting some of the security features to improve performance (whence DirectX et al.). I was never happy about that, but that's the way it went. The secure Program Manager and Explorer were discarded in favor of the mess from Windows 9x, destabilizing the system. This improved the "user experience" for Windows on the desktop, but put holes in the security for Windows as a server, and made the system more difficult to lock down. It's still more secure than OSX or Linux, though, by orders of magnitude.

Apple did the same thing with OSX, bolting on vast amounts of extra code to make it pretty and friendly, and thereby undermining the security and suitability of the core OS in a server or locked-down environment. UNIXoid systems aren't really secure to begin with, but adding a GUI makes them worse.

And there are many flavors of Linux that fall into the same trap, only the GUI is more primitive and less functional than that of OSX or Windows (not having billions of dollars' worth of top developers behind it). The fancier the GUI, the less suitable the system is as a server.

Other UNIX systems and clones are also doing this, and I don't know why. FreeBSD is run by a great many people with a GUI, which I suppose makes sense on the desktop (although why anyone would run anyBSD on the desktop is a mystery to me), but I run it strictly as a server, with just a simple command-line interface at the console and a few SSH sessions from my desktop, thereby allowing me to run plenty of stuff on a very small machine.

Anyway, the industry doesn't seem to want to accept that you cannot be all things to all people, and you cannot be the world's best desktop AND the world's best server. Until it faces this reality, you're going to have people running the wrong OS on the wrong systems, and companies encouraging them in their error.

Mike-Bracknell
1st Mar 2011, 08:32
Unfortunately, today's Windows is stuck with a GUI, which is one of the drawbacks that make it less suitable than UNIX and its ilk as a server. In some cases up to 80% of the processing horsepower of a system can be consumed by the GUI, so just having one on a server is a waste of money. Not only that, but many administrative tasks are much faster to carry out with a command-line interface than they are with a point-and-click GUI interface. Windows is very tiring to use as a server because it is impossible to avoid using the GUI for many tasks.

I'm not sure about the extent to which you can strip the GUI out of Mac OSX, but it's pretty much impossible with Windows today.

Windows Server Core: Overview | SerkTools (http://serktools.com/2010/01/20/windows-server-core-overview/) :ok:

le Pingouin
1st Mar 2011, 11:54
This improved the "user experience" for Windows on the desktop, but put holes in the security for Windows as a server, and made the system more difficult to lock down. It's still more secure than OSX or Linux, though, by orders of magnitude.

Oh please! :yuk:


And there are many flavors of Linux that fall into the same trap, only the GUI is more primitive and less functional than that of OSX or Windows (not having billions of dollars' worth of top developers behind it). The fancier the GUI, the less suitable the system is as a server.

You do know that with Linux the GUI is entirely separate from the underlying OS, don't you? Install server tools and it becomes a server; install a GUI and it becomes a desktop.
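That server-vs-desktop split is, on a modern systemd-based distro, just a choice of default boot target. A minimal sketch (`pick_target` is an illustrative helper, not a standard command):

```shell
#!/bin/sh
# Sketch, assuming a systemd-based distro: the same base OS boots as a
# headless server or a desktop depending on the default target you set.
# pick_target is an illustrative helper, not a real command.
pick_target() {
    case "$1" in
        server)  echo "multi-user.target" ;;   # console only, no GUI started
        desktop) echo "graphical.target" ;;    # display manager + GUI
    esac
}

# Apply with e.g.: sudo systemctl set-default "$(pick_target server)"
pick_target server
```

The GUI packages can still be installed on such a box; they simply never run unless someone asks for them.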

mad_jock
1st Mar 2011, 14:18
:D To be honest it's not a server if it has a keyboard and monitor attached.

Of the number that I used to regularly telnet into, I often only knew which country they were in and which IP address to telnet to.

Mac the Knife
1st Mar 2011, 15:14
50 Places Linux is Running That You Might Not Expect (http://www.focus.com/fyi/information-technology/50-places-linux-running-you-might-not-expect/)

Though I believe Munich has caved in to MS pressure....

Ho hum!

:ok:

AnthonyGA
2nd Mar 2011, 05:38
Windows Server Core: Overview | SerkTools

Well, that's certainly a welcome change, of which I was not aware. I haven't used Windows as a server in years, since the only (or main) reason for doing so would be to support native functionality on Windows desktops. I'm almost tempted to try it … but not quite. The tiny box I cobbled together from spare parts in order to run a FreeBSD server probably wouldn't even boot Windows.

You do know with Linux the GUI is entirely separate from the underlying OS don't you?

Yes, but separate or not, GUIs consume a great deal of resources and introduce many complications to the system. While this may be justifiable on a desktop, it's a tremendous waste on a server.

Sometimes it's even a waste on the desktop. In one of my earlier computers I had a Windows FTP application that never seemed to reach the 10 Mbps speed of the LAN for transfers. I finally discovered that it was spending most of its CPU time painting its window, and when I switched to the simple CLI version of FTP that comes with Windows, transfer rates immediately rose to the capacity of the link.

To be honest it's not a server if it has a keyboard and monitor attached.

That depends on your configuration and environment. While you certainly wouldn't put monitors and keyboards on every server in a server farm, if you only have one server (as I do), it's not a big deal to have a separate monitor for it. I do share the keyboard, though. It also has its own mouse, although I'm not sure where I put it (I never have a use for the mouse on the server).

Even so, I do SSH into the server from the desktop most of the time, as this is more flexible and makes it easier to have multiple "terminals" connected.

Which is why if you read my post I said running 'under z/VM' - in other words as a guest OS in a totally separate LPAR.

Guest OS or not, it's still not a mainframe OS. It's just Linux. Little server operating systems are lightyears away from mainframe operating systems.

Up until around the mid 70's, when MVS 3.8 existed, it was public domain and able to be installed on any competitor's hardware - Amdahl, ITEL.

Freely available and public domain are not the same thing. While earlier versions of MVS were published and readily available (I think—it's been a long time), I should be very, very surprised if IBM ever released anything into the public domain. I'm not sure if anything would have entered the public domain on its own. In any case, it's not as if one can install MVS on a PC or a PDP-11/70 (without emulation).

mad_jock
2nd Mar 2011, 08:10
In general I would only ever have a monitor etc. attached when the build was being done, unless of course I had already built it virtually and it was a squirt job.

Most places have one, and I forget the system name, but you hit the Ctrl key twice and you can get access from a goldfish-bowl 14" screen and a biohazard keyboard. I never used it but have wired it up, and then reallocated the fancy new monitor which had been purchased for the server room. Huge screams from the Windows boys, but as the racking etc. came off the *nix budget, and it used to stop the CEO's PA moaning about doing spreadsheets on an admin-standard monitor, they could go and sing.

On the subject of GUIs, it used to fill me with joy when starting at a site to find the server room full of "pipes" screensavers. You just knew you were in for months of teaching folk that didn't know how to suck eggs how to suck eggs.

le Pingouin
2nd Mar 2011, 11:33
Yes, but separate or not, GUIs consume a great deal of resources and introduce many complications to the system. While this may be justifiable on a desktop, it's a tremendous waste on a server.

How is that relevant if you don't have the GUI running or don't even install it?

As you commented SSH in from a desktop or if you really want a GUI on the server run something light on resources & only run it as needed.

AnthonyGA
2nd Mar 2011, 16:30
How is that relevant if you don't have the GUI running or don't even install it?

It's not, but the point is that GUIs are sometimes installed and running on servers. That's certainly true for Windows servers, and the inexperienced administrator might also have Linux or UNIX servers installed with GUIs, particularly if they were set up with some sort of default install that always puts in a GUI. As mad_jock has indicated, if you start one day at a new job and you see a server room filled with screens and screen savers (either is a bad sign), it tells you something about the people already running the place. It's a bit like configuring your server farm to run SETI@home.

I don't think UNIX or Linux systems should ever have default installations that put in any type of GUI. If you are running these operating systems and you don't know how to set up a GUI yourself, you don't know enough to be using these operating systems. I know that this is often done to encourage the use of these operating systems on the desktop, but they are not suitable for the desktop. The obvious exception is OSX, which has UNIX-like underpinnings but has nevertheless been heavily modified to serve more or less exclusively as a desktop (if you remove the GUI from a Mac, well, why bother paying for a Mac?).

AnthonyGA
2nd Mar 2011, 19:27
Incorrect.
In the 70's OS software was provided FOC when the client purchased the hardware.

That does not mean that it was in the public domain.

Since then MVS 3.8J has been readily available under public domain along with all associated software including compilers etc to anybody who wants it.

Again, making something freely available does not place it in the public domain. That is a common misconception.

However, I did some research, and it appears that some versions of MVS may have fallen into the public domain under current copyright law (which required, at one time, that copyright notices be placed on copyrighted materials published in the past in order to retain copyright protection). Apparently IBM took no steps to ensure copyright protection of some code (in the days when steps were still necessary) and has not attempted to assert copyright in some cases.

The source code for mainframe operating systems historically has been more or less public, to allow customers to modify code and no doubt because proprietary mainframe source code isn't of much use to anyone who doesn't have the corresponding hardware. However, publishing source code is not equivalent to placing something in the public domain.

AnthonyGA
3rd Mar 2011, 05:29
Glad that you have satisfied yourself with the fact that IBM have software in the PD.

I had already mentioned the possibility of software falling into the public domain, but I'd be surprised if IBM ever explicitly released it to the public domain. In the early days of computing, at least some types of software weren't considered very important in terms of intellectual property. The objective of vendors was to sell hardware, and an operating system was just a necessary evil in order to get the hardware sales. Later, operating systems became important pieces of intellectual property in their own right. The potential for sales-damaging misuse of proprietary mainframe operating systems has always been self-limiting, anyway.

IBM do not make the source code of their OSs, for example z/OS, available to anybody.
For one thing, much of it was written in PL/S, which is itself only available to staff within Big Blue.

Note the word "historically" in my post. MVS was written in assembler, as I recall, although I never did any tweaking of the OS myself.

mad_jock
3rd Mar 2011, 07:09
MVS was written in assembler

And hence it was fast as hell and used next to no resources.

mixture
3rd Mar 2011, 07:33
Rubbish. IBM do not make the source code of their OSs, for example z/OS, available to anybody.

Yeah, and that's what they said about Microsoft until the Chinese and Russian governments came along and twisted their arm a bit. :ok:

To be honest its not a server if it has a keyboard and monitor attached.

An IP KVM and IP PDU (and/or IPMI) in a datacentre can be the difference between sitting in the comfort of your own home (or office) diagnosing servers and having to stand around in a noisy, dull environment.

For the majority of IT environments, having console-level keyboard, mouse and monitor access to your server at all times is a must.

le Pingouin
3rd Mar 2011, 08:43
Anthony, Linux may not be suitable for your desktop but it's been working nicely for me for 12 years. Horses for courses. I'm interested why you think a clueless Ubuntu user is any worse off than a clueless Windows user? It's easy enough for them to get in over their heads in either case.

mad_jock
3rd Mar 2011, 09:00
Even the commercial flavours of Linux don't really cut the mustard at high loads.

As soon as you start going over a load average of about 0.7, your log files start filling up and it starts chuntering. You can get away with running what would be high workloads on an MS machine at very low loads on a Linux box, which makes the clueless think it's a straight swap, and they are happy. Then they stick a thin front-end DB application on it and the whole thing goes to rat poo.

Linux as a desktop I think is cracking; as a server it really doesn't get anywhere near the grade of, say, a Solaris enterprise box which it is meant to be replacing. If you want a fly webserver setup, it can manage. Anything serious and it falls over.

Actually I think Linux does protect the user better than Windows does. You can run it in dafty mode and it won't let you compromise the OS. The simple fact that it installs to running a normal user account by default removes 80% of the potential for murdering it. Add in packages, repos and auto-updates, and the potential for disaster is significantly less.

And mixture, my experience is the opposite of yours. The folk who have access to the hardware are extremely limited, far more so than root access. And it's not uncommon to have a 24h hardware standby as well as a sysadmin standby. Some of the gurus I have worked with I wouldn't trust to rewire a plug, never mind pull a machine out of the rack and hot-swap a processor or network card. But get them writing scripts and doing fancy poo with NIS+ etc. and they leave me in their wake.
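For reference, the load averages discussed above are the first three fields of /proc/loadavg on Linux. A quick sketch to flag a box running over a chosen threshold (the 0.7 figure is just the poster's rule of thumb; load should really be judged relative to core count):

```shell
#!/bin/sh
# Read the 1-minute load average and compare it against a threshold.
# THRESHOLD is illustrative; tune it per machine and per core count.
THRESHOLD="${THRESHOLD:-0.7}"

# First field of /proc/loadavg is the 1-minute load average.
load1=$(awk '{print $1}' /proc/loadavg)

# awk does the floating-point comparison that plain sh cannot.
if awk -v l="$load1" -v t="$THRESHOLD" 'BEGIN { exit !(l > t) }'; then
    echo "load $load1 over threshold $THRESHOLD"
else
    echo "load $load1 ok"
fi
```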

mixture
3rd Mar 2011, 09:39
And mixture my experence is the opersite to yours.

Fair enough, different circumstances, different experiences.

However, let me give you a concrete example:

Say I'm doing a PXE install of Windows 2008 (no, I'm not going to get into OS wars here; both Linux and Windows have their place) onto a rack full of HP ProLiant DL380s.

One of them is bluescreening.

The ability to (a) see the bluescreen, (b) do a hard power cycle, (c) PXE boot some diagnostic tools and play with BIOS settings... all from hundreds of miles away... is priceless, and enables me to decide what replacement parts need to be sent to site.

Or if you want an example from the non-Windows world... remote upgrades of OpenBSD boxes, booting into single-user mode on Linux, firmware upgrades, etc.
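That kind of out-of-band access can be driven from the command line via IPMI. A sketch using ipmitool, where the BMC address and credentials are placeholders and the `ipmi_cmd` wrapper just prints the command it would run (a dry run):

```shell
#!/bin/sh
# Sketch of out-of-band server control over IPMI. The BMC address and
# credentials are placeholders; ipmi_cmd echoes instead of executing,
# so you can see what would be run against the real BMC.
BMC_HOST="${BMC_HOST:-192.0.2.10}"
BMC_USER="${BMC_USER:-admin}"

ipmi_cmd() {
    echo "ipmitool -I lanplus -H $BMC_HOST -U $BMC_USER $*"
}

ipmi_cmd chassis power status   # is the box powered up?
ipmi_cmd chassis power cycle    # hard power cycle, bluescreen or not
ipmi_cmd sol activate           # serial-over-LAN console session
```

Drop the echo (and add `-P` or a password file) to run these for real; the subcommands shown are standard ipmitool ones.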

Even the commercial flavours of Linux don't really cut the mustard at high loads.
If you want a fly webserver setup, it can manage. Anything serious and it falls over.

Oh right... why do people like Amazon run mission-critical Oracle RAC Databases on Linux running on commodity HP Proliants then ? What a load of codswallop, Linux is perfectly capable of supporting loads.

Want more examples ? How about ITV's F1 website... that uses a linux platform for load balancing....

4000 concurrent connections to a video-rich site. CPU utilisation 1%, memory utilisation 500MB of the 8GB available. Pair of HP DL360 G5s with 8GB RAM and 1Gb NICs (in active/passive config).

Another application (same hardware spec):

1.2 million transactions per minute
72 million transactions per hour (including in excess of 350 SSL terminations per second), sustained 750Mbps.

Mike-Bracknell
3rd Mar 2011, 09:56
Anthony, Linux may not be suitable for your desktop but it's been working nicely for me for 12 years. Horses for courses. I'm interested why you think a clueless Ubuntu user is any worse off than a clueless Windows user? It's easy enough for them to get in over their heads in either case.

That's an easy one to answer.

TCO.

If you have an office, and you want to employ office workers, and part of the requirement is that they're competent on the OS you deploy.....you try finding an office full of workers that have SEEN a Linux desktop, let alone used one.

If you then want to support said office, you need a support weenie. Try finding a range of those that support Linux (and not just one: several, so you have a choice of them should you need to let one go).

Obviously there are specific applications for which a Linux desktop is a good idea (code monkeys, designers, etc), but for the majority of mainstream application support, Linux is a poor choice.

mad_jock
3rd Mar 2011, 10:16
Sorry mixture we are on about the same thing

The stuff I used was KVM, and yes, I fully agree that's the way forward. My point was that the admin shouldn't have to have his bum in the server room, so multiple screens, and god forbid someone actually working on them, is a no-no.

Bar having to get a screwdriver out to fix something, you should never be in the room.

And for the folk that don't know what we are on about.

Specialist suppliers of IP solutions, including KVM over IP and remote server monitoring. Call us now 01202 872771 (http://www.itm-components.co.uk/kvm-over-ip.html)

Got a bit fancier since I worked with them but the principle is still the same.

mixture
3rd Mar 2011, 10:20
Sorry mixture we are on about the same thing

No worries. Although I was secretly looking forward to finding out what mysterious ways you had to resolve a bluescreen without one. I thought you might be about to teach me how to suck some different eggs!

mad_jock
3rd Mar 2011, 10:23
Install a solaris server would be my first suggestion :D

le Pingouin
3rd Mar 2011, 10:25
Mike, in that case the next version of Windows won't be suitable for the desktop either ;)

In most work environments the systems are so locked down that OS competency is meaningless, because you can't do anything other than use the apps. And how is using a GUI-based app in Windows vastly different from using one in Linux? You point, you click, you type, you save.

Anthony was flatly saying Linux wasn't suitable for use as a desktop system, which is plain wrong. Deployment in specific situations is a different matter.

mad_jock
3rd Mar 2011, 11:09
And to be honest, letting some users loose on apps is really counterproductive to generating profit.

For one particular witch, I worked out that we could employ an additional two office grunts if we didn't have to have a floor-walker visit her six times a day.

AnthonyGA
3rd Mar 2011, 19:08
And hence it was fast as hell and used next to no resources.

That's the permanent advantage of assembler. If you want to see how fast PCs really are these days, run something on them that is written in assembly language, and be amazed. Bloatware grows so quickly that today's PCs don't really get normal tasks done any faster than they did 25 years ago. Every increase in hardware speed has been eclipsed by an equal or greater increase in software bloat.

I'm interested why you think a clueless Ubuntu user is any worse off than a clueless Windows user? It's easy enough for them to get in over their heads in either case.

The list of reasons why Linux is a terrible choice for the desktop is too long to provide here. Suffice it to say that Linux among ordinary users is asking for trouble: a support and administrative and strategic nightmare. And there are no compensating advantages.

AnthonyGA
4th Mar 2011, 09:13
That comes across as a very bigoted view.

I have no emotional attachment to operating systems, so bigotry is not an issue. If I'm supporting or administering a hundred thousand desktops, I simply want something that minimizes the support calls and administrative headaches for me and my staff. Linux is a very poor choice in view of those objectives.

The reality is that, in 99.999% of cases, Windows is indeed the right tool for the job, irrespective of any visceral dislike that some people may feel for the operating system or its vendor.

le Pingouin
4th Mar 2011, 09:56
Anthony, you seem to be out there by several orders of magnitude as well. Becoming a bit of a habit when you're discussing Linux. Anyone would think you really don't like it, in a rather visceral manner. I'd call it bigoted.

Not everyone shares your objectives.

AnthonyGA
4th Mar 2011, 10:41
Anthony, you seem to be out there by several orders of magnitude as well.

Given that Linux represents less than 0.1% of desktops, my own assessment is clearly congruent with mainstream opinion. Linux has made no significant inroads on the desktop in the two decades of its existence. Unless there are some very fundamental and significant changes in the Linux world and the IT market at large, that will never change.

Not everyone shares your objectives.

True. Apparently about 0.1% of desktop computer users do prefer Linux. But the other 99.9% seem to agree with me.

le Pingouin
4th Mar 2011, 11:11
Still out by an order of magnitude.

Operating system market share news (http://www.netmarketshare.com/operating-system-market-share.aspx?qprid=8)

mad_jock
4th Mar 2011, 13:18
The three sites I have done for small airlines since I was working full-time in IT have all been Linux desktops.

It saves a bloody fortune on licensing, and if you ever get a software audit visit, they just turn round and walk out the door when they find out what you are using.

The users, I must admit, hate it with a passion, mainly because they can't screw around with it and it's not Windows, and the burds have issues with not having Office and using OpenOffice instead. But the boss quite quickly likes it after zero downtime and no money for licenses going out. It's a helluva lot more stable network to run without needing all the back-office staff that MS requires. Once it's set up, 30 mins a week does for the housekeeping. I just had a sudo script for creating users that HR used if they had a new start. 30 seconds after walking in the door everything was good to go, from email through to FTLs.
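A script of that shape can be sketched in a few lines. Everything here (the group names, the mail path, the dry-run wrapper) is illustrative rather than the actual script described above:

```shell
#!/bin/sh
# Sketch of a one-shot "new starter" script of the kind described above.
# Group names and paths are illustrative, not the original script's.
# DRYRUN=1 (the default here) prints the commands instead of running them;
# set DRYRUN=0 and run as root (e.g. via sudo) to apply for real.
DRYRUN="${DRYRUN:-1}"

run() {
    if [ "$DRYRUN" = "1" ]; then echo "$*"; else "$@"; fi
}

new_starter() {
    user="$1"; fullname="$2"
    run useradd -m -c "$fullname" -G staff,office "$user"  # account + home dir
    run passwd -e "$user"            # force a password change at first login
    run mkdir -p "/srv/mail/$user"   # per-user mail area, as an example step
}

new_starter jbloggs "Joe Bloggs"
```

HR only ever needs to supply a username and a full name; everything else is baked into the script.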

Mike-Bracknell
4th Mar 2011, 13:22
Anthony, you seem to be out there by several orders of magnitude as well. Becoming a bit of a habit when you're discussing Linux. Anyone would think you really don't like it, in a rather visceral manner. I'd call it bigoted.

Not everyone shares your objectives.

I'm just back from a meeting with a sales manager who's setting up a new company in the UK. He's fresh from doing the same thing in Ireland, and I couldn't help but smile when he said the following, unprompted:

"When we started the company in Ireland, we got a guy in to do the whole IT for us. He stuck in Linux desktops and Open Office and said 'it'd be just the same'. It's a pile of ****. It won't even run my laptop properly. I just want something that's normal and works".

mixture
4th Mar 2011, 14:53
Open Office and said 'it'd be just the same

Yeah, when will people realise, it's NOT the same thing.

I've seen IT managers sucking up to CFOs by saying they will deploy OpenOffice for all new recruits and save $$$ ... all well and good for the secretaries and general office dogsbodies ... but when you start recruiting for a new trading desk, it suddenly dawns on you that OpenOffice is rubbish at more advanced stuff such as macros etc. The traders then proceed to eat the IT department for breakfast when they lose deals.

Choose the right tool for the job. A proper IT manager should put licensing costs as a secondary concern behind the consideration as to whether or not the software is actually suitable for its intended purpose.

Just because you might look at OpenOffice through rose-tinted glasses doesn't mean everybody is shortsighted and needs glasses too. :cool:

Mac the Knife
4th Mar 2011, 22:44
"If you are heavy into extensive VBA macros......"

Indeed. But the fact of the matter is that 99% of the people who use an office suite are not, and Open/LibreOffice works just fine. Our seccys adapted within a couple of days.

A mainstream Linux distro is now easy to install, easy to configure, easy to administer, very stable and considerably more secure than anything but the most carefully locked down Windows (which 99.99% are not).

The venom displayed in this thread is odd.

There is absolutely no reason (apart from MS's widely documented anti-competitive practices) why the world should be tied to Microsoft's OS and apps for ever and ever.

Mac

AnthonyGA
5th Mar 2011, 00:50
The venom displayed in this thread is odd.

For some people, computers are a way to get things done. For others, they are a religion.

Still out by an order of magnitude.

It was "several orders of magnitude" a few posts earlier. What will it be next?

MG23
5th Mar 2011, 18:03
Given that Linux represents less than 0.1% of desktops, my own assessment is clearly congruent with mainstream opinion. Linux has made no significant inroads on the desktop in the two decades of its existence.

Nobody even knows how many Linux desktops exist, because there are no sales figures to count. I have multiple Linux desktop machines at work and we have a Linux desktop at home, a Linux laptop, a Linux server and a Linux netbook... I doubt any of them are reporting statistics to anyone. Occasionally I boot the laptop into Windows to edit video or play a game that doesn't run on Linux, but otherwise I don't miss it.

And the desktop market itself has peaked; there'll be desktop computers for a long time to come, but the big growth is likely to be in various kinds of mobile hardware. Linux has a substantial presence in the mobile market while Windows has hardly any; I've seen iPhones and Android phones but I've never ever seen a Windows phone.

Saab Dastard
5th Mar 2011, 19:02
but I've never ever seen a Windows phone.

Lucky you!

I had the misfortune to have been given a work phone with Windows mobile or CE (I honestly can't remember which) that was so bad that I quietly took the SIM card out and back into the Nokia 6310 that I had "forgotten" to hand back!

SD

Mike-Bracknell
5th Mar 2011, 23:51
And the desktop market itself has peaked; there'll be desktop computers for a long time to come, but the big growth is likely to be in various kinds of mobile hardware. Linux has a substantial presence in the mobile market while Windows has hardly any; I've seen iPhones and Android phones but I've never ever seen a Windows phone.

fyi...

File:Smartphone share current.png - Wikipedia, the free encyclopedia (http://en.wikipedia.org/wiki/File:Smartphone_share_current.png)

and whilst early WinCE and WM devices were dire, devices later than WM5 were generally "ok" and Windows Phone 7 is comparable* to an iPhone.

(*as in the same ball park)

Booglebox
6th Mar 2011, 01:09
Nobody even knows how many Linux desktops exist, because there are no sales figures to count. I have multiple Linux desktop machines at work and we have a Linux desktop at home, a Linux laptop, a Linux server and a Linux netbook... I doubt any of them are reporting statistics to anyone. Occasionally I boot the laptop into Windows to edit video or play a game that doesn't run on Linux, but otherwise I don't miss it.

And the desktop market itself has peaked; there'll be desktop computers for a long time to come, but the big growth is likely to be in various kinds of mobile hardware. Linux has a substantial presence in the mobile market while Windows has hardly any; I've seen iPhones and Android phones but I've never ever seen a Windows phone.

Figures given for PC OS marketshare are usually based on useragent data from people visiting the Akamai top 500 or something like that, I think.

Must call you out on your phone statement - looking at the diagram linked to by Mike Bracknell, only Android is linux-based. Symbian uses a proprietary kernel and iOS uses the Darwin kernel, which as you probably well know is a derivative of a derivative of Unix, but not Linux.

mad_jock
6th Mar 2011, 07:31
Kindles are Linux and there are millions of them out there.

I agree that the number of PCs with Linux installed on them - and for that matter Windows - is misleading.

The fact that virtually every shop machine is supplied with Windows screws up the figures. How many these days get wiped and Linux put on them as the first job, who knows.

AnthonyGA
6th Mar 2011, 07:59
Nobody even knows how many Linux desktops exist, because there are no sales figures to count.

Sales figures do not equate to operating desktops, as anyone who has wiped a pre-installed OS to install something else knows.

Since many desktops have Internet access, the number of operating Linux desktops can be inferred from user agent information provided to Web sites. On my own site, with about 1.6 million unique visitors per year, Linux was reported as the OS of exactly 0.9% of visitors in 2010. Windows represented 84.62% of visitors, and Mac (all versions) represented 11.2% of visitors. About 0.000334% of visitors were still running OS/2, which was a favored underdog OS until its fans (if they were old enough to remember OS/2) switched largely to Linux.

I have multiple Linux desktop machines at work and we have a Linux desktop at home, a Linux laptop, a Linux server and a Linux netbook... I doubt any of them are reporting statistics to anyone.

And I doubt that your arrangement is even remotely representative of the norm. Most people have just one computer, if they have any computer at all, and it typically runs under Windows. People running Linux are often running multiple machines. People running Windows typically have just one machine. I have the impression that Mac users may be slightly more likely than Windows users to have multiple computers (all Apple products), because Mac users are more likely to be fans rather than simple users.

And the desktop market itself has peaked; there'll be desktop computers for a long time to come, but the big growth is likely to be in various kinds of mobile hardware.

I agree. One of the unspoken realities of personal computing is that PCs are designed to assist with intellectual tasks … and there are a great many people who never engage in intellectual tasks. Desktop computers appeal to people who are somewhat more intelligent than average, just as home gym equipment appeals to people who are in somewhat better physical condition than average. This is why computers are gradually converging towards entertainment devices (have you noticed that almost all monitors sold today have aspect ratios that are useless for anything other than watching TV and movies?).

Most people don't even need an application like Word—because they never write anything. Indeed, in the U.S., about 30% of the population is functionally illiterate and can barely do anything with a computer at all.

mad_jock
6th Mar 2011, 10:38
I wonder how many computers are used just purely for looking at porn.

sea oxen
6th Mar 2011, 12:20
On my own site, with about 1.6 million unique visitors per year, Linux was reported as the OS of exactly 0.9% of visitors in 2010
Let's face it, billgatesisgod.com or howdoifixmybluescreen.net could skew the results a bit :)

Several years ago, I was asked whether we should switch our desktop environment of about 15,000 machines to Linux. Although I was in favour, my advice was that it would be courageous - in simpler terms, lunacy (or should that be linuxy?)

In an infrastructure role, our Windows machines represent a significant overhead in terms of manpower. The Unix crew know more about the MS side of things than they'll let on, but they won't cooperate out of pride and petulance (a good title for a novel), and playfully point out the deficiencies in Uncle Bill's OS - the MS machines have more patches than a Scotsman's condom. I get to manage both tribes. That sucks like a failed actress on crack in a bus station.

When Mrs SO bought her latest laptop, it came with Windows 7. She expected me to wave the Fedora wand at it, as I've done in the past - but since this one wasn't a hand-me-down from me, as they have been in the past, I insisted that she keep it in its original state.

Dumb.

The afternoon Mrs SO was leaving for a four-week trip, she called me, quite distraught. Her profile was buggered, and she couldn't get in. There is a registry key you can unbugger to repair this, but you need safe mode or the admin uid to do it. This wasn't really feasible over the telephone from the office.

At work, I can afford a team of thirty people to run around picking up the dog's eggs that MS OSes produce. I can sustain the mandatory weekly reboot of our MS desktops, because I am being paid for it. No such luxury at home, and that's why I am in high demand amongst my friends because 'SO works with computers'.

I put MS OSes on desktops at work in the same category as minorities recruitment and H&S. A pile of crap, but a sandwich with some sh!t in it has more nutrition than without. Adopting Linux on corporate desktops is dropping the soap in the shower, because with Windows you just blame Bill. With Linux, it's you who'll be Mr Sh!tty Operating System.

SO

Cheerio
6th Mar 2011, 20:22
The fatc that virtually ever shop machine is supplied with windows screws up the figures. How many these days get wiped and linux put on them as the first job who knows.

Well there is one here - I've just bought a new Ideapad last week, pre-installed with W7 Starter. Due to timing it has had a short reprieve. I must admit that it looks pretty slick, but it's on borrowed time. In 3 more days Opensuse 11.4 is released and that's what is going on. No dual boot nonsense, a clean install. I've been Suse since 9.3 on all my PCs, desk and lap.

hellsbrink
18th Mar 2011, 16:24
Ok peeps, need our resident Linux bods now.

I got bored and turned the desktop machine into a dual boot Win 7/Ubuntu 10.10 (Maverick Meerkat) system. Everything is pretty much ok, on both sides, as far as bootup, etc, goes, BUT the wireless connection on Ubuntu is a bit erratic at times.

I'm using one of these Sitecom USB Wireless N dongles (Yeah, I know, these USB dongles ain't the best but it's what I have and since the router is in the same room I am not just dropping a signal due to walls, etc) which uses a Ralink chipset and, on bootup, it runs fine then drops the connection. After a few minutes faffing around it sometimes comes back but other times I have to do a full restart to get a reliable connection.

I've done the obvious things like look at the settings, and all seems ok there, the dongle has no issues on Win 7 and will happily stay connected for as long as the puter is on so that leads me to think the issue is something to do with Ubuntu. Any suggestions?

PS. Chose Ubuntu because I got a corrupt download of OpenSuse from their site and am too close to my "traffic limit" to pull down another 4.8Gb to get a working version.

Mike-Bracknell
18th Mar 2011, 17:28
Sounds like a driver issue, rather than a Linux one.

hellsbrink
18th Mar 2011, 17:44
That was what I thought, but since I'm a noob at Linux I don't know which way to turn. Followed the instructions to install a Windows driver through the ndisgtk package thingie and now I have NO wireless at all and can't "undo" what I did (back on Win 7 now).

Ho hum, time to hit some other forums and get what they say.
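[For anyone stuck at the same point: backing out an ndisgtk/ndiswrapper install can be sketched from the terminal roughly as below. The driver name `rt2870` and module name `rt2800usb` are placeholder guesses for this dongle, not certainties - `ndiswrapper -l` shows what was actually installed.]

```shell
# List the Windows drivers ndiswrapper has installed
ndiswrapper -l

# Remove the one that was added (name is a placeholder; use what -l reports)
sudo ndiswrapper -e rt2870

# Unload the wrapper module so it stops claiming the dongle
sudo modprobe -r ndiswrapper

# Let the native in-kernel Ralink driver take over (module name may differ)
sudo modprobe rt2800usb
```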

Mac the Knife
18th Mar 2011, 17:48
Looks like support for this chip has been included in the kernel since Linux 2.6.18 (http://kernelnewbies.org/Linux_2_6_18). You shouldn't need an external driver or Ndiswrapper.

See Linux Kernel Driver Database: CONFIG_RT2X00: Ralink driver support (http://cateee.net/lkddb/web-lkddb/RT2X00.html)
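[A quick way to check whether one of the in-kernel rt2x00 modules has actually claimed the dongle - the module name patterns below are the usual Ralink ones, not a guarantee for every chipset revision:]

```shell
# Which Ralink modules are loaded?
lsmod | grep -E '^rt(2|61|73)'

# What did the kernel say when the dongle was plugged in?
dmesg | grep -i -E 'ralink|rt28|wlan' | tail -n 20

# Confirm the USB device is visible at all
lsusb | grep -i ralink
```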

Ask around the Ubuntu forums rather than here!

:ok:

Mac (Mepis Linux - MEPIS | A Linux operating system based on Debian Stable (http://www.mepis.org/))

hellsbrink
18th Mar 2011, 18:03
Yeah, Mac, the driver is built in to Meerkat BUT it doesn't explain the random "dropout" of the connection.

That's the bit that confuses me, but after a hunt around I MIGHT have found a solution......

Oh well, gives me something to do!!

Mike-Bracknell
18th Mar 2011, 19:26
First thing to try - upgrade the wireless AP (or router)'s firmware. :ok:

Saab Dastard
18th Mar 2011, 23:52
Linux and wireless does seem to be quite hit and miss, especially with old(er) kit.

There are a lot of problems with chipset manufacturers not releasing Linux drivers (why???) and being the opposite of helpful in allowing the Linux community to access their code to develop their own (again, why????).

In many cases, the answer has been to use the windows NDIS driver with a Linux wrapper, but this is not entirely satisfactory.

I've tried and failed several times to get various Linuces working with wifi cards on desktop and laptop PCs - OK, not recent kit, about 8-9 years old (but that's all I've got to play with right now).

On the bright side, you will immeasurably increase your understanding of Linux and networking if you persevere (and hopefully succeed), but that may not actually be your objective!

Good luck!

SD

Mac the Knife
19th Mar 2011, 00:13
"I have 3 different Ralink based USB NIC's, with a 2500, a 2571, and a 2870 based chipset.

The drivers included in every kernel in every distro I have ever hopped to are pardon my language, total crap.
Anyone who has used one of these NIC's for more than an hour knows what I'm talking about (A quick google search can be enlightening). Low signal levels, and frequent disconnects requiring the module being unloaded and reloaded. Being the cheap ass I am buying a Atheros based NIC isn't in my agenda so my solution has been to obtain the driver source from Ralink themselves (http://www.ralinktech.com (http://www.ralinktech.com/)), these drivers are rock solid and why they are not included in the kernel I don't know, maybe someone else can enlighten me."

Go to - Ralink corp. (http://www.ralinktech.com/support.php?s=2)

Keep us informed.

:ok:

hellsbrink
19th Mar 2011, 06:26
Well, had no help from linux forums and am now too peed off with the random connection drops so the Meerkat is gone.

Will be getting another distro soon. I already have PCLinuxOS so might try that, but methinks I'll be looking at something like Open Suse, because the last thing you want is to be farting around like this when you ain't sure about what you are doing. The learning can come after a basic WORKING setup is operational, and it don't look like that is easily possible with Ubuntu.

You would think these issues would be fixed by now, the Ralink chipset problems are not exactly new.....

hellsbrink
19th Mar 2011, 06:48
Ok, have the PCLinuxOS live CD running so I'll see how the connection stays with that. So far it's been up for 3 mins and that's more than the average for Ubuntu.

hellsbrink
19th Mar 2011, 12:23
Well, latest PCLinuxOS distro installed as a dual-boot, absolutely no issues with the wifi, starting to configure it to suit what I want. Am happy (so far)

The wifi issue must have been an Ubuntu one, so that won't get installed again anywhere.

Now it's time to start learning again since it's been a heck of a long time since I really did any DOS type things.......

Thanks for trying anyway, guys

Justiciar
6th Apr 2011, 11:28
I have recently "converted" to Linux, having had my computer trashed by a very virulent virus late last year.

Everything appeared OK until a recent, potentially catastrophic problem with a full root partition, which with little or no warning prevented me from logging on. Having managed to solve the problem without knowing how, I have to say that my experience is now rather jaded.

From an aviation perspective the obvious issue is there being very little available by way of aviation applications written for Linux. About the only one I can think of is Notam Plot. In terms of flight planning software NavBox will just about run under Wine, but nothing else will, especially Skydemon, which has to be the most advanced VFR planning software currently available.

Generally, achieving the same functionality as Windows 7 has been a chore. For example, getting the right version of Java to run Afpex took days. Open Office, whilst good, still has some bugs which cause problems and several times I have reverted to Windows to edit stuff.

So, whilst I like not having to constantly fend off virus attacks I find myself on the point of junking Linux on my dual boot laptop and returning to the Windows 7 fold.

mixture
6th Apr 2011, 11:36
catastrophic problem with a full root partition

oh...."rm -rf /" .... :cool:

Justiciar
6th Apr 2011, 12:00
oh...."rm -rf /" ....

Does that not assume you know where the big files are and that you are prepared to lose them??

mixture
6th Apr 2011, 12:55
Justiciar,

I suspect my leg pulling attempt may not have been detected by your radar....

Doing rm -r / would certainly result in a "catastrophic problem with a root partition" ... :cool:

Justiciar
6th Apr 2011, 13:10
The penny has dropped. I should not meddle with things I don't understand and good job my lap top is at home and not in front of me:\

Any ideas on how to actually solve the problem when you cannot get more than the terminal?

le Pingouin
6th Apr 2011, 14:09
Midnight Commander (mc) is an invaluable command line tool - a clone of Norton Commander, the DOS based file manager. You can use it to display directory sizes, select & delete or move multiple files, edit files, etc.

mixture
6th Apr 2011, 14:15
Any ideas on how to actually solve the problem when you cannot get more than the terminal?

Apologies if I've missed something, what was the problem, I was under the impression you had solved it?

Justiciar
6th Apr 2011, 14:45
well yes, but I don't know how or why :confused:

The root directory was full, meaning that you can only get into the system via the terminal (what you actually get is a message saying that Gnome power manager is not installed correctly; you cannot then log into the desktop but can only get to the terminal using Ctrl-Alt-F1). There is no warning of this critical state approaching.

The key seems to be to free up space on the root directory by either moving or deleting files, but this did not seem to work. In the end I recall using a command which automatically gives a % increase in the size of the root directory, but I cannot now trace what it was for future reference. So the key questions are: how do you stop this happening in the future, and if it does happen, what is a quick solution to the problem? (A trawl of the internet offers no clear solution.)
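[One way to hunt down the space from a bare terminal, sketched below. The tune2fs line may well be the half-remembered "% increase" command - it shrinks the slice of an ext filesystem reserved for root from the default 5% - but the device name /dev/sda1 is a placeholder; check the df output for yours first.]

```shell
# How full is the root filesystem, and which device is it on?
df -h /

# Twenty biggest items on / (-x keeps du from crossing into /home etc.)
sudo du -xh / 2>/dev/null | sort -h | tail -n 20

# Usual suspects on Ubuntu: the apt package cache and old kernels
sudo apt-get clean
sudo apt-get autoremove

# Possibly the forgotten command: free up the blocks reserved for root
# (5% -> 1%); the device name below is a placeholder
sudo tune2fs -m 1 /dev/sda1
```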

AnthonyGA
8th Apr 2011, 11:21
So, whilst I like not having to constantly fend off virus attacks I find myself on the point of junking Linux on my dual boot laptop and returning to the Windows 7 fold.

This mirrors the experience of most non-geek attempted converts to Linux, and illustrates some of the reasons why the OS has never made a dent in the desktop market.

Besides, anything that could faithfully emulate the Windows environment would also be vulnerable to Windows viruses, so if you need to run applications that are available only for Windows, then actually running Windows itself is the obvious solution.

izod tester
8th Apr 2011, 12:09
Yup,

For me, there isn't a suitable Linux application which does the same as AutoRoute - so I have an XP virtual machine so I can run AutoRoute when I need to - without having to leave my Linux desktop.

sea oxen
22nd Apr 2011, 09:28
Justiciar

You'll be delighted to know that you're not alone. Sizing partitions correctly a priori is something few people manage every time, once the system's been in use for a while - even in a commercial environment. Luckily, there are ways to get out of jail if things go TU, as you found out - du -k|sort -g, then rm that big download from r*dtube :)

I suggest that you keep gparted and the system rescue CD in your toolbag. You're able to boot from these and resize if you need to. If you Giggle "linux disk space monitor script startup" you'll find a wealth of scripts to warn you if your space is running out.
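[A minimal example of the kind of warning script those searches turn up - the 90% threshold and the notification method are arbitrary choices here, not a standard:]

```shell
#!/bin/sh
# diskwarn.sh - complain when the root filesystem passes a threshold.
# Run it from cron or a startup script; 90 is an arbitrary cutoff.
THRESHOLD=90

# df -P guarantees single-line records; field 5 is "Use%" - strip the %
USED=$(df -P / | awk 'NR==2 { sub(/%/, "", $5); print $5 }')

if [ "$USED" -ge "$THRESHOLD" ]; then
    echo "WARNING: / is ${USED}% full" >&2
    # On a desktop, notify-send could go here instead of (or as well as) echo
fi
```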

Linux isn't for everyone - even not, as I suspect, Mrs SO. But as I have to fix the confounded thing when it breaks, she's crossing over to the dark side this weekend. Then I'll install Linux.

SO

rans6andrew
21st Oct 2011, 12:26
Earlier this week, my system decided to install Ubuntu 11 over the previously stable Ubuntu 10. It all seemed to go smoothly at the time but now I find that the system just freezes at random intervals and requires a hard reset to recover it. It has frozen at any time from partway through drawing the initial desktop to several hours of work later. This morning it froze just as it was trying to connect to our wireless network and it messed up the key.

I have not fetched/installed any new utilities and the previously installed stuff still works.

I have prompted it to check for updates every day just in case there was some "correction" issued. Today a few files updated and I have not seen any improvement.

I am not a Unix/Linux/Ubuntu user as such, so any help in a "use words of few syllables, slowly" way would be much appreciated.

Thanks for your interest,

Rans6....

Golden Rivet
21st Oct 2011, 14:08
Running 11.10 without any problems....

It doesn't take long to do a clean re-install.

Unlike Windows, it recognises all hardware and prompts for additional drivers. It virtually works straight out of the box.

green granite
21st Oct 2011, 14:20
Unlike windows it recognises all hardware and prompts for additional drivers.

Windows not only recognises the hardware, it automatically goes and finds all the drivers for you. :E

Golden Rivet
21st Oct 2011, 14:51
I run XP, Windows 7, Android and Linux on various devices around the house.

you are more than welcome to slag off Linux, but it works, and its free....

green granite
21st Oct 2011, 15:38
Who slagged off Linux? I didn't, I merely pointed out that Windows gets its own drivers. :)

le Pingouin
21st Oct 2011, 19:51
Rans6, is it 11.04 or 11.10? Do you know what video chipset you have? My guess is a flaky video driver.

Guest 112233
21st Oct 2011, 20:03
On this release, we seem to have lost the ability to micromanage the fonts - I find this a pain on my Asus. This could be a factor in your misery. OK, I'm being a moaner here, but I did use the font management facilities on my notebook.

CAT III

rans6andrew
21st Oct 2011, 20:17
I am on Ub 11.10 32bit. I don't know what the graphics chipset is, it was a fairly cheap graphics card when I built the system nearly 2 years ago. I don't do anything with intensive/high speed graphics and don't need anything expensive here.

I will try to find out which chipset it is tomorrow.

Can I interrogate the system to find this or must I have the hardware in bits?
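[No screwdriver needed - the hardware can be read from a terminal. A sketch; the Xorg log path is the usual Ubuntu location, so adjust if yours differs:]

```shell
# Identify the graphics adapter
lspci | grep -i -E 'vga|display|3d'

# Fuller detail on that device, including the kernel driver in use
sudo lspci -vk -s "$(lspci | awk '/VGA/ { print $1; exit }')"

# Which driver the X server actually loaded
grep -i 'driver' /var/log/Xorg.0.log | head
```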

Sometimes, just before it crashes/locks up, the sound goes off the rails, it may die by doing "the needle stuck....the needle stuck....." or it may just stop abruptly. Either way a hard reset is needed to clear it.

The hardware is very capable (Core i7 at 2.8 GHz and 6 GB of RAM) and ran 10.something faultlessly for about a year. Would it not keep the installed drivers, which were doing a good job?

Thanks for all,

Rans6Andrew