Ubuntu Installation :: Write 10.04 OVER 7.04 Without File Loss?
Aug 27, 2010
How can I install 10.04 over 7.04 (on a disk that is also partitioned with XP, which I want to retain) WITHOUT overwriting XP and WITHOUT losing my data files stored in 7.04?
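A minimal sketch of safeguarding the data first, run from the 10.04 live CD before installing; /dev/sda2 (the 7.04 root) and /media/disk (an external drive) are placeholders, not values taken from the question:
Code:
sudo mkdir -p /mnt/old
sudo mount /dev/sda2 /mnt/old                                # old 7.04 root partition
sudo tar czf /media/disk/home-backup.tgz -C /mnt/old home    # archive /home
Then, in the installer's manual partitioning step, format only the Ubuntu partition and leave the XP partition untouched; restore the archive afterwards.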
Partition 1: Windows 7 - NTFS
Partition 2: Ubuntu 10.10 - Ext4
Partition 3: Data - NTFS
Partition 4: Windows Recovery - meh...
Basically, I have it set up so that I have one large NTFS partition (Data) for sharing files between Ubuntu and Windows. It works very well. I keep all my Documents, Music, Pictures and Videos on "Data" and then symlink those folders into my home folder on Ubuntu. Unfortunately, I'm afraid to boot up Windows. Sometimes when I boot Windows, files that I made or edited in Ubuntu get lost, unindexed or corrupted. It happens very frequently with image files (.jpeg, .png, etc.) but also happens with PDFs and folders. This means that not only do these files become unusable, but I also can't delete them. Even $ rm -rf my_file returns a "Cannot delete my_file. It is not a file or directory."
The only way to get rid of these files is to perform a CHKDSK on Data when I boot Windows. CHKDSK always finds a shed-load of files that have gone haywire. I'm usually greeted by a seemingly endless list of "Removing index entry xyz from $afxyz" and other scary-looking actions being taken against my files. I keep backups of my files on an external HDD and a remote server, but that's no use when I'm backing up corrupt files.
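A minimal sketch of checks worth running from Ubuntu; /dev/sda3 is a placeholder for the Data partition. One common cause of exactly this kind of corruption is writing to an NTFS volume that Windows left hibernated or marked dirty, so make sure Windows is fully shut down (not hibernated) before writing from Ubuntu:
Code:
mount | grep sda3         # confirm the driver is ntfs-3g (shown as fuseblk), not the old read-mostly "ntfs"
sudo umount /dev/sda3
sudo ntfsfix /dev/sda3    # clears the NTFS dirty flag and schedules a chkdsk for the next Windows boot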
I'm using Ubuntu 9.04 Jaunty. I want to install Ubuntu 9.10 from the CD, but I don't want the software I've installed to be lost. What should I do? (It's a matter of time, internet speed and internet traffic limits.)
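A minimal sketch of the in-place upgrade routes, which keep installed packages and settings (as opposed to a fresh install). The CD variant assumes the 9.10 alternate CD, which carries an upgrade script; the mount point is the usual default, verify yours:
Code:
# network upgrade (needs bandwidth):
sudo do-release-upgrade
# or, with the 9.10 alternate CD mounted:
sudo sh /cdrom/cdromupgrade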
Loss of internet connection after upgrade from 8.04 to 8.10
Like the title says, I lost my internet connection after upgrading my Ubuntu system from 8.04 to 8.10. I used the downloaded automated method. It appeared to work fine on its initial reboot into 8.10, but I noticed the internet icon indicated loss of connection. Even so, I was still able to browse the net through Firefox. But when I rebooted the next day the opposite was true: the connection icon indicated a connection, but Firefox would not actually connect, nor would Update Manager. It has been like this since. This is a dual-boot machine, and the Windows XP Pro side still connects to the internet as usual, no problems. So how can I make my new 8.10 system connect to the internet?
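A minimal sketch of narrowing down where the 8.10 connection actually breaks; the router address is a placeholder:
Code:
ifconfig -a              # does the interface have an IP address?
route -n                 # is there a default gateway?
ping -c3 192.168.1.1     # placeholder: your router
ping -c3 4.2.2.2         # a public IP; if this works but names don't, it's DNS
cat /etc/resolv.conf     # are nameservers listed?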
Machine: Win2000 Pro, AMD Athlon 64 2GHz, 2x80GB HDD, 2GB RAM. Symptoms: Running 9.10 with dual boot, no probs, for ~1 year. Decided to upgrade online via Synaptic to 10.04. Upgrade went well, no warnings, etc. On reboot all OS options are present in GRUB, BUT clicking on Win2000 causes the screen to blank temporarily and then return to the GRUB menu, i.e. no functioning Win2000 boot option. An Ubuntu 10.04 boot yields a blank screen with a flashing cursor and a sudden stream of IO error messages (can't capture it, but the lines look identical), then the screen flickers and I'm at the purple login screen, and everything proceeds perfectly from there.
Observations: For the first time on any of my Ubuntu installs (I've been using it since 8.04, though always installing from CD) the GRUB installation asked where I wanted it. In fact, the screen showed all devices with check boxes with which to make my choice: sdb, sdb1, sdb2, ..., sda, sda1, sda2. There was a message saying that if in doubt, install on _all_. Being in doubt, I checked all the boxes: sdb, sdb1, etc. In the message stream that followed after I continued, I noted that some lines said "... this is BAD ..." but since it was all rolling along very quickly I have no idea exactly _what_ was bad.
Action: I am not a Linux newb but I am a GRUB newb. I fished through the forums and googled, only to come away completely confused about what the problem might be or what to do. For example: have I overwritten my Windows bootloader? Is my MBR corrupted? If I remove GRUB, what will happen? If I remove and reinstall GRUB, will I get Windows booting back? If I use my Win2000 install CD, will it overwrite my whole disk (as many warn), or can I just fix the MBR? If I do the latter, in what order, i.e. fixmbr then fixboot? The info I've come across so far either doesn't address what happened to me, deals with other OSes (XP, Vista), or applies to earlier editions of GRUB/Ubuntu. Right now I have not found a clear step-by-step that addresses my circumstances. Would anyone be able to provide some guidance? I need the Windows boot to run some software (Matlab) that, btw, will not work with XP/Vista (hence the reason I still have Win2k).
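A minimal sketch of the likely repair path, inferred from the symptoms rather than anything confirmed: checking every box wrote GRUB not just to the MBR but into individual partition boot sectors (which is what the "... this is BAD ..." warnings refer to), and Windows needs its own partition boot sector to start. Device names are placeholders:
Code:
# 1. From Ubuntu, put GRUB only on the MBR of the BIOS boot disk:
sudo grub-install /dev/sda
# 2. From the Win2000 CD's Recovery Console, rewrite the Windows PARTITION
#    boot sector (this does not wipe the disk):
#      fixboot
# 3. Only if Windows still won't chainload, also run fixmbr -- note fixmbr
#    removes GRUB from the MBR, so you would then re-run step 1 from a live CD.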
I used to have Ubuntu 10.10 and Windows 7 RC on separate partitions and I could choose to boot into either. Today I installed Windows 7 Enterprise trial version over the Windows 7 RC partition, and lost the boot screen. Reboot goes to Windows automatically. How can I get the boot options back to launch Ubuntu?
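A minimal sketch of reinstating GRUB from an Ubuntu live CD; /dev/sda5 (the Ubuntu root partition) and /dev/sda (the boot disk) are placeholders:
Code:
sudo mount /dev/sda5 /mnt
sudo grub-install --root-directory=/mnt /dev/sda
# reboot into the restored Ubuntu, then re-detect Windows for the menu:
sudo update-grub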
When I ls -l /etc/passwd, I get:
-rw-r--r-- 1 root root /etc/passwd
When I log in as myself and rm /etc/passwd, it asks:
rm: remove write-protected file '/etc/passwd'?
If I say yes, will it actually delete the passwd file?
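A minimal, safe experiment that answers this in a scratch directory. The key point: deleting a file requires write permission on the directory, not on the file itself:
Code:
mkdir /tmp/rm-demo && cd /tmp/rm-demo
sudo touch protected && sudo chmod 644 protected
rm protected    # prompts because the file is write-protected; answering y
                # deletes it, since /tmp/rm-demo is writable by you
# /etc, however, is not writable by a normal user, so the same answer there
# fails with "Permission denied" -- the file survives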
Excited at seeing the new features in 10.04, I clicked the Upgrade button tonight. I am now really regretting it! Problems:
1. My screen resolution should be 1280x1024 but the System>Preferences>Monitors control panel only shows 1024x768.
I think I have onboard Realtek graphics. Do I need to install proprietary drivers? Everything worked fine out of the box with 9.10!
2. The sound isn't working. Again I think I have onboard Realtek sound, and again it used to work fine without any intervention from me...
3. Although on first startup wireless networking was working fine, I restarted to see if that would solve the display issue, and wireless networking stopped working too!
I have a RaLink wireless card.
When I used GRUB to choose 2.6.31.20, I got some error messages at startup (e.g. mount couldn't mount /dev), but then it did eventually start up, and the sound and wireless networking are working again. But the resolution is still not fixed. It is now offering 1152x864, which it didn't previously, but not 1280x1024 (my screen's native resolution).
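A minimal sketch of adding the missing mode by hand; the modeline comes straight from cvt's output, and VGA1 is a placeholder for whatever output name a bare xrandr reports:
Code:
cvt 1280 1024     # prints the modeline used below
xrandr --newmode "1280x1024_60.00" 109.00 1280 1368 1496 1712 1024 1027 1034 1063 -hsync +vsync
xrandr --addmode VGA1 1280x1024_60.00
xrandr --output VGA1 --mode 1280x1024_60.00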
df -h [URL] showed the disk was full. I ran the following command to find that everything is in /usr or /var, then tracked it down to /usr/lib and /usr/share as the main offenders, but out of all the directories inside them none is more than 1MB or so.
du -sh /* | sort -hr | head -n 5    # -h sorts the human-readable sizes (G/M/K) correctly
I tried to uninstall Firefox, which is what got me into this mess in the first place; the log claims it will remove ~240MB but fails with "E: Write error - write (28 No space left on device)" [URL] If I could juggle something onto an external hard drive so I could uninstall Firefox, I would be out of the woods. Failing that, I believe a new install is in order.
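A minimal sketch of clawing back enough space for dpkg to run again, without touching personal files:
Code:
sudo apt-get clean            # empties downloaded packages in /var/cache/apt/archives
df -h /                       # check how much that freed
sudo apt-get -f install       # let dpkg finish any half-completed operations
sudo apt-get remove firefox   # then retry the removal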
What are the possible problems when Windows accesses a file from Ubuntu and gets Read Only, even though it has full permission to read, write and execute the file? Accessing the file from Ubuntu to Ubuntu there is no problem; only Windows has the problem.
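If the files reach Windows over the network, a Samba share is one likely path (an assumption; the question doesn't say). In that case, read-only behaviour is often set in the share definition rather than in the file permissions. A minimal smb.conf sketch with placeholder names:
Code:
[data]
   path = /home/user/shared    ; placeholder path
   read only = no              ; allow writes over the share
   create mask = 0664
   directory mask = 0775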
I have tried for some time to write an .img file to a USB stick and an SD card to boot my netbook from. The .img file is from Moblin, and they suggest that I use Win32 Disk Imager to write the file to my USB stick. But it hasn't worked for me, even when I use the SD card and other computers.
The easiest thing for me would be if I could write the .img file to my SD card. I have searched for some tutorials but haven't found a way to write the file.
I've read something about "dd" but don't get the "command"/"code" thingy. If anyone knows an easy tutorial for it, that would be great! Or even better would be a tip on another program/utility like Win32 Disk Imager that works.
I use a Windows 7 HP Pavilion Elite and a Windows XP Compaq Mini 110.
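A minimal sketch of the dd route. It must be run from a Linux system (e.g. a live CD), not from Windows, and /dev/sdX is a deliberate placeholder: dd silently overwrites whatever device you name, so identify the SD card first:
Code:
sudo fdisk -l                             # find the SD card, e.g. /dev/sdb (the size is the clue)
sudo dd if=moblin.img of=/dev/sdX bs=4M   # the whole device, not a partition like /dev/sdX1
sync                                      # flush buffers before removing the card
On Windows itself, Win32 Disk Imager is indeed the usual tool; when it fails, a different card reader or re-downloading the .img (in case of a corrupt download) are worth trying.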
I want to backup my Ubuntu box and I found a couple very helpful articles that describe how to use tar for backup and restore. So far so good. I've dabbled in Lin/Unix in the past, mainly on work systems, but Linux on my personal PC is new to me.
When I ran the tar command as root (su -) I noticed several errors scrolling by in the window. They scroll by too fast to note the exact errors, but I did notice "permission denied" a few times.
Is there some way that I can capture the output of the tar command to a file so I can review it for errors and/or permission denied statements?
Can I just add some arguments to tar?
tar cvpzf backup.tgz --exclude=/proc --exclude=/lost+found --exclude=/backup.tgz --exclude=/mnt --exclude=/sys /
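A minimal sketch: tar's file list goes to stdout and its errors to stderr, so the two can be captured separately; no extra tar arguments are needed, just shell redirection:
Code:
tar cvpzf backup.tgz --exclude=/proc --exclude=/lost+found \
    --exclude=/backup.tgz --exclude=/mnt --exclude=/sys / 2> tar-errors.log
# or capture everything in one file:
#   ... / > tar-output.log 2>&1
grep -i 'permission denied' tar-errors.log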
I have created a file newfile.txt using: touch newfile.txt. Now I want to write to that file from the terminal, i.e. whatever I type after the $ should be written to the file. How can I do that?
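A minimal sketch: redirection does exactly this; cat with no input file reads what you type until Ctrl-D on an empty line:
Code:
cat > newfile.txt     # overwrite with whatever you type
cat >> newfile.txt    # append instead
tee -a newfile.txt    # append and echo the typed lines back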
I am trying to change the write permissions on a file. On the screenshot you will see where I have underlined: it states I don't have owner rights to modify this file. How do I get owner permissions when this is my own installation?
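A minimal sketch, with /path/to/file as a placeholder since the screenshot isn't available here:
Code:
ls -l /path/to/file              # see who currently owns it
sudo chown $USER /path/to/file   # take ownership
chmod u+w /path/to/file          # make sure the owner can write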
I'm writing a script/plugin for Nagios for testing a WebLogic server. I redirect some output to a file, and then I read that file to get some data, but I can't seem to write to that file from my script. This is the most important code:
[Code]...
* EDIT * When I execute this script through a local terminal (PuTTY), it works, but when I execute it from Nagios, it doesn't.
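A minimal sketch of reproducing the failure outside Nagios. Nagios normally runs plugins as its own unprivileged user (often "nagios" -- an assumption, check your setup), with a different environment and working directory than your PuTTY shell; the paths below are hypothetical:
Code:
sudo -u nagios /usr/local/nagios/libexec/check_weblogic.sh   # run as the nagios user
sudo -u nagios touch /tmp/check_weblogic.out                 # can that user write the temp file?
If the sudo run fails the same way, it's almost certainly permissions on the output file's directory; writing to a world-writable location such as /tmp, or to a file pre-created and owned by the nagios user, is the usual cure.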
I'm trying to write a script to generate an HTML file (complete with formatting: echo "[random formatting]" >> index.html) for all the files in a given directory. So far, it works pretty well. HOWEVER, I want the listed files to be treated as links. I'm using awk to grab the part of the filename I want, but I don't know how to write this out, as it fails if there is more than one file. The HTML side would look something like this:
Code:
<li><a href="2010.05.29.html">May 29</a></li>
It all works fine up to the actual number of the day - fine with one file, fails with more than one. My code is this:
Code:
# Grabs all the files and puts them in a list with anchor text "Listed"
ls 2*.html | sed -e 's|^\(.*\)$|<li><a href="\1">Listed</a></li>|'
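A minimal sketch of a per-file loop that also builds the "May 29" label; it assumes filenames like 2010.05.29.html and GNU date (the %-d format is a GNU extension):
Code:
for f in 2*.html; do
    base=${f%.html}                            # 2010.05.29
    label=$(date -d "${base//./-}" +"%B %-d")  # 2010-05-29 -> "May 29"
    echo "<li><a href=\"$f\">$label</a></li>" >> index.html
done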
After the syslog facility rolls the logs weekly, Postfix cannot seem to write properly to the mail.log file. What I don't quite understand is that Postfix is still able to write the following error to the log file: ..."status=deferred (temporary failure. Command output: Can't open log file /var/log/mail.log: Permission denied )". It is my understanding that Postfix uses several different processes to write to log files, but I'm confused as to why it is able to write errors to the log but not able to write when sending/receiving mail. After I chmod 777 the mail.log file, Postfix slowly clears the queue and the mails are then received. Everything functions fine for another week, until the logs roll again.
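A minimal sketch of the durable fix, assuming logrotate handles the rotation (older Ubuntu releases rotate syslog files from a cron script instead; adjust to whatever actually rotates yours). The point is that each freshly created mail.log must start out writable by whatever writes to it; the owner/group below are placeholders:
Code:
/var/log/mail.log {
    weekly
    rotate 4
    create 0640 syslog adm        # match your syslog daemon's user/group
    postrotate
        /etc/init.d/rsyslog reload > /dev/null 2>&1 || true
    endscript
}
This would also explain the one-sided logging: the "Permission denied" line is written via the syslog daemon, which can open the file, while the delivery command quoted inside it tries to open /var/log/mail.log directly and cannot.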
I want to keep a trace of the URLs I visit, so I use a command line like this:
tcpdump -ien1 -v -X 'tcp port 80' | sed -nl 's/^.0x[0-9a-f]\{4\}:.\{43\}\(.*\)$/\1/p' | perl break.pl | perl -pe 's/(GET|POST).(.*?).HTTP\/1....Host:.([a-zA-Z._0-9-]*)../" BEGURL
[Code]....
I also tried redirecting stdout and stderr to /tmp/out; it's still empty. The file has write access. I have no idea what it can be. Is there anything other than stdout and stderr?
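One possible reason the redirect to /tmp/out stays empty is buffering: both tcpdump and sed buffer aggressively when writing to a pipe or file rather than a terminal. A minimal sketch of a simpler pipeline for the same goal, keeping en1 from the original command (adjust to your interface) and forcing line-buffered output so the file fills promptly:
Code:
sudo tcpdump -i en1 -l -A -s 0 'tcp port 80' 2>/dev/null \
  | grep --line-buffered -E '^(GET|POST) |^Host: ' >> /tmp/out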
I want to write a shell script for the situation below.
Subject/situation: I have many users, say user1, user2, user3, user4 and so on... within my /home dir.
Within a user dir, say /home/user1, I have many unwanted files. These unwanted files start with the name core, e.g. core2324, core9789, core9079, etc. I need to delete them.
I want to write an automated script which can do this. How do I write a script that deletes these unwanted core files in all the user dirs?
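A minimal sketch; review the list with -print before letting -delete loose:
Code:
find /home -type f -name 'core*' -print     # dry run: see what would go
find /home -type f -name 'core*' -delete
# stricter match, only "core" followed by digits (GNU find):
find /home -regextype posix-extended -type f -regex '.*/core[0-9]+' -delete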
I have the following problem. I call a C++ program from a Java servlet by using Runtime exec. The OS is Ubuntu and I use NetBeans 7.0 with the GlassFish 3.1 web server. The program executes, but it does not open and write to the specified file in the specified folder. The same C++ program compiled under Windows opens and writes this file. How can I solve this problem in Linux?
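A minimal sketch of the usual suspects, checked from a shell. The servlet runs as the application server's user and inherits the server's working directory, so relative paths and permissions behave differently than in your own shell; "glassfish" and the folder path are placeholders:
Code:
ps -o user= -C java                                   # which user runs the server?
sudo -u glassfish touch /specified/folder/test.txt    # can that user write there?
In the C++ program, using an absolute path for the output file (or passing the directory in as an argument) avoids the working-directory difference between Windows and the server.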
On Monday I had CentOS 5.3 with / and /home in ext3, with no problems (at least I don't remember any). But since I decided to use XFS (I have an SSD, and I use a 3D modeling application called Maya, which quite likely could take advantage of XFS), I decided to create, on the new solid-state disk I recently bought, a swap, a / partition formatted in XFS, and a /home partition in XFS too. Then I used a live CD to copy the contents from the old HDD to the new SSD, using find and rsync --> (cd /mnt/oldroot/ ; find . -xdev -print0 | rsync -avHx --exclude-from /mnt/rsync-filter . /mnt/newroot/)
Once done, I edited fstab and menu.lst, installed GRUB in the MBR, and had to use mkinitrd to load xfs.ko so as to be able to boot the new system. All worked fine, although I could only log in as root; my normal user could not log in, and an error appeared: "GDM could not write your authorization file. This could mean that you are out of disk space or that your home directory could not be opened for writing. In any case it is not possible to login. Contact system administrator." But I remember that prior to copying everything to the new SSD and using XFS, all was fine and I was able to log in with my user.
So, thinking that perhaps the copy done by rsync was not OK or something, I decided to reinstall CentOS, now in a partition on my SSD, and try another time, but without a /home partition or a normal user, only using root. The SSD had a swap, a 32GB XFS partition, and the / of the new install in ext3 (20GB). Once CentOS 5.3 was installed, I updated only the glibc, yum and Python packages (as the CentOS 5.4 release notes recommend), but did not do the final yum update. Then I updated only the kernel, to have an XFS-capable kernel, and rebooted (I would update everything once migrated to XFS).
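A minimal sketch of checking the usual cause of that GDM error after a filesystem copy: the user's home directory must be owned by the user and writable, because GDM writes .Xauthority there ("user" and /home/user are placeholders):
Code:
ls -ld /home/user                 # owner and mode of the home dir
ls -l /home/user/.Xauthority      # a stale root-owned copy blocks login
sudo chown -R user:user /home/user
df -h /home                       # and rule out a full filesystem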
Hardware: Sun T-2000 with Solaris 10 5/09 U7, ZFS root and RAID (which is what Subversion is writing to). Software: Subversion 1.6.12, Apache 2.2.11, db-4.2.52 (and all related dependencies, of course). Everything was fine until today: I had someone come over and they are getting this error when doing an import: svn: Can't write to file /DATA/* : File too large. After some testing, it seems it does this on files larger than 2GB in size, but after googling until I could not google anymore, I could only find people having this issue with Apache 2.0 or using an APR lower than 1.2 (mine is 1.3.3). Is there a file size limit inside Subversion?
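A minimal sketch of where to look first: "File too large" (EFBIG) typically comes from the process's file-size resource limit or the filesystem, not from Subversion itself, so check the limits in the environment Apache actually runs under (Solaris commands below; the httpd process name is an assumption):
Code:
ulimit -f                     # the shell's file-size limit ("unlimited" or 512-byte blocks)
plimit $(pgrep -o httpd)      # Solaris: resource limits of the running Apache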
I just found that I could perform write operations using a normal user account on a file system I mounted with the following command:
sudo mount -t ntfs /dev/sda1 /mnt/disk/
This is the corresponding entry in the output of the "mount" command: /dev/sda1 on /mnt/disk type fuseblk (rw,nosuid,nodev,allow_other,blksize=4096)
As far as I remember, when using a normal user account, I had to use "sudo" to perform any write operations (mkdir, rm, etc.) on a device mounted using "sudo". But now that seems to have changed.
Do I remember wrongly, or did a Karmic update change this setting? (I never manually changed user settings, except that I added a root user, which I never used.)
OS: Karmic(up2dated) Kernel: Linux stephen-laptop 2.6.31-17-generic #54-Ubuntu SMP Thu Dec 10 16:20:31 UTC 2009 i686 GNU/Linux
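A minimal sketch of why this is expected with the mount shown above. The fuseblk entry means ntfs-3g is in use: NTFS doesn't store Unix owners, so ntfs-3g synthesizes them at mount time, and by default every file is presented as accessible to everyone, which is why a plain user can write even though root did the mounting. To restrict it explicitly:
Code:
sudo mount -t ntfs-3g -o uid=$(id -u),gid=$(id -g),umask=022 /dev/sda1 /mnt/disk
# umask=022: the mounting user writes, everyone else is read-only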