General :: Viewing Backup Files But Only In Some Directories?
Oct 26, 2010
I backed up my Laptop with a script, as follows:
Code:
#! /bin/bash
sudo growisofs -Z /dev/dvd -dvd-compat -r -v /home
I then installed a new version of Ubuntu 10.04 from disk and copied the files in /home from the DVD to the hard drive. I can open and view all the files in most directories except those in /home/documents. That directory holds text files created by gedit, OpenOffice word-processor documents, and several PDF files, and I cannot open or view any of them: for the gedit and PDF files I get the error "Don't recognize file type" (the file is clearly marked PDF), and the OpenOffice files look like rows of 'high bits', with a dialogue box offering to change the character set, font, language, and paragraph break.
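Before assuming the documents themselves are damaged, it may be worth asking file what the copies actually contain; a small diagnostic sketch, with the path taken from the description above:
Code:
# report the real content type of each copied file, and check for zero-length or truncated copies
file /home/documents/*
ls -l /home/documents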
View 6 Replies
Dec 19, 2009
I need Linux help. I am at college and I need this backup/restore script to pass the final part of an assessment. The script must not only back up files but also restore them to the relevant directories; for example, users are instructed to store all word-processor files in a directory named wp. So I need to create a backup directory with three directories inside it, put some files in those three directories, and then back them up and restore them. I know I should do this myself, and I have been trying to understand the material for the last few days, but I have come up with nothing.
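Not a finished answer, but a minimal sketch of the kind of script the assessment seems to describe; the directory names (wp, db, misc) and archive location are assumptions:
Code:
#!/bin/bash
# create a practice tree: three data directories with some files, plus a backup area
mkdir -p ~/practice/{wp,db,misc} ~/backups
touch ~/practice/wp/letter1.doc ~/practice/db/records.csv ~/practice/misc/notes.txt

# backup: archive the three directories into a dated tarball
tar -czf ~/backups/practice-$(date +%F).tar.gz -C ~/practice wp db misc

# restore: unpack the tarball back into the relevant directories
tar -xzf ~/backups/practice-$(date +%F).tar.gz -C ~/practice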
View 14 Replies
View Related
Mar 4, 2010
I'm creating a backup scheme with rsync.
It is pretty clear not to include these:
/var/lib/named/proc
/var/lib/ntp/proc
[code]...
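One common way to keep paths like those out of the copy is an exclude file; a sketch, with the destination /mnt/backup and the extra patterns as assumptions:
Code:
# build the exclude list (add any other virtual or chroot /proc mounts you find)
sudo tee /root/excludes.txt >/dev/null <<'EOF'
/var/lib/named/proc
/var/lib/ntp/proc
/proc
/sys
EOF
# -aAX preserves permissions, ACLs and extended attributes
sudo rsync -aAXv --delete --exclude-from=/root/excludes.txt / /mnt/backup/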
View 3 Replies
View Related
Jul 7, 2010
I need to stop the viewing of robots.txt on my website. When I issue the command [URL], the contents of the file are displayed in my browser; I want to stop this because it lists all the directories I don't want visitors to go to.
The bigger problem is that I can browse any directory on my site and get a file listing, then right-click a file name and save it to my client hard drive, all from the browser (ex: [URL]). I think I can change this behaviour in the Apache config but don't know enough.
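Directory listings are an Apache feature that can be switched off. A minimal sketch, assuming a Debian/Ubuntu-style layout with DocumentRoot /var/www and AllowOverride enabled; note that robots.txt has to stay readable for search engines to honour it:
Code:
# turn off automatic directory indexes for the whole document root
echo 'Options -Indexes' | sudo tee /var/www/.htaccess
# .htaccess changes apply immediately; reload only needed if you edit the main config instead
sudo /etc/init.d/apache2 reload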
View 3 Replies
View Related
Apr 30, 2010
I need to back up some directories, but
sudo chmod 777 *
won't apply the permissions to the child directories.
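chmod only touches the names the shell glob expands to; the -R flag makes it recurse. For a backup, though, it is usually better to preserve the original permissions than to open everything up. A sketch with placeholder paths:
Code:
sudo chmod -R 777 /path/to/dir            # recursive: the directory and everything under it
# or, better for a backup, keep the original permissions instead of changing them:
sudo cp -a /path/to/dir /path/to/backup/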
View 2 Replies
View Related
Jan 5, 2010
What directories/files should be backed up, and what are you using for this job? A basic backup covers /home and /etc, maybe /root and /boot, and often parts of /var such as /var/log. I can use cp and scp for the simplest backups, tar or cpio for tape, dump and restore for whole-filesystem backups, and rsync for incremental backups.
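For the directories listed above, a simple tar-based sketch; the destination path is an assumption:
Code:
#!/bin/bash
# archive the usual candidates into a dated tarball on external storage
sudo tar -czf /mnt/backup/system-$(date +%F).tar.gz /home /etc /root /boot /var/log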
View 7 Replies
View Related
Mar 23, 2011
I am writing a script. My requirement is: all the file types are stored in one directory, and from there we need to separate them into different directories based on file type.
For example, a directory (anish) contains 5 items of different types:
1 directory
2 .txt files
2 .sh files
My requirement is that the directory is moved to a new directory (dir) given in the script, the 2 .txt files are moved to another new directory (test) given in the script, and the 2 .sh files are moved to another new directory (bash) given in the script. Finally, the directory anish should be empty. How is this possible using a bash script?
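A minimal sketch of that sorting script; the source directory anish and the targets dir, test and bash are taken from the description above, everything else is an assumption:
Code:
#!/bin/bash
src=anish
mkdir -p dir test bash                      # target directories given in the script

for item in "$src"/*; do
    if [ -d "$item" ]; then
        mv "$item" dir/                     # directories go to dir/
    elif [[ "$item" == *.txt ]]; then
        mv "$item" test/                    # .txt files go to test/
    elif [[ "$item" == *.sh ]]; then
        mv "$item" bash/                    # .sh files go to bash/
    fi
done
# anything not matched above is left behind, so anish ends up empty only if
# it held nothing but directories, .txt files and .sh files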
View 7 Replies
View Related
May 21, 2010
I'm running an Ubuntu 9.10 Linux server. I'm trying to find a way to back up the machine while it is running, and from what I can see this rules out the disk-clone utilities, since all of the disk-cloning tools I have seen for Linux require rebooting into a special live CD. So my question is this: what is the best solution for backing up the system while it is running? I don't care much about the OS configuration; I just want to be able to keep my stored files and the programs I have installed.
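One hedged suggestion: file-level tools like rsync or tar can run on a live system, unlike whole-disk cloners, provided you accept that files changing mid-copy may be inconsistent. A sketch, with the destination and the choice of directories as assumptions:
Code:
# copy user data and configuration while the server is up
sudo rsync -aAX --delete /home /etc /var/www /mnt/backup/
# record which programs are installed, so they can be reinstalled later
dpkg --get-selections > /mnt/backup/package-list.txt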
View 3 Replies
View Related
Jun 25, 2011
I would like to zip only selected directories (and their child directories). The current folder contains many directories such as app, content, db, library, etc., but I would like to zip only app and content and their child folders. I am trying the following:
zip -r ../backups/code/20110625 -i app/* -i content/* . *
But I am getting the following error: zip error: Invalid command arguments (nothing to select from)
What is the correct syntax to achieve this?
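A sketch of syntax that should work: name just the directories you want as arguments and let -r recurse, rather than using -i include patterns (zip appends .zip to the archive name if it is missing):
Code:
zip -r ../backups/code/20110625.zip app content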
View 1 Replies
View Related
Oct 26, 2010
I'd like to know how to see which files have been printed on the printer. I can see which print jobs have run and who submitted them using the following command:
Code:
$ lpstat -W completed -o
but it doesn't show me the names of the files that printed.
Is there a way to set a retention policy to have it hold the job in the queue even after it's finished printing? Or a command to view the details of jobs that have already run? I'm looking for something similar to the output of the lpq command.
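CUPS can be told to keep both the job history and the spooled job files after printing; a sketch of the relevant cupsd.conf directives, with the file location and restart command as assumptions that may differ by distribution:
Code:
# append the directives and restart the scheduler
sudo tee -a /etc/cups/cupsd.conf >/dev/null <<'EOF'
PreserveJobHistory Yes
PreserveJobFiles Yes
EOF
sudo service cups restart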
View 1 Replies
View Related
May 21, 2011
I'm just setting up a partition on a separate HDD in my system. I plan to use the partition to back up the important files on my main HDD (to guard against a drive crash).
The question I have is about the typical location to auto-mount this partition. Which of these would be the normal choice?
1. /backup/
2. /media/backup/
3. /mnt/backup/
4. /home/chris/backup/
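Any of these would work; /mnt/backup or /media/backup are commonly used for an extra local filesystem. A hypothetical /etc/fstab entry for option 3, with the device name and filesystem type as assumptions:
Code:
sudo mkdir -p /mnt/backup
echo '/dev/sdb1  /mnt/backup  ext4  defaults  0  2' | sudo tee -a /etc/fstab
sudo mount /mnt/backup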
View 7 Replies
View Related
Nov 16, 2009
Is there a way to view all the crontab files that exist on a system, owned by root, users, and other system accounts, simultaneously, rather than having to go to the individual accounts? The distribution in question is the Debian 4.0 release.
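On Debian the per-user crontabs live in one spool directory, so root can dump them all at once; a sketch (the system-wide tables live elsewhere and are shown separately):
Code:
# show every user crontab, each line prefixed with its file (i.e. user) name
sudo grep -r '' /var/spool/cron/crontabs/
# plus the system-wide tables
cat /etc/crontab /etc/cron.d/*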
View 2 Replies
View Related
Apr 14, 2010
When we view multiple files using less, how do we move on to the next file? I gave these two commands:
Code:
[root@localhost log]# ls -lt boot.log*
-rw------- 1 root root 0 Apr 11 04:02 boot.log
-rw------- 1 root root 0 Apr 4 04:02 boot.log.1
-rw------- 1 root root 0 Apr 1 19:14 boot.log.2
-rw------- 1 root root 0 Mar 21 04:02 boot.log.3
-rw------- 1 root root 0 Mar 14 04:02 boot.log.4
[root@localhost log]# less boot.log*
This is what I got:
Code:
boot.log (file 1 of 5) (END) - Next: boot.log.1 <RETURN>
(END) - Next: boot.log.1
I could not view boot.log.2.
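For reference, less has single-key commands for moving between the files named on its command line:
Code:
# keystrokes typed inside less (not shell commands)
:n    # examine the next file from the command line
:p    # examine the previous file
=     # show the name and position of the current file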
View 3 Replies
View Related
Aug 9, 2009
I created a password file for use with ncsa_auth in squid. Firstly, is there a way to view the passwords in the file or are they all encrypted? Secondly, is there a way to get squid to reauthenticate the user after 24 hours?
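The ncsa_auth file is normally maintained with htpasswd (from the Apache utilities), so it stores one-way hashes rather than passwords you can read back; for the second part, Squid's basic-auth credential cache has a TTL. A sketch of both, with file locations as assumptions:
Code:
# add or replace an entry; existing hashes cannot be decrypted, only replaced
htpasswd /etc/squid/passwd someuser

# in squid.conf: how long Squid caches a successful check before asking the helper again
# auth_param basic credentialsttl 24 hours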
View 9 Replies
View Related
Mar 10, 2011
I would like Firefox to open text files not in its internal viewer but in an external editor (namely EmacsClient). Is it possible to change this default behaviour of Firefox? I beg your pardon for being unclear; let me state the matter again. First, I use the Linux version of Firefox, which means that, unlike the Windows version, the application-bindings dialog has very few entries.
When I click on a link to a text file (remote or local), Firefox opens it in the internal viewer by default. From testing, it seems to look at the file extension: when I make a file with an .mpg extension, the behaviour is as it should be, with the "Open With..." dialog and so on. When the file has an extension unknown to /etc/mime.types (in my case .out, plain text), the default behaviour is to open it in a Firefox window.
View 2 Replies
View Related
May 19, 2010
Is there any software in Linux to view huge .txt files, say over 10 MB? I'm now using the default gedit, version 2.28.0, which does not seem able to open huge .txt files. The same is true of the default Windows .txt viewer, although WinWord seems to handle them fine. What software under Linux can browse huge .txt files?
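Pagers handle large plain-text files far better than GUI editors because they do not load the whole file at once; a couple of options (hugefile.txt is a placeholder):
Code:
less hugefile.txt          # loads on demand; /pattern searches, G jumps to the end
view hugefile.txt          # read-only vim, if vim is installed and you prefer an editor interface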
View 5 Replies
View Related
Aug 25, 2009
How can you create a script to move or copy files from a main directory into multiple directories below that main directory?
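The requirement is open-ended, but here is a minimal sketch of one reading of it, copying every regular file in the main directory into each of its immediate subdirectories; the path is a placeholder:
Code:
#!/bin/bash
main=/path/to/main
for dir in "$main"/*/; do                              # each immediate subdirectory
    find "$main" -maxdepth 1 -type f -exec cp -t "$dir" {} +
done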
View 7 Replies
View Related
Jun 25, 2010
I am seeing a '+' symbol (-rw-r-----+) when viewing the file permissions of the exim_mainlog file. What is the reason for this '+' symbol? -rw-r-----+ 1 mailnull mail 648448492 Jun 25 10:27 exim_mainlog
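The trailing '+' means the file carries a POSIX ACL in addition to the normal mode bits; getfacl shows what is behind it:
Code:
getfacl exim_mainlog      # lists the extra user/group ACL entries indicated by the '+'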
View 2 Replies
View Related
May 17, 2010
Ubuntu 10.04
I want to copy all files and directories, including hidden files and hidden directories, with one command. I want these items to replace any matching items in the target directory.
I have tried several things, such as:
cp -r *
cp -aR *
but I only seem to get visible files and directories. Obviously, I am missing something. (A brain, probably....)
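The globs above skip dot files because the shell never expands * to names starting with a dot; copying the directory contents by path avoids the glob entirely. A sketch with placeholder paths:
Code:
cp -a /path/to/source/. /path/to/target/    # the trailing /. copies everything, hidden files included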
View 9 Replies
View Related
Jul 7, 2011
The directories
/home/<user_name>/.Trash
and
/root/.Trash
do not exist.
I've tried viewing hidden directories with nautilus and using "cd" in terminal. "locate" is equally useless.
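On recent desktops the trash follows the freedesktop.org layout rather than ~/.Trash, so this is probably where to look (per-user; removable drives get their own .Trash-<uid> directories):
Code:
ls -a ~/.local/share/Trash/files     # the trashed items themselves
ls -a ~/.local/share/Trash/info      # matching metadata (original path, deletion date)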
View 4 Replies
View Related
May 14, 2011
I'm totally new to Linux and this website. I was wondering if anyone could help me create a shell script that would merge two files from two different directories and put the new merged file in a third, different directory. The merged file would need to eliminate duplicates and sort the contents.
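If "merge" just means combine the two text files, drop duplicate lines and sort the result, sort can do all three in one step; a sketch with hypothetical paths:
Code:
#!/bin/bash
# merge, sort and de-duplicate two files into a third directory
sort -u /path/dir1/file1.txt /path/dir2/file2.txt > /path/dir3/merged.txt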
View 6 Replies
View Related
Aug 8, 2010
I'd like to remove all directories of a certain depth that don't contain .txt or .log files -- is this possible? So far I have: find ~ -mindepth 3 -maxdepth 4 -type d -exec rm -r '{}' \; Is it possible to add "only if the directory doesn't contain .txt and/or .log files"? Or do I have to start learning Perl to do that?
For example:
dir 1:
hello.txt
runme.sh
dir 2:
runme.sh
oct12.log
[Code]....
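It can be done with find alone by testing each candidate directory for those file types before removing it; a sketch, kept deliberately conservative (it only echoes the rm commands until you remove the echo):
Code:
#!/bin/bash
find ~ -mindepth 3 -maxdepth 4 -type d | while read -r d; do
    # keep the directory if it contains any .txt or .log file anywhere below it
    if ! find "$d" -type f \( -name '*.txt' -o -name '*.log' \) -print -quit | grep -q .; then
        echo rm -r "$d"            # drop the echo once the list looks right
    fi
done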
View 13 Replies
View Related
Apr 14, 2010
I'm hoping somebody can spot something here that I haven't. I'm trying to use rsync to back up home directories to a NAS. First, I NFS-mounted the NAS and ran rsync, and everything worked out fine: the transfer completed after a few hours and everything was transferred (lots of stuff!). I then decided that I don't want to leave the NAS mounted all the time, and I didn't want to automate mounting and unmounting it because I didn't think I could produce a script reliable enough. So I decided to start an rsync daemon on the NAS and update via that. I run the following command (results included; the ^C is me killing it after it hangs).
Code:
ryan@server:/etc/backup$ sudo rsync -ax --stats --progress --delete /data root@192.168.0.98:backups1
root@192.168.0.98's password:
sending incremental file list
data/home/user/Documents/
data/home/user/Documents/The File.wmv
[Code]...
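One thing worth noting: the command above uses the single-colon (ssh) syntax, whereas a transfer that talks to the rsync daemon uses a double colon and a module name defined in rsyncd.conf. A hedged sketch, with the module name assumed to be backups1:
Code:
# daemon syntax: host::module (or an rsync:// URL); no remote shell is involved
# the user here is the module user from rsyncd.conf, not necessarily a system account
sudo rsync -ax --stats --progress --delete /data root@192.168.0.98::backups1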
View 3 Replies
View Related
Mar 12, 2010
3. State what kinds of files are stored in the following directories. Give any ONE file that can be found in each of these directories.
a. /etc/
b. /proc/
c. /sbin/
View 2 Replies
View Related
Jun 26, 2011
If I execute the following command:
cp -R /myfiles /mydestination
If myfiles contains several sub-directories and files, in what order will they be copied? For example, directories might be named 0123a, 9993c, myfolder, xfolder.
They are not copied in alphabetical order, date order, or the order they appear in a standard ls listing, as far as I can tell, so what actually determines the order?
Edit:
I am trying to determine the order the cp command uses so I can work out how far my copy got before it stopped. For example, I was hoping to be able to determine that it copied 3 of the 4 directories successfully.
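As far as I know, cp simply takes the entries in the order readdir() returns them, i.e. the raw on-disk directory order, which is why it matches neither name nor date sorting. GNU ls can show that same unsorted order:
Code:
ls -U /myfiles          # -U: do not sort, list in directory (readdir) order
ls -f /myfiles          # similar, and also shows the . and .. entries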
View 1 Replies
View Related
Jan 29, 2010
The rm command man pages discuss removing files or directories recursively. So what is meant by deleting a file or directory recursively? And what are some reasons for doing so?
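In short, "recursively" means the command descends into a directory and applies itself to everything inside it, however deeply nested; a one-line illustration:
Code:
rm -r project/        # removes project/ and every file and subdirectory beneath it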
View 3 Replies
View Related
Feb 1, 2010
I have 60+ directories, each containing multiple .doc files. I need to move them to a single directory and keep their file names intact. I don't think cp will do that without listing all the file names. I was thinking of something like: cp -r /dir/*.doc /newdir. Or should I use a combo like find -type *.doc | cp /newdir?
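find is the usual tool for gathering files scattered across many directories; a sketch, with the source and destination paths as placeholders (note that identically named files will overwrite each other):
Code:
find /dir -type f -name '*.doc' -exec cp -t /newdir {} +
# use mv -t /newdir in place of cp -t /newdir to actually move them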
View 5 Replies
View Related
Jan 8, 2010
Is there a way to use the rm command so that I can remove files by owner? I run the standard ls -al command and I want to be able to remove the files that are owned by me in the current directory. One other step: how can I remove files owned by me in all directories? I did the Google search first, and the majority of the pages just dealt with basics like rm -r.
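find can select by owner, which gets around rm's lack of an owner test; a sketch (it is worth listing the matches with -print before switching to deletion):
Code:
# files owned by the current user, in this directory only
find . -maxdepth 1 -type f -user "$USER" -exec rm {} +
# the same test applied recursively to everything below the current directory
find . -type f -user "$USER" -exec rm {} +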
View 5 Replies
View Related
Jun 3, 2011
My system is CentOS 5.5, and I need a list of the nobody:nobody directories and files under data.
There is a directory named "data", and this directory holds many directories and files generated by a web program. Most of them are owned by nobody:nobody.
I want to make a list of these nobody:nobody directories and files.
Is it possible to make such a list?
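find can match on both owner and group and write the result to a file; a sketch:
Code:
find data -user nobody -group nobody > nobody-list.txt
wc -l nobody-list.txt        # how many entries were found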
View 1 Replies
View Related
Mar 3, 2010
I am searching for a system call similar to the ls command we use in the shell. My requirement is to know the files and directories in the current working directory and process them based on their type. As of now I spawn another process with the system command, like
system("ls -l | grep ^d | awk '{print $9}'");
Instead of this I want some call that lets me capture the information directly into a local character buffer. My reasoning is that such a call would not spawn another process, so it takes less time; another reason is that once I use the system command I have to capture the output into a local file and then read it back into a local buffer, and I want to avoid that file manipulation.
View 1 Replies
View Related