I am using RHEL 4.4. The last time I rebooted my server it generated an error and told me to run fsck in repair mode. Running fsck fixed some problems, but after that the gdm and X11 services failed with an error once the login screen had taken my user name and password. I can still log in via PuTTY from a remote system, but whenever I try to make a change, such as creating a directory or a file, or editing any file, I get the error "you can not make changes in read only file system".
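If fsck left the filesystem mounted read-only, a common first step is to remount it read-write; a minimal sketch, assuming / is the affected filesystem (if the remount fails, the filesystem likely still has errors and fsck needs to be run again from rescue media):
Code:
mount -o remount,rw /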
I have the following file structure that I would like to add to git.
Code:
These are big directories and I don't need them all checked out; I only need the src directory. After I commit the files in /app/src, they must be pushed to a remote site.
If I want to check out only the src directory to work on, is it important to create a special file structure in git? For example, instead of doing git init on the general app directory, should I do git init on all subdirectories?
Is it possible to check out only part of a file structure in git?
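Yes, sparse checkout does this without splitting the repository into per-directory repos. A minimal sketch, assuming git 2.25 or later; the remote URL and branch name are placeholders:
Code:
git clone --no-checkout <remote-url> app
cd app
git sparse-checkout init
git sparse-checkout set src
git checkout master
Older git versions achieve the same effect by setting core.sparseCheckout to true and listing the paths in .git/info/sparse-checkout.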
The clearly outdated Linux course I'm using tells me that the directory structure for building RPMs is in /usr/src/redhat, but on my Red Hat system /usr/src/ contains only the debug and kernels folders.
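On newer Red Hat systems, rpmbuild defaults to a per-user tree under ~/rpmbuild instead. A minimal sketch to create it by hand (the rpmdev-setuptree command from the rpmdevtools package does the same thing):
Code:
mkdir -p ~/rpmbuild/{BUILD,RPMS,SOURCES,SPECS,SRPMS}
echo '%_topdir %(echo $HOME)/rpmbuild' > ~/.rpmmacros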
How do I download all the files from here: [URL]? I am on FreeBSD 7.0, and I tried wget with the -r switch, but it gives me URLs only. Maybe this is simply not an FTP site, I don't know. How can I download all those files with the same directory structure?
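A sketch of a recursive wget invocation that preserves the directory layout; the URL here is a placeholder for the one in the post, and --cut-dirs may need adjusting to match how deep the listing starts:
Code:
wget -r -np -nH --cut-dirs=1 http://example.com/files/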
Amarok is nice, and it is currently the only thing I've tried that will actually play audio from network shares and not lock up. The only problem is that it doesn't seem to list my library by directory/folder structure, only by album/artist/genre or some variation thereof. Is there anything that does list by folder structure?
I would like to find the command that copies my eclipse options to another workspace...
It doesn't work, and writing the path .metadata/.plugins manually could be a source of error. Surely it would be a better idea to create a complete script?
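A minimal sketch of such a script; both workspace paths are placeholders, and it assumes the options live in the usual org.eclipse.core.runtime settings directory:
Code:
#!/bin/sh
# Hypothetical workspace locations
SRC=~/workspace-old
DST=~/workspace-new
# Eclipse keeps most workspace preferences under this plugin directory
mkdir -p "$DST/.metadata/.plugins/org.eclipse.core.runtime"
cp -r "$SRC/.metadata/.plugins/org.eclipse.core.runtime/.settings" \
      "$DST/.metadata/.plugins/org.eclipse.core.runtime/"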
Is there a way to force rsync not to make directories in its destination directory; i.e., to simply dump all of the files from the source directory directly into the destination without copying any of the folders that the files were originally in? I tried --no-dirs, but that seems to apply only to empty directories.
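One workaround is to flatten with find instead of rsync; a minimal sketch, with placeholder paths (note that files with the same name will overwrite each other in the destination):
Code:
find /path/to/source -type f -exec cp {} /path/to/dest/ \;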
What is the best and simplest way to compare two directory structures without actually comparing the data in the files? This works fine: diff -qr dir1 dir2. But it's really slow because it's comparing the files too. Is there a switch for diff, or another simple CLI tool, to do this?
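A sketch that compares only the file and directory names, using bash process substitution:
Code:
diff <(cd dir1 && find . | sort) <(cd dir2 && find . | sort)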
I will be doing actual development and testing on the same machine as the server. It is a single-user machine in the sense that I will be the only one working on it. There will be multiple hosted languages, specifically PHP and RoR, possibly expanding later. I'd like the setup to translate well to a production environment. With those 3 things in mind, there are a couple of things I've had in the back of my mind. Seeing as it's a single-user machine, I haven't been able to decide whether I should be working on things out of my home directory or whether they should be located outside of it. I'm feeling that outside of a user directory would be better, as it would translate better to a production environment, but I'm also not sure if that will come with any permission annoyances or concerns, seeing as I'll be working on the same machine. Hosting multiple languages seems like it may be a bit quirky. With PHP I've found you're generally just dumping the project somewhere in the document root, whereas with something like a Rails app you have the entire project and you only want the public directory in the document root.
I have to set up my development environment under OEL 5.4. The internal directory structure is more or less fixed; the question is where to put the whole application tree in the standard FHS for the development phase. For production, the system will be deployed on a server where data is kept in an Oracle 11g database. Development is done with JDeveloper, which has its own internal application/project directory structure; however, I would like to be independent from this. Unfortunately, after carefully studying the web for FHS guidance, I only find that ready-made packages go under /opt, for example; no development aspects are mentioned. Please make some proposals, possibly with respect to best practices.
Code:
dir1/
  subdir1/
  subdir2/
  etc.
and at the lowest level they contain all of these jpegs that I need. The problem is that I only need some of them. They're named like this:
pic1.jpg pic1_med.jpg pic1_small.jpg pic2.jpg pic2_med.jpg etc.
I want to just grab the ones without the size suffix and copy them all to another set of folders, while preserving the directory structure. The numbering all starts at 1 for each low level subdirectory, so I think that the directory structure is the only way to not get them mixed up.
I know that cp has a recursive option -r, but how do I extract just the ones without the underscore? And then how do I preserve the directory structure when I move them over?
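A sketch using rsync filters instead of cp; src/ and dest/ are placeholders. Rules apply in order: keep directories so rsync can recurse, drop the suffixed jpegs, keep the remaining jpegs, drop everything else; -m prunes directories that end up empty:
Code:
rsync -am --include='*/' --exclude='*_*.jpg' --include='*.jpg' --exclude='*' src/ dest/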
I jumped into a Linux class in college with only 3 weeks left in the course. I thought I would be able to catch on, and, go figure, it didn't exactly happen that way. I was given an assignment to do, and I am so far lost it isn't even funny. I need to create a directory structure, set up file security, create a step-by-step instruction manual on how to copy/delete said files, and create a guide to common Linux commands. How would I create these files as root and share them with the other users? And where can I find a list of common commands and their functions?
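For the sharing part, a minimal sketch, assuming a hypothetical group named "students" for the other users:
Code:
groupadd students
mkdir -p /srv/classfiles
chgrp students /srv/classfiles
chmod 2775 /srv/classfiles   # setgid bit: new files inherit the group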
I'm working with a dual-boot laptop running Ubuntu 10.0/Windows 7 and a Debian 5 VPS, though the OSes shouldn't have much impact on my question.
What I would like to do is create an HTML page that I can upload to my VPS which lists all of the files/folders on my local 2TB hard drive (specifically media such as Movies, Music, TV Shows...). The media obviously will not reside on the server, but I would like to at least have a list which will allow me to select, for instance, a band so that it redirects me to the albums in the directory below.
Ultimately, I'm looking for open directory browsing without actually having the media on my server. I have been attempting to create something to this effect using lynx; however, I'm not sure if it can be done with that command, or if it's even possible for that matter.
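One alternative worth trying is tree's built-in HTML output; a minimal sketch, with the media path and output file name as placeholders:
Code:
tree -H '.' -o media-listing.html /media/2tb
The resulting media-listing.html is a static page of links that can then be uploaded to the VPS.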
I want to copy all files with the name XYZ* into one folder. The problem is that the files are in different subfolders, and not even the depth of the folder structure is the same for all files. Luckily, at least each file has a unique name.
Of course, I thought about the cp command, but I guess the depth of the folder structure needs to be the same for that to work.
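find handles arbitrary depths; a minimal sketch, with both paths as placeholders:
Code:
find /path/to/tree -type f -name 'XYZ*' -exec cp {} /path/to/onefolder/ \;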
I have uShare 1.1a setup to talk to my XBox 360. If I share a directory that has no subdirectories, the video files display on the XBox. However, most of my files are in sub-directories on a different partition - I don't really want to copy them to the share, but uShare doesn't seem to recognise any sub-directories or files contained therein.
I have tried setting up symbolic soft links directly to the video files (although this is a pain, it is better than moving the files)...
Code:
ln -s /home/jonftp/TV-Shows/Buffy/Season-1/Buffy-101.avi /home/share/Buffy-101.avi
...but these don't show up on the XBox either.
How can I get uShare to "drill down" the directory structure to list the files or how can I get uShare to follow symbolic links?
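If uShare won't follow symlinks, a bind mount is a different technique that makes the other partition's directory appear as a real subdirectory of the share; a minimal sketch:
Code:
mkdir -p /home/share/TV-Shows
mount --bind /home/jonftp/TV-Shows /home/share/TV-Shows
To make it survive reboots, an equivalent bind entry can go in /etc/fstab.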
Unfortunately, I deleted my /home directory by accidentally running "rm -rf *". The partition (/dev/sda3) has an ext3 filesystem. After deleting the /home directory, I shut down the PC and rebooted from a RIPLinux live USB, which has some tools that allowed me to recover some files. However, what I would like to recover is the directory tree structure rather than the files, in order to see which files I deleted.
What I want is exactly the following: the output of "ls -lR /home/" from before I deleted the files, but the problem is that the /home directory is now empty.
I am using gtkpod to add music to and sync my iPod Touch. I have gtkpod up and running, and I have gone through the initial steps of changing the repo and selecting my iPod Touch. But now, when I try to "load ipod", I receive an error stating that the iPod directory structure was not found.
I have changed the repo so that the iPod mount point is /mnt, but after selecting "load ipod" I get:
Could not find iPod directory structure at '/mnt'.
If you are sure that the iPod is properly mounted at '/mnt', it may not be initialized for use. In this case, gtkpod can initialize it for you.
Do you want to create the directory structure now? >>>Create Directory Structure
Following this message, a warning appears:
The following has occurred:
Error initialising iPod: Problem creating iPod
How do I create the iPod directory structure that it is asking for?
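Two quick checks worth running first, as a sketch (assuming the iPod really should be at /mnt); an initialized iPod has an iPod_Control directory at the top of its filesystem:
Code:
mount | grep /mnt      # confirm something is actually mounted at /mnt
ls /mnt/iPod_Control   # present on an initialized iPod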
I've been using rsync in a bash script to make hourly copies of jpeg files that are created every few minutes. The images are contained in a number of subdirectories, with the filenames using the date and time.
At the moment, my source and target directories are identical, and rsync is easy:
Quote:
rsync -av data/images/ /mnt/data/images
Note that the source directory is purged at regular intervals to stop it filling up. So the target directory has all the images, but the source doesn't.
I need to change my script so that the target directory has a different structure from the source directory, like this:
I have a drive with an NTFS partition where all the files were deleted. What I'm looking for is a way to rebuild the directory structure and recover the files. I really, really want the directory structure, as the partition contains 460 gigs of data. Normally I would use the tools here: [URL], but I've never dealt with this much data before. Everything there that I've used creates a pretty messy dump.
I have used ntfsundelete before, but only for a few files at a time. I have no idea what would happen if I tried to run it on a partition of that size. I'm comfortable with data recovery, but this amount of data is beyond me. I've run ntfsundelete with no args, and from what I can tell by skimming the pages of output, all the files are fine. The partition has not been written to.
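For scale, ntfsundelete can be pointed at the whole partition; a sketch, with the device name as a placeholder (note that ntfsundelete writes recovered files into one flat directory, so it will not rebuild the tree by itself):
Code:
ntfsundelete --scan --percentage 100 /dev/sdX1
ntfsundelete --undelete --match '*' --destination /mnt/recovery /dev/sdX1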
I have several copies of a file set with different organizational structures, but the same files. I.e.:
On client1 files can be found in ~/foo/bar/file1, ~/foo/bar/file2, ~/foo/tavern/file3, ~/foo/tavern/file4.
On client2 files can be found in ~/foo/bar/file1, ~/foo/bar/file2, ~/foo/tavern/file5, ~/foo/tavern/file6.
On client3 files can be found in ~/file1, ~/file3, ~/file5, ~/file7.
I have access to one client and the server where I'd like all the files to be synced. I'm not worried about conflicts, just about having a complete copy of all files[1-7]. Is there a way to make rsync remove the directory structure, so that I get something like:
Code:
client1% rsync files server:backup
client2% rsync files server:backup
etc.
where at the destination all files will be checked against the destination set regardless of the source directory structure?
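A minimal sketch of one way to flatten on the sending side, using find to feed rsync one file at a time (slow for large sets, but it strips all source paths; ~/foo stands in for wherever the files live):
Code:
find ~/foo -type f -exec rsync -av {} server:backup/ \;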
I'm having problems with recursive Makefiles in my directory structure. My folder layout is:
Code:
top/
|- one/
|   |- one.c (with main function)
|   |- zero.c
|- two/
|   |- two.c
In my top folder the Makefile looks like:
Code:
MAKE_DIRECTORIES = one two

.PHONY: all
all: $(MAKE_DIRECTORIES)

.PHONY: $(MAKE_DIRECTORIES)
$(MAKE_DIRECTORIES):
	@echo $@
	$(MAKE) --directory=$@
In my one and two folders I have the following Makefile:
Code:
.PHONY: all
all:
	@echo $@
	$(CC) $(CFLAGS) *.c
But when I compile it from the top folder with make,
I get the following output:
Code:
one
two
which shows that the echo of the directory name in the main Makefile works, but the files in one and two are not compiled.
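Note that the original sub-Makefile had .PHONE where .PHONY was intended (corrected above). Beyond that, giving the compile step a real file target makes the sub-make's work visible and repeatable; a sketch, where the program name prog is hypothetical (and directory two, whose sources lack a main, would instead build objects only with $(CC) -c):
Code:
.PHONY: all
all: prog

# link every C file in this directory into one program
prog: *.c
	$(CC) $(CFLAGS) -o $@ $^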
I am trying to create a simple bash script to rsync some folders within a directory structure. I am using wildcards in the rsync source path, but my command always fails. I believe it is the way I am using wildcards within my for loop. Here is my command:
Code:
for seq in `cat test.txt` ; do
    rsync -nvP /folder/folder/folder/folder/folder/**/$seq /folder/folder/folder/ ;
done
This always fails, whereas if I do an ls on the path to test it, it always works.
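A likely cause: bash only expands ** recursively when the globstar option is enabled (bash 4+); without it, ** behaves like a single *, so deeper paths never match. A minimal sketch with that fixed, keeping the post's /folder/... placeholders:
Code:
shopt -s globstar
while read -r seq; do
    rsync -nvP /folder/folder/folder/folder/folder/**/"$seq" /folder/folder/folder/
done < test.txt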