General :: Creating A Directory Structure And Setting Up File Security?
May 15, 2010
I jumped into a Linux class in college with only 3 weeks left in the course. I thought I would be able to catch on, and, go figure, it didn't exactly happen that way. I was given an assignment to do, and I am so far lost it isn't even funny. I need to create a directory structure, set up file security, create a step-by-step instruction manual on how to copy/delete said files, and create a guide to common Linux commands. How would I create these files in root and share them with the other users? And where can I find a list of common commands and their functions?
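A minimal sketch of one way to start, assuming a shared group; every name below (the /srv/class path, the students group, the user alice) is made up for illustration, not from the assignment:
Code:
# Create the structure, then let a group read and write it
sudo mkdir -p /srv/class/{docs,scripts}
sudo groupadd students
sudo chgrp -R students /srv/class
sudo chmod -R 2775 /srv/class        # setgid bit keeps new files in the group
sudo usermod -aG students alice      # repeat for each user who needs access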
I'm working with a dual-boot laptop running Ubuntu 10.04/Windows 7 and a Debian 5 VPS, though the OSes shouldn't have much impact on my question.
What I would like to do is create an HTML page that I can upload to my VPS which lists all of the files/folders on my local 2TB hard drive (specifically media such as Movies, Music, TV Shows...). The media obviously will not reside on the server, but I would like to at least have a list which will allow me to select, for instance, a band/artist so that it redirects me to the albums in the directory below.
Ultimately, I'm looking for open directory browsing without actually having the media on my server. I have been attempting to create something to this effect using lynx; however, I'm not sure whether it can be done with that command, or whether it's even possible for that matter.
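For what it's worth, the tree utility can emit a static HTML index of a local directory, which could then be uploaded to the VPS (the paths and hostnames below are placeholders):
Code:
tree -H "." -T "Media index" -o index.html /media/2tb
scp index.html user@vps:/var/www/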
I want to copy all files with the name XYZ* into one folder. The problem is that the files are in different subfolders and that not even the depth of the folder structure is the same for all files. Luckily, at least each file has a unique name.
Of course, I thought about the cp command, but I guess the depth of the folder structure would need to be the same for that to work.
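cp alone will not search subdirectories, but find can walk a tree of any depth and hand each match to cp (paths are placeholders; since the filenames are unique, nothing gets overwritten):
Code:
find /path/to/source -type f -name 'XYZ*' -exec cp -- {} /path/to/dest/ \;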
I am running Ubuntu Lucid x64 as a fileserver that shares its files via SFTP, NFS, and Samba. Currently the hard disks are configured to go to standby when they are not needed. This works perfectly as long as no one browses the shares or my HTPC is running: that one repeatedly looks through the shares for new music or movies. In other words, my problem is that the disks are spinning up a lot more often than they should have to. Additionally, the spin-up time delays the response time while browsing. Since the machine has a lot of unused RAM, I want to tell the kernel to keep the directory structure in memory. That way the disks would not need to spin up every time someone browses through the directories.
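Two knobs are often suggested for this; treat the values as starting points to test, not a definitive fix (the share path is a placeholder):
Code:
# Make the VM strongly prefer keeping dentry/inode caches in RAM
sudo sysctl -w vm.vfs_cache_pressure=1
# Re-walk the share periodically (e.g. from cron) so the cached
# metadata is never evicted; this reads no file contents
find /srv/share -type d > /dev/null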
I've been looking high and low for a utility program, Perl script, or something else that can take a Linux directory structure as input and convert it to an MS-DOS 8.3 directory structure.
The purpose of this is to conform to the path format that is expected by my rather old Creative Zen Neeon MP3 player for m3u playlists.
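I haven't found a ready-made tool either, but as a rough sketch of the renaming rule, something like this prints an 8.3-style rendition of each path (it assumes truncation causes no collisions; real DOS naming would add ~1, ~2 suffixes):
Code:
#!/bin/bash
# Print an 8.3-style version of every path under a root directory
shorten() {
    local base="$1" name="$1" ext=""
    if [[ "$base" == *.* ]]; then
        name="${base%.*}"; ext=".${base##*.}"
    fi
    printf '%s%s' "${name:0:8}" "${ext:0:4}" | tr '[:lower:]' '[:upper:]'
}
find "${1:-.}" -mindepth 1 | while IFS= read -r path; do
    out=""
    IFS=/ read -ra parts <<< "${path#./}"
    for p in "${parts[@]}"; do
        out+="/$(shorten "$p")"
    done
    printf '%s -> %s\n' "$path" "${out#/}"
done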
I am in the process of writing an rsync script to run unattended backups of my entire file system to another system located on my local network using ssh and password-less rsa keys.
I absolutely will not use password-less keys with the root account, and this is the limitation preventing me from accomplishing my goal, because rsync requires root to access the / tree and copy it to another location. I decided that if I compiled the script into a binary, I wouldn't have a problem with the password being contained within the binary itself, but from what I've read there is no way to elevate to root and then drop back down to user level from within the script/binary.
I can create the script as the user and use chown to make it owned by root while retaining execute permission for the user, but the ssh login would still be under root and would therefore require either that I am there to enter my password, or password-less keys under the root account, which I reiterate I will NOT do. Currently the script is executed by the user on the machine containing the files to be backed up.
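One compromise sometimes suggested (a sketch under assumptions, not the poster's setup: rsync >= 3.0 on both ends, and a dedicated unprivileged 'backup' account on the target) is to run rsync as root locally but authenticate remotely as a non-root user, so no password-less root login ever exists:
Code:
# Run as root on the source so it can read everything, but log in to the
# target as the unprivileged 'backup' user; --fake-super (applied to the
# remote side with -M) stores ownership in xattrs so the receiver
# does not need root
rsync -aAX -M--fake-super \
    -e 'ssh -i /root/.ssh/backup_key' \
    --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/tmp \
    / backup@backuphost:/backups/myhost/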
I have the following file structure that I would like to add to git.
Code:
These are big directories and I don't need them all checked out; I only need the src directory. After I commit the files in /app/src, they must be pushed to a remote site.
If I want to check out only the src directory to work on, is it important to create a special file structure in git? For example, instead of doing git init on the general app directory, should I do git init in each subdirectory?
Is it possible to check out only part of a file structure in git?
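It is, via sparse checkout (git >= 1.7); a single repository at the top level is fine, with no per-subdirectory git init needed. A sketch, with the URL and branch name as placeholders:
Code:
git clone --no-checkout <remote-url> app
cd app
git config core.sparseCheckout true
echo "src/" > .git/info/sparse-checkout
git checkout master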
The clearly outdated Linux course I'm using tells me that the directory structure for building RPMs is in /usr/src/redhat, but on my Red Hat system there are only /usr/src/debug and /usr/src/kernels folders.
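On newer Red Hat-based systems the build tree moved to a per-user ~/rpmbuild directory, which rpmdevtools can create:
Code:
sudo yum install rpmdevtools
rpmdev-setuptree    # creates ~/rpmbuild/{BUILD,RPMS,SOURCES,SPECS,SRPMS}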
I am using RHEL 4.4. The last time I rebooted my server it generated an error and told me to run the fsck command in repair mode. When I ran it, it fixed some problems, but after that the gdm and X11 services generated an error after showing the login screen and taking my username and password. However, I can log in via PuTTY from a remote system. When I try to make changes, such as creating a directory or file, or even changing any file, I get the error "you can not make changes in read only file system".
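Assuming it is the root filesystem that came up read-only (worth confirming first), it can usually be remounted read-write once fsck has fixed the errors:
Code:
mount                      # check which filesystem is mounted read-only (ro)
mount -o remount,rw /      # remount it read-write, as root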
How do I download all the files from here: [URL]? I am on FreeBSD 7.0, and I tried wget with the -r switch, but it gives me URLs only. Maybe this simply is not an FTP site, I don't know. How can I download all those files with the same directory structure?
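If wget can see the listing at all, a recursive fetch that preserves the layout usually looks like this (the URL is a stand-in for the original link):
Code:
wget -r -np -nH "ftp://example.com/pub/dir/"
# -r  recurse, -np  never ascend to the parent directory,
# -nH do not create a hostname directory locally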
Amarok is nice, and it is currently the only thing I've tried that will actually play audio from network shares without locking up. The only problem is that it doesn't seem to list my library by directory/folder structure, only by album/artist/genre or some variation thereof. Is there anything that does list by folder structure?
I would like to find the command that copies my Eclipse options to another workspace.
It doesn't work, and writing the .metadata/.plugins path manually could be a source of error. Would it be a better idea to create a complete script?
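For reference, the workspace preferences live under .metadata/.plugins/org.eclipse.core.runtime/.settings, so a sketch of the copy could look like this (the workspace paths are examples):
Code:
SRC=~/workspace-old
DST=~/workspace-new
mkdir -p "$DST/.metadata/.plugins/org.eclipse.core.runtime"
cp -r "$SRC/.metadata/.plugins/org.eclipse.core.runtime/.settings" \
      "$DST/.metadata/.plugins/org.eclipse.core.runtime/"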
Is there a way to force rsync to not make directories in its destination directory; i.e., to simply dump all of the files from the source directory directly into the destination without recreating any of the folders that the files were originally in? I tried --no-dirs, but that seems to apply only to empty directories.
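I don't know of a single rsync switch that flattens a tree; a common workaround is to hand the files over individually, e.g. with find (paths are placeholders, and note that files with the same name will overwrite each other):
Code:
find src/ -type f -exec cp -- {} dest/ \;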
Been messing around with Ubuntu 9.10 for the last few weeks and am loving it so far. Been trying to get into the terminal and learn a little something, to no avail. LOL. I have been googling and searching the site today for info on networking. My Linux box is a desktop with my main HDD mounted, holding music, movies, and some other stuff. My intent is to network the two laptops in the house (Windows XP and Windows 7) to the Linux box so I can listen to my music and watch movies when not in the office. I have found some info, mostly involving Samba, and plan to install Samba tonight and fiddle with it. My issue is with security. I have read a few posts saying that if you share files in this manner, the setup is not secure at all. Is this something I should really be concerned about if the folders I share only have my music and videos in them?
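A share can be locked down quite a bit in smb.conf; a minimal read-only sketch (the share name, path, and user are examples):
Code:
[media]
   path = /home/user/media
   read only = yes
   guest ok = no
   valid users = myuser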
What is the best and simplest way to compare two directory structures without actually comparing the data in the files? This works fine: diff -qr dir1 dir2. But it's really slow because it's comparing the files too. Is there a switch for diff, or another simple CLI tool, to do this?
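One lightweight approach is to compare just the lists of paths on each side:
Code:
diff <(cd dir1 && find . | sort) <(cd dir2 && find . | sort)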
I will be doing actual development and testing on the same machine as the server. It is a single-user machine in the sense that I will be the only one working on it. There will be multiple hosted languages, specifically PHP and RoR, with possible expansion later. I'd like the setup to translate well to a production environment. With those three things in mind, there are a couple of things I've had in the back of my mind. Seeing as it's a single-user machine, I haven't been able to decide whether I should be working on things out of my home directory or whether they should be located outside of it. I feel that outside of a user directory would be better, as it would translate better to a production environment, but I'm also not sure whether that will come with any permission annoyances or concerns, seeing as I'll be working on the same machine. Hosting multiple languages seems like it may be a bit quirky: with PHP you're generally just dumping the project somewhere in the document root, whereas with something like a Rails app you have the entire project and you only want the public directory in the document root.
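One layout I've seen suggested for exactly this mix (an assumption, not a standard; the names are examples) keeps each app under /srv and points the web server at the right subdirectory:
Code:
/srv/www/
    myphpapp/        # PHP: DocumentRoot points directly here
    myrailsapp/
        app/ config/ ...
        public/      # Rails: only this directory is exposed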
I have to set up my development environment under OEL 5.4. The internal directory structure is more or less fixed; the question is where to put the whole application tree in the standard FHS for the development phase. For production, the system will be deployed on a server where data is kept in an Oracle 11g database. Development is done with JDeveloper, which has its own internal application/project directory structure; however, I would like to be independent of this. Unfortunately, after carefully studying the web for FHS guidance, if I adhere to it I only find ready-made packages to put under /opt, for example; no development aspects are mentioned. Please make some proposal, possibly with respect to best practices.
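For what it's worth, the candidate locations the FHS itself offers boil down to something like this (directory names are examples):
Code:
/opt/myapp/           # FHS: add-on application software packages
/srv/myapp/           # FHS: data served by this system; often used for web apps
~/projects/myapp/     # per-user development tree, outside the FHS proper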
dir1/subdir1, dir1/subdir2, etc., and at the lowest level they contain all of these JPEGs that I need. The problem is that I only need some of them. They're named like this:
pic1.jpg pic1_med.jpg pic1_small.jpg pic2.jpg pic2_med.jpg etc.
I want to just grab the ones without the size suffix and copy them all to another set of folders while preserving the directory structure. The numbering starts at 1 in each low-level subdirectory, so I think the directory structure is the only way to keep them from getting mixed up.
I know that cp has a recursive option, -r, but how do I extract only the ones without the underscore? And then how do I preserve the directory structure when I move them over?
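GNU cp's --parents flag recreates each file's relative path under the destination, so combined with a find filter on the name (paths below are placeholders):
Code:
cd /path/to/pics
find . -type f -name '*.jpg' ! -name '*_*' \
    -exec cp --parents -- {} /path/to/dest \;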
I made some modifications to the source code of a program called "HomeBank". I'm not able to make a setup file using Inno Setup. How do I create an .EXE file so the source code can be executed?
I need to create a folder for every single file in a directory, ideally giving the folder the same name as the file it will contain. Is it possible to do this via the terminal?
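A small bash sketch; note that a directory cannot share the exact same name as a file sitting next to it, so this names each folder after the file minus its extension (adjust to taste):
Code:
for f in *; do
    [ -f "$f" ] || continue
    d="${f%.*}"
    [ "$d" = "$f" ] && d="$f.dir"   # file had no extension; avoid a name clash
    mkdir -p -- "$d"
    mv -- "$f" "$d/"
done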
I have just been bothered by a fairly small issue for some time now. I am trying to search (using find -name) for some .jpg files recursively. This is a Red Hat environment with bash.
I can get the search done, but I need to copy ALL of the matches into a separate folder AND keep the folder hierarchy intact after copying.
For example, if I find a JPG file under /home/usr/new/1/, then the destination also needs to be /test/old/new/1/.
At the moment, I am simply putting all files directly under /test/old/, and I can't get the trailing /new/1/ folder path created under /test/old/.
I understand this could well be done using a while or if-else loop, but if someone can just guide me with a hint, I would be really grateful.
I will complete the rest of the steps myself; I am asking here because I am still not comfortable with shell/bash scripts yet and am planning to get really good at them over the next couple of months.
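A sketch with a loop, as hinted at above (the paths are from the example):
Code:
cd /home/usr
find . -type f -name '*.jpg' | while IFS= read -r f; do
    mkdir -p "/test/old/$(dirname "$f")"   # recreate the relative folder path
    cp -- "$f" "/test/old/$f"              # then copy the file into it
done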
I just bought a USB flash drive. Whenever I click it to open its contents, it gives me the error "Unable to mount USB20FD", and under that it says "Error creating moint point: No such file or directory". (By the way, it really does say "moint point" and not "mount point", which is kind of weird.) How can I use the flash drive I just bought?
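Until the automounter behaves, the drive can usually be mounted by hand; the device name /dev/sdb1 below is a guess, so check dmesg or sudo fdisk -l after plugging it in:
Code:
sudo mkdir -p /media/usb
sudo mount /dev/sdb1 /media/usb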
Is it possible to create a text-based menu layout in bash that can be browsed through? The menu list should look something like this:
----------------------------------------
user:   root
colour: blue
number: 4
animal: dog
----------------------------------------
At the start the cursor should blink at the "r" of "root" so that text can be entered. When pressing Enter, the cursor should go to the "b" of "blue", and so on. The important thing is that all the text stays visible, including the part beyond the cursor position.
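A minimal sketch using read's pre-filled editing (the -i option needs bash >= 4); for full cursor-driven menus, dialog or whiptail would be the usual tools:
Code:
#!/bin/bash
user=root colour=blue number=4 animal=dog
echo "----------------------------------------"
read -e -p "user:   " -i "$user"   user     # value is shown and editable
read -e -p "colour: " -i "$colour" colour   # Enter moves to the next field
read -e -p "number: " -i "$number" number
read -e -p "animal: " -i "$animal" animal
echo "----------------------------------------"
echo "user=$user colour=$colour number=$number animal=$animal"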