I need to create subdirectories in about 300 existing directories - the subdirectory will have the same name in all 300 existing directories. How do I do this with the mkdir command, using a regular expression or globbing?
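A glob alone won't create paths that don't exist yet, but a short loop over the existing directories does the job. A minimal sketch, assuming the 300 directories are the immediate children of the current directory and the new subdirectory is called newsub (a placeholder name):
Code:
for d in */; do
    mkdir -p -- "${d}newsub"
done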
I want to make a webserver with multiple users allowed to log in through SFTP to a specific folder, www. Multiple users are added, let's say user1 and user2, both belonging to the www-data group. The www directory has owner www-data and group www-data.
I have used chmod -R 775 on the www folder, but when I create a folder test through my SFTP server (using FileZilla), the group of the newly created directory has only r and x permissions. As a result, I cannot log in as the second user, user2, and create a directory within www/test, because the group lacks the w permission.
I also tried using chmod 2775 on the www directory, but without luck. Can somebody explain how I can make a newly created directory inherit the parent directory's group permissions?
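For what it's worth, the setgid bit (the 2 in 2775) only makes new entries inherit the group; the group write bit still comes from the creating user's umask, which is why the SFTP-created folder ends up g=rx. One approach is a default ACL that forces group rwx on anything created below www. A sketch, assuming the filesystem supports ACLs and that www lives at /var/www/www (a placeholder path):
Code:
chgrp -R www-data /var/www/www
chmod -R 2775 /var/www/www            # setgid: new entries inherit the www-data group
setfacl -R -d -m g::rwx /var/www/www  # default ACL: new files/dirs get group rwx
An alternative is to relax the SFTP umask itself, e.g. by passing -u 002 to internal-sftp in sshd_config.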
I can do: mkdir messages and then: touch messages/hello.txt. Is there a command that will do both - create the directory if it doesn't exist, and then the empty file? Something like: touch -p messages/hello.txt
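There is no single flag I know of that does both, but chaining the two commands (or wrapping them in a tiny function) gets the same effect. A minimal sketch; the function name touchp is made up:
Code:
mkdir -p messages && touch messages/hello.txt

touchp() {
    mkdir -p -- "$(dirname -- "$1")" && touch -- "$1"
}
touchp messages/hello.txt
GNU coreutils also has install -D -m 644 /dev/null messages/hello.txt, which creates the leading directories and an empty file in one go.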
I am trying to exclude multiple directories when using tar. I can do it for just one directory with --exclude=directory. I can also do it for multiple directories by typing that option again and again. As you can see, I'm trying to pass in a variable that holds any number of directories separated by spaces, but when run it doesn't work! It does work, however, if I put just one directory in the variable. Any ideas?
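The usual stumbling block here is word splitting: a plain string variable either expands as one big word or splits in a way that separates the directory names from --exclude. In bash, an array where each element is a complete --exclude option is more robust. A minimal sketch with made-up directory names:
Code:
EXCLUDES=(--exclude=./tmp --exclude=./cache --exclude=./logs)
tar czf backup.tar.gz "${EXCLUDES[@]}" .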
I have one file called test.sh and in that file I have the code below. All this code is, is paths to three directories (as you can clearly see!).
Code:
#!/bin/bash
BACKUP="Documents /bin /sbin"
Now I have this other file which reads the directories (by using $BACKUP) and creates a tar file of everything in those folders. What I am unsure how to do is write a bit of code that will simply look in test.sh, read all the directories, and print a line saying either that they all exist or that some are missing. If possible it would be good to know which directories are missing too!
I have fiddled around with using -d, but I can only get it to work for one directory, or by manually writing out each directory.
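One way to do the check is to source test.sh and loop over the words in $BACKUP with test -d. A minimal sketch, assuming test.sh only sets BACKUP as shown above and the paths contain no spaces:
Code:
#!/bin/bash
. ./test.sh    # pulls in BACKUP="Documents /bin /sbin"

missing=""
for dir in $BACKUP; do
    [ -d "$dir" ] || missing="$missing $dir"
done

if [ -z "$missing" ]; then
    echo "All directories exist"
else
    echo "Some directories are missing:$missing"
fi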
I have 60+ directories, each containing multiple .doc files. I need to move them to a single directory and keep their file names intact. I don't think cp will do that without listing all the file names. I was thinking of something like: cp -r /dir/*.doc /newdir . Or should I use a combo like find -type *.doc|cp /newdir?
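find can handle the recursion and hand each match to mv, so nothing needs to be listed by hand. A minimal sketch, with /dir and /newdir as placeholder paths:
Code:
find /dir -type f -name '*.doc' -exec mv -- {} /newdir/ \;
One caveat: files with the same name in different source directories will overwrite each other in /newdir; mv -n (or -i) can guard against that.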
I found a script on Webmaster World that mostly does what I need it to, but I have been making modifications to tailor it to my specific needs. I know that //..*/ tells awk to ignore hidden directories; how do I define more directories to ignore (e.g. temp, var, etc.)? I've tried playing with prune before the awk command with limited success... I know that there are many ways to do the same thing, and I keep running into brick walls.
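Without seeing the whole script it is hard to be precise, but the usual pattern for skipping extra directories is to -prune them in find before the output ever reaches awk. A sketch, using temp and var as the example names to skip:
Code:
find . \( -name '.?*' -o -name temp -o -name var \) -prune -o -type f -print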
I have hundreds of directories in various subdirs that I need to remove. I want to remove all of these dirs, but can only find solutions on how to remove files (or how to remove subdirs from within the current dir).
I think I need something like
find -iname 'testfile*' | xargs rm -i
where I want to remove every directory that contains the word 'testfile' within the directory name. I know xargs won't work for dirs.
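Letting find do the deleting sidesteps the xargs problem entirely; -prune stops find from descending into a directory it is about to remove. A minimal sketch (run it with -print instead of -exec first to preview what would be deleted):
Code:
find . -type d -iname '*testfile*' -prune -exec rm -r -- {} +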
I'd like to move a selection of files from all the sub-directories within an overall directory to a single destination. I don't want any of the directory structure, just the files themselves. This is what I tried so far:
mv /dir1/*/igs*.sp3.Z /dir2
There are other .sp3.Z files in the * directories within /dir1, but I just need the ones that start with igs.
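If the files sit more than one level down, or the glob expands to more arguments than mv will take, find is the more reliable route. A minimal sketch using the same placeholder paths:
Code:
find /dir1 -type f -name 'igs*.sp3.Z' -exec mv -- {} /dir2/ \;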
I'm not sure if this is possible or even where to start. I assume that this can be done with an sh script using tar or similar. I have several very large zip files that contain images for all of the products in my online store. Each image is named after its 13-digit SKU (for example, 9987788000012.jpg). In order to import products into my store, all images are placed into a media directory. Unfortunately, there are over 100,000 images.
So I would like to break the images into sub-folders based on file name. For example, when I extract store_images.zip (or tar or whatever), my extract script would create directories (if they don't already exist) based on the first three digits of each image name, placing each image into the appropriate bottom-level directory. For example, "9987788000012.jpg" would be placed in the directory "media/9/9/8", with media as the root and "8" as the directory that holds any images that start with "998". Perhaps two sub-folders would be less cumbersome. Assuming this requires a script, particularly since it involves scanning image names, creating folders, and saving images to specific directories, which language would serve my needs best? PHP? Has anyone had to do something similar?
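This does not necessarily need PHP: a plain shell loop can derive the directory from the file name with substring expansion. A rough sketch, assuming the images have already been unpacked into the current directory and the target root is media (both assumptions):
Code:
#!/bin/bash
for f in *.jpg; do
    name=${f%.jpg}
    dest="media/${name:0:1}/${name:1:1}/${name:2:1}"   # 9987788000012.jpg -> media/9/9/8
    mkdir -p "$dest"
    mv -- "$f" "$dest/"
done
For two levels instead of three, drop the last path component from dest.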
I have 5 FTP users that upload files (and subdirectories) in their home directories. I need to mirror these directories between them and with a "master" directory (accessible from a 6th user). Files can contain spaces or other special characters. All the files are on the same filesystem, and I want to use hard links because I don't want to waste 5 times the space of a single file. I tried with find, but I cannot handle spaces in it.
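One space-safe way to build the hard-linked mirror is cp with -l, which links instead of copying and never word-splits file names. A rough sketch with placeholder paths for one of the five users:
Code:
mkdir -p /home/master/mirror/user1
cp -al /home/user1/uploads/. /home/master/mirror/user1/
If you do want to stay with find, the space-safe pattern is find ... -print0 piped into xargs -0 or a while read -r -d '' loop.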
I am attempting to copy a set of sub folders from their multiple parent directories to a new location.
For example, I have three folders to copy:
I would like them to be copied to:
In actuality there are many folders besides folder1, folder2, folder3, and no numerical order exists. So, the folder named 'photos' would be copied into a folder named after its parent in the new location. I would need this to occur for all folders in the '/home/user' directory.
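A loop over the parent directories can recreate each parent's name at the destination before copying. A minimal sketch, with /home/user as the source root and /backup/photos as a made-up destination:
Code:
for parent in /home/user/*/; do
    [ -d "${parent}photos" ] || continue          # skip parents without a photos folder
    name=$(basename "$parent")
    mkdir -p "/backup/photos/$name"
    cp -r "${parent}photos/." "/backup/photos/$name/"
done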
Description: I am a newly appointed system engineer taking care of Linux servers. We have a new set of data coming in which needs the configuration below. How do I write a script with a function for this?
For files with ".txt" in sm, copy each of the files to the folders sm1 and sm2 (log every copy). If successful: remove the original and log it to the log file. If not successful (i.e. copying one particular file to all the folders fails): retain the file and retry, log it to the log file, and mail the admin with that particular file name.
I have already tried a bit:
Code:
cd /export/home/
for dir in sm1 sm2; do
    cp -p sm/*.txt $dir/
done
Is my start right? How do I do the rest?
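That start looks reasonable; the missing pieces are checking cp's exit status per file, logging, and mailing on failure. A rough sketch building on it, assuming a mail command is available and with the log path and admin address made up:
Code:
#!/bin/bash
cd /export/home/ || exit 1
LOG=/export/home/copy.log

for f in sm/*.txt; do
    ok=1
    for dir in sm1 sm2; do
        if cp -p "$f" "$dir/"; then
            echo "$(date): copied $f to $dir" >> "$LOG"
        else
            echo "$(date): FAILED to copy $f to $dir" >> "$LOG"
            ok=0
        fi
    done
    if [ "$ok" -eq 1 ]; then
        rm -- "$f"                    # remove the original only after every copy succeeded
        echo "$(date): removed original $f" >> "$LOG"
    else
        # file is retained; the next run of the script retries it
        echo "copy failed for $f, file retained for retry" |
            mail -s "copy failure: $f" admin@example.com
    fi
done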
I have just installed an SSD as a secondary hard drive and formatted it as ext4 (the Ubuntu installation is on a different drive). How would I go about creating a directory on the SSD that is owned by the user 'Test user'?
I am setting up an SVN server (svn+ssh) that will be used by students at the university where I work. In the beginning I was considering one single repository, eventually creating directories for each project inside the repository. It now seems to me that this is not a very secure way of doing things. The directory on the server would have permissions 770, which means that every student can come onto the server and wipe out the whole repository.
Also, mistakenly or not, every student can 'svn delete' the whole repository, which could be a nightmare to recover from. One option might be to create groups, assign users to groups, and then create many repositories, each assigned to a group. This means that I would have to manage tens or hundreds of repositories - maybe not a very common task. What is an optimal solution for this working environment?
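If you do go the one-repository-per-project route, most of the per-repository setup can be scripted: create the repo and a Unix group for it, then set group ownership and the setgid bit so new revision files stay group-owned. A rough sketch with made-up names (run as root):
Code:
PROJECT=proj1
GROUP=svn-$PROJECT
REPO=/srv/svn/$PROJECT

groupadd "$GROUP"
svnadmin create "$REPO"
chgrp -R "$GROUP" "$REPO"
chmod -R g+rwX,o-rwx "$REPO"
find "$REPO" -type d -exec chmod g+s {} +
# usermod -aG "$GROUP" somestudent   # add each project member to the group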
I am trying to use rsync to back up some directories, and I have the following problem: some files have permissions that prevent me from running rsync under my own user ID. So I run it as root using the option "-a", which according to the man page should preserve the permissions, owner and group information:
However, when I run this under root, the directories created in the backup location get user root and group root while ordinary files keep the original user and group. What am I missing here? How can I get rsync to preserve the user and groups for all files, including directories?
Here is a command to illustrate my problem:
Code:
sudo rsync -a /home/youruser /tmp
If you try that and terminate it with Ctrl-C after a few seconds, there will be a directory /tmp/youruser where the directories contained within are owned by user root and group root.
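For what it's worth, rsync only fixes up a directory's ownership and permissions once it has finished processing that directory, so interrupting the run with Ctrl-C is exactly the case that leaves directories as root:root. One thing worth trying is simply letting the same command run to completion and then checking the ownership again:
Code:
sudo rsync -a /home/youruser /tmp     # let it finish this time
ls -ld /tmp/youruser /tmp/youruser/*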
I have just installed an SSD as a secondary hard drive and formatted it as ext4 (the Ubuntu installation is on a different drive). I'm very new to Linux; could someone tell me how I would go about creating a directory on the SSD that is owned by the user 'Test user'?
I'm sorry if this is a daft question; I'm just moving from Windows to Linux and struggling a lot.
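Assuming the SSD is already mounted (say at /mnt/ssd, a placeholder) and that 'Test user' has an account name along the lines of testuser (a guess), it is just mkdir followed by chown:
Code:
sudo mkdir /mnt/ssd/testdir
sudo chown testuser:testuser /mnt/ssd/testdir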
I am wondering how I would go about creating a .deb file that would extract its contents to two different folders. I have one set of files that should be extracted to /opt and a second that should be extracted to the current user's home folder and /etc/skel - how would I go about doing this?
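A binary .deb is built from a staging tree whose layout mirrors the target filesystem, so files meant for /opt and /etc/skel simply live under opt/ and etc/skel/ inside the package; dpkg has no notion of "the current user's home", which is what /etc/skel (for future users) or a postinst script would cover. A rough sketch with made-up package details:
Code:
mkdir -p mypkg/DEBIAN mypkg/opt/myapp mypkg/etc/skel

cat > mypkg/DEBIAN/control <<'EOF'
Package: mypkg
Version: 1.0
Architecture: all
Maintainer: Your Name <you@example.com>
Description: example package installing files to /opt and /etc/skel
EOF

cp -r /path/to/app/.       mypkg/opt/myapp/
cp -r /path/to/skelfiles/. mypkg/etc/skel/

dpkg-deb --build mypkg      # produces mypkg.deb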
I have a directory tree with lots of folders. I need to gather all files of the same type, say .txt, and place them in a different folder all by themselves.
I know I can use the mv command, but it won't let me go through all the subdirectories of my folder, just the current one. How can I search through all subdirectories for all .txts or whatever and move them to a folder of my choosing?
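find will walk all the subdirectories and pass every match to mv. A minimal sketch, with /path/to/tree and /path/to/collected as placeholder paths:
Code:
find /path/to/tree -type f -name '*.txt' -exec mv -- {} /path/to/collected/ \;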
I am in need of Linux help. I am at college and I need this backup/restore script to pass the final part of an assessment. I require a backup script that will not only back up but also restore files to the relevant directories. E.g. users are instructed to store all word-processor files in a directory named wp. So I need to create a backup directory and 3 directories within that, plus some files within the 3 directories, and then back them up and restore them. I know I should do this myself, and I have been trying to find and understand info for the last few days, but came up with zero.
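Not a finished answer to the assessment, but a skeleton of the shape such a script usually takes: create the test directories and files, back them up with tar, and restore by extracting the archive back into place. All names below are made up:
Code:
#!/bin/bash
BASE=$HOME/backuptest
ARCHIVE=$HOME/backup.tar.gz

setup() {
    mkdir -p "$BASE"/{wp,spreadsheet,db}
    touch "$BASE"/wp/letter.doc "$BASE"/spreadsheet/budget.ods "$BASE"/db/data.db
}

backup() {
    tar czf "$ARCHIVE" -C "$BASE" .
}

restore() {
    mkdir -p "$BASE"
    tar xzf "$ARCHIVE" -C "$BASE"
}

case "$1" in
    setup)   setup ;;
    backup)  backup ;;
    restore) restore ;;
    *)       echo "usage: $0 {setup|backup|restore}" ;;
esac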
So I have a shared hosting account with 60 sites all running WordPress.
There is a plugin I want to delete from all 60 WordPress sites.
The plugin is in the same path in all 60 sites.
mysite.com/wp-content/plugins/carter. Is there a way I can search the entire home directory with FileZilla or another FTP client and delete every folder with that name in every site, or do I have to do it the tedious one-by-one way?
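An FTP client cannot really do a recursive search-and-delete across all the sites, but if the host allows SSH access, one find command over the home directory covers it. A sketch (run the -print form first to confirm the matches):
Code:
# list the candidate directories first
find ~ -type d -path '*/wp-content/plugins/carter' -print

# then delete them
find ~ -type d -path '*/wp-content/plugins/carter' -exec rm -r -- {} +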
I currently have Samba set up and connecting. What I am trying to do is have multiple users with access to different directories. For example, let's say there are folders A, B and C on my Linux machine. I want one guy to see A and C, another guy to see B and C, and a third guy to see them all. But I want each user to have permission to change, delete, or execute the files within the directories they have access to.
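In Samba this maps onto one share per directory, with valid users controlling who can see each one and read only = no (plus the underlying Unix permissions) controlling writes. A rough sketch of the smb.conf additions, with all user names and paths made up:
Code:
cat >> /etc/samba/smb.conf <<'EOF'
[shareA]
   path = /srv/samba/A
   valid users = alice, carl
   read only = no

[shareB]
   path = /srv/samba/B
   valid users = bob, carl
   read only = no

[shareC]
   path = /srv/samba/C
   valid users = alice, bob, carl
   read only = no
EOF
Each of those users also needs a Samba password set with smbpasswd -a, and the config has to be reloaded afterwards (e.g. smbcontrol all reload-config or restarting smbd).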