Basically, I'm in the terminal and I type in: cd desktop (or downloads or whatever) and nothing happens. I'm probably just being legendarily thick, but where am I going wrong?
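If the shell complains that there is no such file or directory, the likely culprit is case sensitivity; on most distributions the standard home folders are capitalized. A minimal check, assuming the default folder names:
Code:
ls ~              # list your home directory to see the exact names
cd ~/Desktop      # note the capital D; Linux paths are case-sensitive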
Well, I am facing a problem when doing lab questions.
I must use DLXLinux bundled in Bochs (bochs.sourceforge.net).
I am required to use the /usr/local directory.
In the /usr directory there is no directory named 'local', but there is one entry called 'local@'. So when I try to use the mkdir command to create a 'local' directory in /usr, I get an error: "cannot make directory.....".
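The trailing '@' is the marker that ls -F appends to symbolic links, so /usr/local most likely already exists as a symlink, which is why mkdir fails. A quick way to check where it points:
Code:
ls -l /usr/local    # a symlink shows as: local -> <target>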
I'm quite new to Linux, but I have configured a simple FTP server and it's working great. I have an FTP-Shared folder with upload and download subfolders. Under uploads and downloads I have identical category subfolders like mp3s, movies, software etc. in both. As the guys upload, I would like to create a crontab line that moves all the content under /FTP-Shared/upload/mp3/* older than 14 days to FTP-Shared/downloads/mp3/ recursively (like with the cp command), but the timestamp must be checked on the first-level directory, not on the files inside it, for example: /mp3/Club Dance/CD1/Hallo world.mp3
This is how far I got:
Code:
[root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ \;
This command moves the directories and files, but not recursively.
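A sketch of one way to do this, using the paths from the command above: restrict find to the first directory level with -maxdepth 1, and let mv handle the recursion (mv always moves a directory together with everything inside it):
Code:
/usr/bin/find /FTP_Shared/upload/Mp3s/ -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec mv -f {} /FTP_Shared/download/Mp3s/ \;
Note that -mtime on a directory tests the directory's own modification time, which is what the question asks for.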
Is there any command in Linux that will find a particular word in all the files in a given directory (and in the folders below it) and replace it with a new word?
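There is no single built-in command for this, but a common combination is find plus GNU sed with its -i (in-place) option. A sketch, where /path/to/dir, oldword and newword are placeholders:
Code:
find /path/to/dir -type f -exec sed -i 's/oldword/newword/g' {} +
The {} + form passes many files to each sed invocation instead of spawning one per file.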
I want to apply a command to all the PDFs in a directory and save each line of output to a text file, i.e. I want to collect each line containing "DOI" from every PDF into one text file. I am unable to understand bash scripting enough to write a for loop for this.
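A minimal sketch of such a loop, assuming pdftotext (from poppler-utils) is installed; the output file name dois.txt is just a placeholder:
Code:
for f in *.pdf; do
    pdftotext "$f" - | grep DOI >> dois.txt   # '-' sends the extracted text to stdout
done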
On the command line there's a key (Tab) for autocompleting commands, and I'm pretty sure Linux has a key that prints the last directory I used/typed?
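Two standard bash features fit, depending on what is meant:
Code:
cd -     # jump back to the previous working directory
# Alt+.  (or the history word !$) recalls the last argument of the previous command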
I am struggling to learn the command line, and am stuck on the following. In my directory ~/Music I have many music archives, about 0.8 GB in total. Yet, changing to this directory and giving ls -ldh, I get
ioannis@ioannis-laptop:~/Music$ ls -ldh
drwxr-xr-x 4 ioannis ioannis 4.0K 2011-03-04 14:55
So, only 4K in size and no info about the number of files in the directory.
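This is expected: with -d, ls reports the directory entry itself (one 4K filesystem block), not its contents. The standard commands for what was probably intended:
Code:
du -sh ~/Music      # total size of everything under ~/Music
ls ~/Music | wc -l  # count of entries in the directory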
I want to delete a directory with its files, and I want to do that as follows: rm -r dirToDelete. Unfortunately, I always get asked for EACH single file whether I want to delete it, because it is write-protected. Is there a way to suppress this prompt so that the whole directory with its contents just disappears?
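The standard answer is rm's -f (force) flag, which suppresses the prompt for write-protected files; use it with care, since there is no confirmation at all:
Code:
rm -rf dirToDelete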
I have been playing around with the tar command and I know this is how to use it.
Code:
tar -cf [filename] [directory]
But what I want is to make an archive from the current directory. I thought I could just leave out the directory argument, but that doesn't work; I get an error about creating an empty archive. How do I tell it to use the current directory?
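Tar needs an explicit file argument; the conventional way to name the current directory is a dot. A sketch (the archive path is a placeholder; writing it outside the directory being archived keeps tar from trying to include its own output):
Code:
tar -cf /tmp/archive.tar .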
I need to extract the absolute directory from the type command when I pass it a program name. E.g.
Code:
>type cat
cat is hashed (/bin/cat)
There is one other case (I believe):
Code:
>type lpr
lpr is /usr/bin/lpr
I thought of using regex, but it returns the whole line, not just the match. In addition, there is no option cited in the man page for type that returns just the command directory.
Note: this is part of my solution to a programming assignment in bash shell scripting.
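One sketch in plain bash, using bash's type -p, which prints only the path of the disk file that would be executed and so covers both the hashed and the plain case:
Code:
path=$(type -p lpr)    # e.g. /usr/bin/lpr; empty if lpr is not a disk command
dir=${path%/*}         # strip the file name, leaving /usr/bin
echo "$dir"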
I want to install a program (specifically Metasploit) and have it accessible from any directory in the terminal. I have it installed correctly, but I have to change to the directory it is installed in to run it (with ./msfconsole). I want to be able to be in any directory, just run "msfconsole", and have Metasploit start. Do I have to copy the Metasploit folder to the /opt/ directory? Or maybe to /usr/bin/?
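There is no need to copy the whole tree into /usr/bin; the usual approaches are a symlink in a directory already on PATH, or extending PATH itself. A sketch, where /opt/metasploit is an assumed install location to adapt:
Code:
sudo ln -s /opt/metasploit/msfconsole /usr/local/bin/msfconsole
# or, in ~/.bashrc:
export PATH="$PATH:/opt/metasploit"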
Ubuntu 10.04. As part of my nightly backup script I archive my home directory with the following command:
Code:
tar -cvpzf /quitelarge/_mirror/mirror1/home-ken.gz /home/ken 2>> /quitelarge/_mirror/tar-error.log
It seems to work fine and I have recovered files from the archive on occasion. Actually I keep 7 rolling daily backups and a monthly burn to DVD. I had an sftp connection made by Nautilus to my server. Ubuntu for whatever reason places an icon on the desktop showing the connection. When I ran the script it decided to archive everything on my server - all 1.4 TB. I caught the problem when home-ken.gz was about 5 GB. I stopped the process, closed the sftp connection, rolled back the backups and tried again. This time I got a file of the expected size - about 45 MB.
In the error log I did find that the tar process was trying to suck the entire contents of the server into the archive file.
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/asound/ICH5/pcm0c/sub0: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/asound/ICH5/pcm0c: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/asound/ICH5: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/asound: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/scsi: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/acpi/event: Cannot open: Permission denied
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/acpi/fadt: Cannot open: Permission denied
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/acpi/dsdt: Cannot open: Permission denied
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/irq/21/smp_affinity: Cannot open: Permission denied
Is there an option I can place on the tar command to tell it NOT to follow the ssh connection which is sitting on my desktop? The closest thing I see in the documentation is -h which tells tar to "follow symlinks; archive and dump the files they point to." I am NOT specifying -h so if the ssh connection is treated as a symlink by tar I would still not expect the remote contents to be tarred.
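The catch is that the GVFS connection is not a symlink but a FUSE mountpoint under ~/.gvfs, so -h is irrelevant here. Two standard GNU tar options address it; a sketch using the paths from the log above:
Code:
tar --exclude=/home/ken/.gvfs -cvpzf /quitelarge/_mirror/mirror1/home-ken.gz /home/ken
# or stay on the local filesystem, skipping anything mounted below it:
tar --one-file-system -cvpzf /quitelarge/_mirror/mirror1/home-ken.gz /home/ken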
I have a serious problem on my VPS. I ran "yum update" and then hit Ctrl+C to cancel when I realized that I needed to specify a package to update, not all of them. But when the prompt returned I suddenly lost the connection, and when I try to reconnect to the machine it says /bin/bash not found!! Even when I try to issue commands from the VPS control panel, it reports that the commands are not there.
I can open a ticket with the ISP to resolve this, but I need to know the risks before I do. I have no backup of MySQL and two live web applications; although they are still running, I am afraid that if I restart the VPS everything will be gone.
# Create a directory and a user; assign ownership of the dir to that user and user group.
sudo mkdir /mysecureddir
sudo useradd mysecureduser
sudo chown mysecureduser:mysecureduser /mysecureddir
[code].....
I've read some similar issues dealing with Apache, but it's still not clicking for me. The group has rwx access to the directory and everything in it, and I'm in the group.
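One detail worth checking (a guess, since the code above is elided): group membership is read at login, so a group you were added to does not apply to an already-open session until you log in again. Standard commands to verify:
Code:
id -nG                 # groups of the current session
ls -ld /mysecureddir   # owner, group and mode of the directory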
So I am trying to put together a simple command that, when executed from the project folder, will run the appropriate hg/svn command in each project, i.e.:
[Code]...
Since the client has many such projects, I am instead looking for a solution similar to find -exec, where the svn/hg commands are automatically executed on each first-level match (i.e. svn up is run in the project/a folder but not in project/a/subfolder). How can such a command be constructed?
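A sketch along those lines, assuming each first-level directory is either an svn or an hg checkout, detected by its .svn/.hg metadata directory:
Code:
find . -mindepth 1 -maxdepth 1 -type d -exec sh -c '
    cd "$1" || exit
    if   [ -d .svn ]; then svn up
    elif [ -d .hg  ]; then hg pull -u
    fi
' sh {} \;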
What command will provide you with the number of files in your current directory? Choose one answer.
A. ls -c
B. ls | wc -w (this one)
C. ls -n | count
D. ls -wc (this one?)
Why would rsync insert a user's home directory path in variable expansion when run via cron, but not when run manually? The gory details... Red Hat Enterprise Linux AS release 4 (Nahant Update 6), Linux 2.6.9-67.0.20.ELsmp. The script (parts anyway, and simplified)...
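Cron runs jobs with a minimal environment (typically only HOME, LOGNAME, SHELL and a short PATH), so variables that expand one way in an interactive shell can expand differently under cron. A standard debugging step is to capture and compare the two environments:
Code:
env | sort > /tmp/env-interactive.txt    # run by hand in a shell
# temporary crontab entry (remove after one run):
* * * * * env | sort > /tmp/env-cron.txt
diff /tmp/env-interactive.txt /tmp/env-cron.txt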
I am totally new to Linux, as I have worked mostly on an RTOS (Symbian). My problem is that I need to find the file IOSTREAM.H, and I am following the commands below: 1) cd / 2) find . iostream.h (finds the file/directory from the current path). It shows "No such file or directory".
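The find invocation above is missing the -name test, and file names are case-sensitive. The usual form, with errors about unreadable directories discarded:
Code:
find / -iname iostream.h 2>/dev/null   # -iname matches IOSTREAM.H too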
I'd like to copy a file, say widgets/water.txt, to all subfolders in the folder widgets using a single command. So if the folder widgets has 10 subfolders like widgets/blue, widgets/green, etc. I'd like to copy water.txt to all of them with one command.
I tried the commands
Code:
cp water.txt ./*/water.txt
cp water.txt ./*/
However these don't seem to work. The latter gives 'cp: omitting directory' errors.
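cp only accepts a single destination, so the usual one-liner is a loop over the subdirectories, run from inside the widgets folder:
Code:
for d in ./*/; do cp water.txt "$d"; done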
When I log on as root and attempt to issue the freshclam command to update the virus definitions, it attempts to create a new file with the definition name. I get a message stating that the directory isn't writable. The user and group access rights are as follows:
User = read, write, execute
Group = read, write, execute
All = read, execute
The only way I can get around this is by applying 777, which would be read, write and execute for all. Now, I have a group defined with several user IDs in it, including root. How do I associate that group with the directory/file so I don't have to apply 777 and users in the group can still issue the freshclam command?
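The missing piece is usually group ownership: the directory must be group-owned by that group, and then mode 775 already grants the group write access without opening it to everyone. A sketch, where the group name 'clamusers' and the database directory path are assumptions to adapt (/var/lib/clamav is a common ClamAV default):
Code:
sudo chgrp clamusers /var/lib/clamav   # give the group ownership of the directory
sudo chmod 775 /var/lib/clamav         # rwx for owner and group, r-x for others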