I have a newly built Red Hat 5.5 box. When connecting over VPN and executing either "ls -al" or "df -h", the command prompt hangs without returning any output. I get the same result with both the TeraTerm and PuTTY clients. However, other commands run fine. Also, local users are able to run "ls -al" or "df -h" successfully. Storage is local disk. Any idea what could be misconfigured?
I'm trying to rsync files and directories from a RedHat Linux host (v 4.5 & 4.7) to a Windows Server 2003 R2 Standard Edition machine running cygwin. I'm executing the rsync command from the cygwin shell. The transfer involves rsync'ing approximately 1 TB of data from the Linux server to the Windows server. After about 280+ GB of data, the transfer just dies.
There seems to be no particular file or directory that the transfer stops at. I'm able to rsync GBs of data from other Linux hosts to this cygwin server with no problem; files and directories rsync fine. The network infrastructure is essentially the same regardless of which server is being rsync'ed: gigabit Ethernet running through Cisco gigabit switches. There appear to be no glitches or hiccups across the network path.
I've asked the folks at rsync.samba.org if they know of any problems or issues. Their response has been neutral, in that if the version of rsync that cygwin has ported is within standards, then there is no rsync reason this problem should happen. I've asked the cygwin support site if they know of any issues and they have yet to reply. So, my question is whether the version of rsync that is ported to cygwin is standard. If so, is there any reason cygwin & rsync keep failing like this?
I've asked the local rsync-on-Linux gurus and they can't see any reason this should fail from a Linux perspective. Apparently I am our company's cygwin knowledge base by default.
How would I limit this to searching for the text 'SomeString' or 'SomeOtherString', but only if the file has the extension .php, .inc or .js? Also, what does piping to xargs do here? I don't understand how this command actually works.
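A minimal sketch of one way to do this, assuming GNU find, xargs and grep (the strings and extensions are just the ones from the question):

Code:
# find emits only the .php/.inc/.js files; xargs batches those names onto
# grep command lines; -print0/-0 keep filenames with spaces intact
find . -type f \( -name '*.php' -o -name '*.inc' -o -name '*.js' \) -print0 \
    | xargs -0 grep -l -e 'SomeString' -e 'SomeOtherString'

Piping to xargs simply turns the list of paths on find's standard output into arguments for grep, so grep runs on many files per invocation instead of once per file.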
I have a requirement to list files using the find command. My folder contains the below list of files without extension. I need to exclude only ABC.123.* type files and list the others. Even though some files containing MNO also contain this pattern, they should not be excluded. And if a file ends with .txt or .doc it should not be excluded either; that is, ABC.123.1234.txt should not be excluded. But I am not getting what is required. Can anyone please let me know if I am doing anything wrong? As per my requirement I cannot use grep or the -regex attributes to the find command.
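One hedged sketch that stays within plain -name tests (no grep, no -regex); the starting directory is assumed to be the current one:

Code:
# list everything except files whose whole name matches ABC.123.*,
# unless the name also ends in .txt or .doc (those stay in the listing)
find . -type f ! \( -name 'ABC.123.*' ! -name '*.txt' ! -name '*.doc' \)

Because -name matches against the entire file name, something like MNO.ABC.123.x is not caught by the ABC.123.* pattern and is still listed.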
It's a very basic question but I am not getting it right now. I have to list all the PDF files on my desktop, even the PDF files which are present in folders on the desktop. ls *.pdf only lists the files present on the desktop itself, not the files inside the folders on the desktop that contain PDF files.
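A minimal sketch, assuming the desktop lives at ~/Desktop and the names all end in .pdf:

Code:
# descend into every folder under the desktop and print the PDFs
find ~/Desktop -type f -name '*.pdf'

Unlike ls *.pdf, find recurses into subdirectories on its own, so no extra globbing is needed.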
I've got a Fedora 10 server with a simple read-only samba share. I'm able to mount and browse the share from a Fedora 12 client, but all directories appear as empty, and I can see on the server that they contain many files. This happens whether I browse using smbclient or mount using mount.cifs. I've got the smb/nmb ports enabled on both the client and server. File permissions on the server look right. The server smb.conf setup:
I think that, as a result of a script that started duplicating files in a loop, the allotted capacity on my VPS filled up with multiple nested copies of the same files... After a reboot, I could delete most of them, but rm stalled in certain directories...
After isolating which ones, I found this: a directory listing that lists the files and at the same time tells me they are not there!
Code:
ls
ls: cannot access userkey.php: No such file or directory
ls: cannot access workshop.php: No such file or directory
ls: cannot access quiz.php: No such file or directory
ls: cannot access webservice_rest.php: No such file or directory
If I run ls -R1 I get a recursive listing of all files under the current directory. However, if I do ls -R1 *.avi, i.e. I want to search only for files with the .avi extension, I get an error: "ls: cannot access *.avi: No such file or directory". So it seems I am using ls incorrectly. What's the correct way to use wildcard pattern matching with the -R switch? Or maybe that isn't possible?
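The shell expands *.avi against the current directory only, before ls even runs, so ls -R never sees the pattern. A minimal sketch of the usual workaround with find:

Code:
# match *.avi at any depth below the current directory
find . -type f -name '*.avi' -exec ls -l {} +

Here the pattern is quoted so that find, not the shell, does the matching in every subdirectory.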
I'm using rsync to create a mirror of the data files on our main server every day. I've looked at the man page and can't see it: can I get a listing of the files that have been changed or added on the mirror once it has completed? Can it just log what it's doing to a file?
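A hedged sketch, assuming a reasonably recent rsync; the paths and log location here are made up for illustration:

Code:
# -i (--itemize-changes) prints one line per changed/added file,
# --log-file copies that output into a file you can review later
rsync -av -i --log-file=/var/log/mirror-rsync.log /data/ mirror:/data/

The itemized lines (and the log file) show exactly which files were transferred, updated or created on each run.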
Suppose that in my current directory I have 50 sub-directories. Now, I am interested in only about 20 of those sub-directories (whose names match a pattern). I would like to recursively list the contents of those 20 sub-directories. How do I do that? I would like to do this on Solaris 10 and Linux (RHEL 5.x).
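A minimal sketch, assuming the interesting sub-directories share a name pattern like proj* (the pattern is made up for illustration); it relies only on shell globbing plus ls -R, so it behaves the same on Solaris 10 and RHEL:

Code:
# the shell expands proj* to the matching sub-directory names,
# and ls -R then recurses into each one
ls -lR proj*

Any plain files that happen to match the pattern are listed too, which is usually harmless.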
I'm trying to make a shell script that will list the 50 newest files in a directory with several subdirectories in it. I've been trying with the find command with no luck, and now I've figured I should probably use ls. The problem is that when I do "ls -lRt | head -50" it works one directory at a time. It does not first build the full list and then sort it: it displays all items in the first directory, sorted, then the next directory is sorted and displayed. So I figured I have to sort the whole output of ls before I limit it with head. This is where I am at now: ls -lRt | sort <something clever here> | head -50
Only doing "| sort |" will sort it by name, if I understand it right, and I don't know how to solve that. Here's also my first attempt, if that is of any interest or help; this one filtered on the status-change time of the files (so some lists got very large): find $ftpDir -ctime $time -type f -print > $ftpFileLs. These lists did not get sorted by time and I could not find any way to do so. Any help on this would be appreciated since I'm sort of stuck now. After reading the manuals for every option I can think of, there's still just a big blur in my head..
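A hedged sketch, assuming GNU find (as on RHEL) and re-using the $ftpDir variable from the attempt above; it avoids ls -R's per-directory grouping by stamping every file with its modification time and sorting the whole list at once:

Code:
# %T@ = modification time in seconds since the epoch, %p = path;
# sort newest first, keep 50, then strip the timestamp column
find "$ftpDir" -type f -printf '%T@ %p\n' | sort -rn | head -50 | cut -d' ' -f2-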
Firefox opens the file listing instead of Nautilus. When I access a folder via "Places" -> "Home Folder" or "Places" -> "Downloads", Firefox opens and lists the contents of the directory. I have re-installed Nautilus and un-installed Firefox, and after that going to "Places" -> "Home Folder" or "Places" -> "Downloads" launches Nautilus and I can view the contents normally. Has anybody else had this problem with Firefox? Does anybody know how to fix it? Running Ubuntu Desktop 9.10 64-bit.
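This kind of behaviour usually comes down to which application is registered as the handler for the inode/directory MIME type. A hedged way to inspect and reset it, assuming the xdg-utils tools are installed (they normally are on Ubuntu):

Code:
# show which .desktop file currently handles folders
xdg-mime query default inode/directory
# point folder handling back at Nautilus
xdg-mime default nautilus.desktop inode/directory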
I once tried transferring web files from Redhat 9 to the /var/www/html/ directory on CentOS 5.5. OK... LAMP is installed and working fine: httpd/apache is running and mysqld is installed without a root password. Now I want to transfer it again, from CentOS 5 to Fedora 13, to secure the database. But when I try browsing the web to check the web files I can only see these:
Is it possible to write a script that, when run from the terminal, outputs a tree listing of files and folders without using the tree command? For example, by controlling the output of ls -l so that it is printed like a tree (-- or /-).
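A minimal bash sketch of the idea (the script name and the |-- indentation style are just one possible choice); it uses nothing beyond the shell itself:

Code:
#!/bin/bash
# treeish.sh (hypothetical name): print a rough tree of files and folders
walk() {
    local dir="$1" indent="$2" entry
    for entry in "$dir"/*; do
        [ -e "$entry" ] || continue           # skip the literal * in empty dirs
        printf '%s|-- %s\n' "$indent" "$(basename "$entry")"
        [ -d "$entry" ] && walk "$entry" "$indent    "
    done
}
walk "${1:-.}" ""

Run it as ./treeish.sh /some/dir; with no argument it starts from the current directory.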
Right now I have an HP DL180 server with a 130 GB hard disk and 8 GB RAM after RAID 0+1. I want to configure a domain controller server for my office for 200 to 300 users. What partition sizes should I use on my 130 GB hard disk, and is that going to be sufficient for me?
I am a bit confused about the /usr, /var and /boot partitions, as I need to manage them well within the 130 GB.
If I go with 4 GB swap and the remainder for "/", will that be fine? Or do I need to specify partition sizes separately for /tmp, /var and /usr?
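Just as an illustration of one way the 130 GB could be split (these sizes are assumptions made for the sake of example, not figures from the thread):

Code:
/boot   200 MB
swap      4 GB
/var     20 GB    # logs and spool tend to grow on a busy server
/tmp      4 GB
/        remainder (~100 GB, with /usr left on the root filesystem)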
Trying to install Ubuntu 10.10 Server via a USB flash drive. I can't use a CD-ROM as the server doesn't have a CD drive! lol! It boots up and then hangs at the loading stage where the Ubuntu symbol is displayed.
I am handling a mail server on Red Hat; the MTA in use is qmail. For the last month it has been getting overloaded and hanging... In the morning, when it is time for users to log in to mail through the web interface, the sudden logins overload it and the server hangs, and we need to restart the server... We did not get time to kill https either; sometimes the MP port works to save us... I still have not found a command to overcome the overload on the server...
I am trying to install RHEL 5.5 server on a Dell OptiPlex 360. When it reaches "checking dependencies in packages selected for installation", it hangs. What should I do? I checked the cables, hardware, etc. and everything looks OK.
How do I make a server wait on shutdown for a set amount of time? I tried making a service that just does Code: sleep 7m and made it first priority on shutdown, but it seems to ignore it. I'm pretty sure it runs... I'm currently testing that to make sure.
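For reference, a hedged sketch of the kind of SysV init script this usually needs, assuming a Red Hat-style init (the script name is made up, and the chkconfig header is what gives it first priority at shutdown):

Code:
#!/bin/sh
# /etc/init.d/shutdown-delay  (hypothetical name)
# chkconfig: 2345 99 01   -> kill link K01*, so it runs first on shutdown
case "$1" in
    start)
        # on Red Hat-style init the stop action is skipped unless this lock
        # file exists, which is one common reason such scripts get "ignored"
        touch /var/lock/subsys/shutdown-delay
        ;;
    stop)
        sleep 7m
        rm -f /var/lock/subsys/shutdown-delay
        ;;
esac
exit 0

After installing it, chkconfig --add shutdown-delay creates the start and kill links from the header.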
First I want to thank all who worked on this website. I installed squid and other services on my office network server, and installed the same services with the same configuration on a bigger network of about 150 machines. Everything works well on the office network, but the other network has a problem: squid hangs and stops serving any requests, and the squidclient command returns an "alarm clock" message after a long time, yet the service is still running on port 3128 and its status is started. When I run /etc/init.d/squid reload it works again for a while and then hangs again. The squid log files do not show any problem. I installed my server on Gentoo with flags