General :: Monitoring Copy Progress Of A Large File?
Sep 15, 2010
Is there a clever way to monitor the progress (as a percentage or hash marks) of copying a large file (using pv could be an option)? For example, monitoring the progress of a copy command such as this:
Code:
cp linux.iso /tmp/
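One approach (a rough sketch, reusing the file name above as a placeholder) is to let pv perform the copy itself, since it knows the size of a regular file and can report percentage, rate, and ETA:
Code:
# pv reads the source and prints a progress bar while the shell writes the destination
pv linux.iso > /tmp/linux.iso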
View 2 Replies
Jun 22, 2010
Is there any good tool in GNU/Linux that copies files like cp, but also shows progress and limits speed (and can change the limit without interruption), like pv?
Prototype:
Code:
find source_directory | cpio -H newc -o | pv -s `du -bs source_directory/ | awk '{print $1}'` | (cd /destination/directory && cpio -di)
There is also rsync -aP source_directory /destination/directory/, but that shows progress bars per file and the rate can't be changed after it has started. Or maybe I should just write a wrapper for pv/cpio? Done.
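A hedged sketch of what such a wrapper could look like, assuming pv's -s (size) and -L (rate limit) options as in current releases; the 5m cap and the paths are placeholders. Newer pv versions also have a -R/--remote option for changing the rate of an already running pv, but check your version before relying on it.
Code:
# one progress bar for the whole tree, capped at 5 MB/s
tar -C source_directory -cf - . \
  | pv -s "$(du -sb source_directory | awk '{print $1}')" -L 5m \
  | tar -C /destination/directory -xf -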
View 2 Replies
View Related
Aug 24, 2011
I want to transfer a huge file (60GB) over NFS on Linux. Is cp the best option?
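cp will do it, but as a hedged alternative sketch, rsync can show progress and resume an interrupted transfer (the names below are placeholders):
Code:
# --partial keeps a half-copied file so a rerun picks up where it stopped
rsync -ah --partial --progress bigfile.img /mnt/nfs_share/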
View 1 Replies
View Related
Mar 26, 2010
I'm trying to copy a 6GB file across from my laptop to an external USB drive, but it quits at about 4.2GB every time with a "file size limit exceeded" error. I have checked the output of ulimit -a and there is no limit there on the file size. I'm using the Slax live CD for this as it always gets the job done.
View 8 Replies
View Related
Mar 22, 2010
I don't know if this is a Slackware-related issue, but I have the following problem. I'm running slackware64-current on my system. For my private data I'm using a QNAP NAS (some ARM CPU with Linux kernel 2.6.22), with the file shares provided over NFS. I mount them with
Code:
mount -t nfs 192.168.0.2:/Public /mnt/qnap
Works fine, no problems. But now, if I try to copy some large files (> 1GiB) to the NAS share, sometimes the system completely freezes during the copy process. I have to do a hard reset to bring the system back to work.
View 5 Replies
View Related
Jan 3, 2010
I have some large image files that are 30GB and more. I am running Ubuntu 9.10, and whenever I try to copy one of these files to another drive I get an error saying the file is too large. I am copying from an external hard drive, and a slave drive does the same thing. I have a friend who has run into the same issue. This must be a widespread bug.
View 9 Replies
View Related
Aug 19, 2010
I am new both here and in Linux. As the subject says, I would like to learn how to copy a directory (not a file) from the terminal with a progress bar showing. The copy is local, i.e., not to another computer. My distro is CentOS 5.5. I know that if I do it with Nautilus I would see the progress, but I want to learn how to do it from the terminal. I know that the pv command can show a progress bar, but from what I saw, it works well for files, not for directories (recursive).
Is it possible to use pv for directories? If yes, could you please show me the syntax? I also saw that some people mentioned rsync can show a progress bar; I tried it, but it didn't work out - perhaps I got the syntax wrong. If rsync can really be used to copy directories with a progress bar, could you show me the syntax? Any other ideas on how to do it? I would like ideas that do not involve using any script, i.e., just something that I can do using the regular commands.
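pv works on a stream rather than a directory, so the usual trick is to pipe a tar stream through it; a hedged sketch assuming GNU tar and du, with placeholder paths (the destination directory must already exist):
Code:
# one progress bar for the whole directory copy
tar -C /path/to/srcdir -cf - . \
  | pv -s "$(du -sb /path/to/srcdir | awk '{print $1}')" \
  | tar -C /path/to/destdir -xf -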
View 6 Replies
View Related
Mar 10, 2010
I always wanted cp to have a progress bar for large files. I came across this: [URL]... I just wonder how you could install it as an Arch package. Is it possible?
View 9 Replies
View Related
Sep 9, 2010
OK, so I'm running Mint (not full Ubuntu), and I'm not sure if this is a problem with the FS, the kernel, or what. I'm running Linux Mint 9 - x64 - kernel 2.6.35.14 - and when I did a large file copy operation (9GB) it froze up my system until the copy operation was done. I couldn't even use Pidgin, Mozilla, or anything; when I tried to open another terminal it froze as well.
View 1 Replies
View Related
Sep 10, 2010
I have seen this three times now - it's an updated Lucid with ext4, trying to copy to a 500GB USB drive?
View 3 Replies
View Related
Mar 9, 2009
After searching I can't find a script that can deal with directories. All the scripts I found work file to file and not directory to directory. Does anyone know of a script that can handle this situation?
View 8 Replies
View Related
Mar 1, 2011
How can I copy a read-only file in Linux (Ubuntu 10.04) and make the copy writable with a single cp command? The --no-preserve and --preserve options seemed to be good candidates, except that they appear to "and" the mode flags, while what I am looking for is something that will "or" them (add +w mode).
More details: I have to import a repository from Git to Perforce. I want all Perforce depot files to be read-only (that is how Perforce was designed), while all other files that were derived/copied from depot files are writable. Currently, if a Makefile copies a read-only file, the derived file will also be read-only. This leads to build errors when cp tries to overwrite the read-only file a second time. Of course --force is a workaround here, but then the derived file is also read-only. Also, I do not want to mess with "chmod" after each "cp" command - I will do that only as a last resort.
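A hedged sketch of the usual suggestions: with GNU cp, --no-preserve=mode is supposed to give the copy default (umask-based) permissions instead of the source's read-only mode, and install(1) can set an explicit mode in one step. The file names are placeholders; if --no-preserve=mode still comes out read-only on your coreutils version, chmod +w after the copy remains the fallback.
Code:
# let the destination get default (normally writable) permissions
cp --no-preserve=mode depot_file.c derived_file.c

# or set the mode explicitly while copying
install -m 644 depot_file.c derived_file.c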
View 1 Replies
View Related
May 12, 2010
I am using RHEL 5. I have a very large test file which cannot be opened in vi. The file has some 8000 lines. I need to view the lines between 5680 and 5690. How can I view these particular lines in a large file? What command and options do I need to use?
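A couple of standard ways to pull out just that range (the file name is a placeholder):
Code:
# print only lines 5680-5690
sed -n '5680,5690p' bigfile.txt

# same idea with head and tail
head -n 5690 bigfile.txt | tail -n 11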
View 1 Replies
View Related
May 21, 2011
I have an application which generates logs like this:
Code:
2011-05-17 13:21:27 - Msg 2402
File loading terminated.
File information: 3 records in input file found
3 records processed
0 records skipped
Load statistics: 3 messages loaded correctly
0 messages ignored
0 messages with errors
Details:
Destination OK ignored errors correct incorrect not sel. other
house Server (def 3 0 0 0 2 1 0
2011-05-17 13:21:27 - Msg 2410
Archiving information: File /path/to/xxx.txt
was archived as /path/to/xxx.txt.
Now I want to monitor this "house Server (def" line and send an alert based on the counters 3 0 0 0 2 1 0 -
say, if [ "$5" -gt 0 ] || [ "$6" -gt 0 ]; then
<send email>
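A hedged sketch of one way to wire that up with awk and mail; the field numbers ($8 and $9 below, counting "house" as $1) and the mail command are assumptions that would need adjusting to the real log layout and local mailer:
Code:
# collect any "house Server (def" lines whose chosen counters are non-zero
bad=$(awk '/house Server \(def/ && ($8 > 0 || $9 > 0)' /path/to/app.log)

# only send mail when something was actually found
if [ -n "$bad" ]; then
    echo "$bad" | mail -s "Load alert on $(hostname)" admin@example.com
fi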
View 6 Replies
View Related
Sep 2, 2010
I'm running ASSP on Ubuntu 10.04.1 it's mostly working fine. I have one problem which has been bugging me for some time. I don't want to filter outbound mail, but if I can relay (proxy) my outbound mail through ASSP, then it can automatically add to the whitelist.
As ASSP is a proxy, I need a server to send it to once ASSP receives it. I've tried my ISP, but this failed and they weren't willing to confirm if a connection attempt was received at their end.
Current setup
Inbound
mx -> router -> ASSP -> Exchange 2003
Outbound
Exchange 2003 -> mx
I'd like to setup outbound as either
Exchange 2003 -> ASSP -> <ISP> SMTP relay
Exchange 2003 -> ASSP -> <relay running on Ubuntu eg postfix>
Can anyone help me with troubleshooting steps, or a better suggestion for how I can set this up? I'd love to know why my ISP setup didn't work, but I don't know a tool for monitoring IP traffic on Ubuntu Server. In Windows I use Wireshark; is there an equivalent I can set up on Ubuntu, or a tool I can use in Windows which will show all traffic? The Ubuntu and Windows servers are on the same Netgear switch, and I'm not sure it's smart enough to copy all traffic to another port for monitoring.
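For the traffic-monitoring part, tcpdump is the usual command-line counterpart to Wireshark on Ubuntu, and its capture files can be opened in Wireshark afterwards. A hedged sketch; the interface name and relay address are placeholders:
Code:
# watch SMTP traffic to the ISP relay; -n skips DNS lookups
sudo tcpdump -n -i eth0 host 203.0.113.25 and port 25

# or capture to a file and inspect it in Wireshark later
sudo tcpdump -n -i eth0 -w smtp-debug.pcap port 25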
View 4 Replies
View Related
Jun 16, 2011
I want to copy about 40GB to a partition. There are two hard drives in my box; one won't boot, but I can access it and mount its partitions, and I aim to move data from it to a new bootable hard drive. A simple cp command may not be the best way to copy such a large chunk? Also, I want to back up the data I plan to copy using a USB hard drive. I could then paste the data from the backup to the new drive instead of from the old internal drive to the new one - that's another option.
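A hedged sketch of an alternative: cp -a would work, but rsync preserves the same metadata, reports progress, and can simply be re-run if it is interrupted (mount points are placeholders; the trailing slash on the source copies its contents rather than the directory itself):
Code:
# copy everything from the old drive's mounted partition to the new one
rsync -aHx --progress /mnt/olddrive/ /mnt/newdrive/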
View 1 Replies
View Related
Aug 28, 2011
I need to send large files from a Linux machine to another using cryptography. The sender machine knows the recipient IP but not vice-versa. I don't need strong cryptography and prefer higher-speed less-secure solutions.
There are no problems with presharing crypto keys but I'd prefer not dealing with SSH users creation.
I have thought about HTTP PUT over TLS, but I have never used it and would prefer to hear what the possible solutions are. I know that it can listen as a daemon, but I don't know anything about cryptography, so piping through OpenSSL may be a solution.
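One low-overhead sketch along the "pipe through OpenSSL" line, using a pre-shared passphrase file and netcat; the port, file names, and nc flags are assumptions (the -l syntax differs between netcat variants), so treat this as a starting point rather than the solution:
Code:
# receiver (its IP is known to the sender): listen and decrypt
nc -l -p 9000 | openssl enc -d -aes-128-cbc -pass file:secret.key > bigfile.bin

# sender: encrypt and stream
openssl enc -aes-128-cbc -salt -pass file:secret.key < bigfile.bin | nc receiver_ip 9000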
View 2 Replies
View Related
Oct 12, 2010
Greetings from Greece. I tried to install openSUSE 11.3 on an empty disk. Unfortunately the installation progress stops at 88% and the error message says "error copy live image to the disk". I have burned two different CDs but the result is always the same. Is it a hardware problem, or is the CD not correct? I had version 11.2 on the same PC without any problem for a long time.
View 9 Replies
View Related
Jul 28, 2011
I have a large number of log files on a Linux box that I need to cleanse of sensitive data before sending them to a third party. I have used the script below on previous occasions to perform this task, and it has worked brilliantly (the script was built with some help from here :-)).
#!/bin/bash
help_text () {
cat <<EOF
Usage: $0 [log_directory] [client_name(s)]
EOF
[Code]...
However, now one of our departments has sent me a CLIENT_FILE.txt with 425000+ variables! I think I may have hit some internal limit. I have tried splitting the client file into 4 with around 100000 variables in each, but this still doesn't work. I'm loath to keep splitting, though, as I have 20 directories with up to 190 files in each directory to run through. The more client files I make, the more passes I have to do.
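Without the full script it is hard to say exactly which limit is being hit, but a common way around shell argument or loop limits with that many patterns is to turn the client file into a sed script once and make a single sed -f pass per log. A hedged sketch, assuming one client name per line in CLIENT_FILE.txt (names containing other regex metacharacters may need extra escaping):
Code:
# build one substitution per client name, escaping /, & and \ first
sed 's/[\/&]/\\&/g; s/.*/s\/&\/REDACTED\/g/' CLIENT_FILE.txt > redact.sed

# run every log through the generated script in one pass each
for log in /path/to/logs/*.log; do
    sed -f redact.sed "$log" > "$log.cleansed"
done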
View 2 Replies
View Related
Aug 18, 2011
I'm looking for a way to compress a large file (~10GB) into several files that won't exceed 150MB each.
Any thoughts?
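A hedged sketch using gzip and split (file names are placeholders); cat glues the pieces back together on the other end:
Code:
# compress, then cut the stream into 150 MB pieces named bigfile.gz.partaa, .partab, ...
gzip -c bigfile.img | split -b 150m - bigfile.gz.part

# to restore later
cat bigfile.gz.part* | gunzip -c > bigfile.img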
View 2 Replies
View Related
Nov 2, 2009
I work for a school consulting company. We helped a school deploy about 1500 computers. The computers have Windows XP, but we have been using G4L for the restore partition on the drives. So far the software works great. We did, however, run into a problem in that many of the computers we deployed are missing the restore partition. The reason they are missing is long and convoluted and not really that important. What I have been charged to do is try to fix the restore partition problem. One solution I had, which I'm not even sure will work, was to back up the recovery file that G4L created to DVD and write a basic script to recreate the partition and then copy the file over. This process would need to be as automated as possible, since this disc will be inserted by the end user (the students). The backup file that G4L created is 5.9GB, so it won't fit on just one disc, and dual-layer discs are too expensive to use for this project, so the file will either need to be compressed again (not sure if that's a good idea or not) or split across two DVDs.
I have searched the forums here and was not able to find anything to fix this problem. I was able to find some info on splitting files across two discs, but I'm not sure how to use that to fix my problem.
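One hedged approach for the 5.9GB image: split it into DVD-sized chunks, burn one chunk per disc, and have the restore script cat them back together before recreating the partition (file names are placeholders):
Code:
# cut the recovery image into pieces that fit on single-layer DVDs
split -b 4300m recovery.img recovery.img.part

# on the target machine, after copying both pieces off the discs
cat recovery.img.part* > recovery.img
md5sum recovery.img    # compare against the original image's checksum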
View 5 Replies
View Related
Aug 6, 2009
After untarring a MySQL file (very large) I'm trying to find where the file listed below has gone. I did a search on the file name:
find / -name 'mysql-qui-tools-5.0' -print
But I can't find the file.
-rwxr-xr-x root/root 9651820 2007-05-02 11:46:01 mysql-gui-tools-5.0/mysql-query-browser-bin
View 6 Replies
View Related
Jan 1, 2010
Having a bit of an issue with Debian Squeeze and transferring files to the Sony PSP. I hook the PSP up to the USB port and Debian mounts it. I go to drag a 125MB mp4 to the video folder. The copy window takes about 10 seconds to transfer it. I exit USB mode and there is no video there. I go back into USB mode and look at the video folder on the PSP memory stick, and there is no video - it vanished. On another try, after the copy progress window closed I right-clicked the PSP and unmounted it.
It errored, saying the device was busy and could not be unmounted. Looking at the light on the PSP I see the memory stick is still being written to. I wait for the light to stop flashing - about a minute or so - then I am able to unmount it. I go to PSP video and there's the video, ready to be watched. Debian isn't accurately showing the copy progress - it's showing complete when it isn't. I have to watch the light on the PSP to know when it is truly finished.
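The copy dialog reports when the data has been handed to the kernel's write cache, not when it has actually reached the memory stick. A hedged command-line check before unplugging (/media/psp is a placeholder for wherever the PSP is mounted):
Code:
# flush pending writes; returns only once the buffers have been written out
sync

# unmounting also waits for outstanding writes to finish
umount /media/psp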
View 3 Replies
View Related
Dec 20, 2009
I have a car stereo that reads a USB drive with all my music on it; however, to sort through the music it uses a method of finding folders containing music, then displaying them all in a list. I find this interface annoying because, in order to sort the music by artist, I have to manually move it out of the album folders by hand. This takes a long time for 11+ GB of music, so I was trying to use the Linux CLI to quicken the process. I tried a command like this
Code:
mv /media/usb/music/*/*/* /media/usb/music/*/
but for some reason this moves all my music into the last folder alphabetically on my drive. The music is all pre-arranged like this: /media/usb/music/artist/album/song
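The wildcard in the destination expands too, which is why everything ends up in the last directory alphabetically. A hedged sketch that instead handles one artist at a time, moving each album's songs up into the artist folder (watch out for two albums containing identically named files):
Code:
# for every artist directory, pull the songs out of its album subdirectories
for artist in /media/usb/music/*/; do
    mv "$artist"*/* "$artist"
    rmdir "$artist"*/      # remove the now-empty album folders
done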
View 5 Replies
View Related
Apr 16, 2011
I am removing some old graphics from my server, and one of the gallery programs has created two enormous directories that I cannot even open with FTP.
I tried to tar each directory; the first came out to about 37GB and the second keeps failing (it's bigger, one would assume).
How can I archive and split these into smaller files?
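A hedged sketch that streams tar straight into split, so no single 37GB archive ever has to exist on disk (the directory name and chunk size are placeholders):
Code:
# archive the directory and cut the stream into 1GB pieces as it is produced
tar czf - gallery_dir | split -b 1024m - gallery_dir.tar.gz.part

# later, reassemble and unpack
cat gallery_dir.tar.gz.part* | tar xzf -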
View 13 Replies
View Related
May 6, 2010
Every once in a while on a computer I'm ssh'd into, I will accidentally type "cat largefile.txt" and my screen will start rushing with text for the next 10 minutes. I'm always working in a screen session, so my current solution is to just log out and then log back in; since the output goes 100X faster when I'm logged out, it finishes in the short time it takes me to type my password again. Is there a better way, either involving the fact that I'm in a screen session, or a way to do this within SSH? What doesn't work: detaching from the screen session (it doesn't respond until the file is done outputting); trying the command to move to a different window in the screen session (also doesn't respond); typing Ctrl+C to kill the cat command (also doesn't respond, probably because the command is already done and the buffers just have to catch up).
View 3 Replies
View Related
Mar 22, 2010
I have installed CentOS 5 and can print small to medium lpr files using CUPS fine (1 to 20 pages), but when I try to print a file of 95 pages the printer just stops; I have to power off the printer and turn it back on, and it will start printing again. It looks like some data is lost in this process. It may print 20 pages and stop. When restarted it may print 20 or 40 pages, or complete the report. I can print to the device directly and it works fine. It is only when the large print jobs are run through the spooler. I have tried different printers with the same results; that's what makes me think it is a spooler problem.
View 2 Replies
View Related
Sep 28, 2009
I have on my Windows machine several hundred files in the .nc/.ncs formats for a CNC machine. I need to convert them to .txt, which is as easy as opening one in Notepad and saving it as .txt, but there are so many that this kind of action would take way too long.
The reason I am writing to LinuxQuestions is that I would feel more comfortable loading a live CD and using some sort of terminal command to do this than downloading one of the many "freeware" type programs I have found for Windows (even more so since I have had a rootkit before and had to start all the way over to get rid of it).
I need to know:
1. Is this possible to do from the terminal without super advanced knowledge?
2. Can someone please point me in the right direction - something to read, or an example?
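For the second point, a hedged sketch of the kind of loop that would do it from a live CD terminal, assuming the files only need their extension changed (the copies are written alongside the originals):
Code:
# run inside the directory holding the CNC files
for f in *.nc *.ncs; do
    [ -e "$f" ] || continue      # skip the literal pattern if nothing matches
    cp -- "$f" "${f%.*}.txt"     # keep the original, write a .txt copy next to it
done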
View 2 Replies
View Related
Nov 26, 2010
I'm trying to manipulate a large text file full of records (metadata - one complete record per line). I need to delete every line on which certain words appear - there are five different words, all pretty simple all-caps strings with occasional whitespace. I tried using grep -v, which worked a treat, but only string-by-string. Ideally I'd like to run this as grep -v -f, where the file targeted by the -f contains the strings I need to match in order to delete the lines they're in.
i.e. grep -v -f filecontainingSTRINGS.txt targetfile.txt > outputfile.txt
When I try this, however, I don't get any matches - or more specifically, no changes are made in the output file. It works fine if there's only one string in filecontainingSTRINGS, but it doesn't work if there's more than one (I'm using newline as the delimiter). (Also my machine doesn't recognise /usr/xpg4/bin/grep - no idea what that's all about!)
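For what it's worth, a hedged sketch of the intended command: -F treats the patterns as fixed strings rather than regexes, and running the pattern file through tr first guards against stray carriage returns, a common reason a multi-line pattern file silently stops matching:
Code:
# strip any DOS line endings from the pattern file
tr -d '\r' < filecontainingSTRINGS.txt > patterns.clean

# delete every line containing any of the strings
grep -v -F -f patterns.clean targetfile.txt > outputfile.txt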
View 5 Replies
View Related
Jul 13, 2011
I need to remove a large binary file (a PDF) that is embedded in a large log file which is generated daily. This is seriously hogging space on our servers. I need to remove the large PDF from the logs to make them smaller and manageable.
I need to take out the text (or binary data) between the strings
<my:PDF> and </my:PDF>
<applicationForm> and </applicationForm>
<image> and </image>
<extractedSignature> and </extractedSignature>
I am not sure whether the sed utility can do this. These are large files and need to be pruned. I am not seeking logrotation advice, just a script or command that can strip these large logs of the text between the tag pairs above. I am not sure how to achieve this with sed, tail, head, tr or any other facility.
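sed can do it; a hedged sketch that drops each block between the tag pairs, inclusive, assuming the opening and closing tags sit on lines of their own (file names are placeholders):
Code:
# delete the tag pairs and everything between them, for all four kinds of block
sed -e '/<my:PDF>/,/<\/my:PDF>/d' \
    -e '/<applicationForm>/,/<\/applicationForm>/d' \
    -e '/<image>/,/<\/image>/d' \
    -e '/<extractedSignature>/,/<\/extractedSignature>/d' \
    big.log > big.log.pruned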
View 2 Replies
View Related