Server :: Speed Up Single Large File Copying?

Apr 22, 2010

I'm planning to copy a production MySQL InnoDB data file from one server to another; the file is around 300GB. Because the file changes constantly, I have to shut down the MySQL instance and copy the large data file to the other server as quickly as possible, so I need a way to speed up the copy. I'm wondering whether there's a way to copy the file block by block, skipping any block whose content already matches on the destination side.
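
One common way to get exactly this block-by-block, skip-if-identical behaviour is rsync's delta-transfer algorithm. A rough sketch is below; the hostname and the data-file path are placeholders (it assumes the default single InnoDB data file), and the first pass is run while MySQL is still up purely to pre-seed the destination so the downtime pass has something to diff against:

Code:
# Pass 1: pre-seed while MySQL is still running (the copy will be inconsistent,
# it only exists to give rsync a baseline on the destination)
rsync -av --inplace /var/lib/mysql/ibdata1 otherserver:/var/lib/mysql/ibdata1

# Pass 2: shut MySQL down, then re-run; rsync now checksums both sides
# and only transfers the blocks that changed since pass 1
/etc/init.d/mysql stop
rsync -av --inplace /var/lib/mysql/ibdata1 otherserver:/var/lib/mysql/ibdata1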

View 4 Replies



Ubuntu :: Copying A Large File From The Network?

Feb 17, 2010

I am trying to copy a file of about 4.5 GB from a network resource on my Local Area Network. I copy this file with the GNOME file manager by going to Places --> Network, selecting the Windows share on another computer on my network, opening it, and copying the file to my FAT32 drive with Ctrl+C and Ctrl+V. It copies fine up to 4 GB and then it hangs.

After trying it almost half a dozen times I got really annoyed, left it hung, and went to bed. The next morning when I checked, a message box saying "file too large to write" had appeared.

I am very annoyed. I desperately need that file. It's an ISO image and it is not damaged at all; it copies fine to any Windows system. I also have sufficient space on the drive I am trying to copy it to.
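
The hang at 4 GB is almost certainly the FAT32 per-file size limit (a single file on FAT32 cannot exceed 4GiB minus one byte, no matter how much free space the drive has), not damage to the ISO. If the drive has to stay FAT32, one workaround is to split the image into chunks and rejoin them on the other side; the filenames below are placeholders:

Code:
# Split the ISO into 2GB pieces: image.iso.partaa, image.iso.partab, ...
split -b 2G image.iso image.iso.part

# On the destination machine, reassemble and verify
cat image.iso.part* > image.iso
md5sum image.iso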

View 8 Replies View Related

Ubuntu :: Root Filesystem Fills Up When Copying A Large File?

Mar 17, 2010

I was just copying a large (50GB) file from one mounted partition to another mounted partition (a USB drive), but before the operation completed, my root filesystem, on a separate partition, filled up. Because it filled up, I also couldn't get past the login screen when I rebooted; I think this is because there is no room for temporary files. I'm expanding the root partition as a temporary fix. How can I avoid my root filesystem filling up when copying a massive file between mounted partitions? It seems the file is being cached on the root partition during the transfer.

View 3 Replies View Related

Ubuntu Servers :: System Crash When Copying Large File

Jun 15, 2010

I am having a bit of a problem with my Ubuntu Server 10.04 install. I think it might be a kernel problem. Basically, what happens is when I copy a large file (a 160GB disk image) to my drive (>60GB) the system consistently crashes after about 60GB of the file is transferred. It doesn't matter if I am sending the file using cifs, or over SSH. Checking syslog (paste dump here), it seems these flush errors always appear shortly before the crash occurs. The destination filesystem is a hardware RAID 10 array with 2TB of space. It is formatted as EXT4.

View 7 Replies View Related

Ubuntu :: Copying Large File Blocks Externally Causes Disk Full

Oct 15, 2010

This has happened several times now, with 9.10 and 10.04. I back up my photos periodically to external drives, using Nautilus. At the next attempted login, GNOME won't start and sometimes gives a "power manager incorrect installation" error.

The first time this happened I was stumped and eventually did a clean install. The second time, I found advice elsewhere in this forum to solve it by emptying root's trash, which did the trick. This time, however, root's trash has nothing in it and the two users' trash folders were insignificant (I emptied them all anyway with rm -r). I tried looking for enormous directories but couldn't find a smoking gun. I would rather not end up doing another clean install - a painful and extreme solution. I'm continuing to look for solutions to the immediate problem, but my question really is: what causes this, and how do I prevent it in the future? I've run Computer Janitor regularly and ran apt-get clean, but no help. Should I do all my large-scale copying from the terminal? I'm not a total noob, but close.
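
Before another clean install, it is worth finding out what is actually filling the root filesystem. A quick check, staying on the root filesystem only (-x) so the external drive and other mounts are not counted, plus a look at root's own trash folder, which is a usual suspect after copies or deletes done as root:

Code:
# Largest top-level consumers on the root filesystem (sizes in KB)
sudo du -x --max-depth=1 / 2>/dev/null | sort -n | tail -n 15

# Trash belonging to root from file-manager operations run as root
sudo du -sh /root/.local/share/Trash 2>/dev/null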

View 9 Replies View Related

Fedora :: Join Multiple Mp3s Into A Large Single File?

Apr 15, 2011

Is there an app that runs in Fedora, with a GUI, that can join multiple MP3s into a single large file?

I know Audacity can do this, but it's not really suitable for multiple files and I have voice notes that are in 10 minute chunks.
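
If a GUI is not strictly required, MP3 files can usually just be concatenated, because each MPEG frame carries its own header; the main side effect is that ID3 tags between the chunks stay embedded and players may report the wrong duration for VBR files. A minimal sketch, assuming the voice notes sort correctly by filename:

Code:
# Join the chunks in filename order into one file
cat note-*.mp3 > combined.mp3

# mp3wrap (if it is available in your repositories) does the same job
# while writing a proper wrapper around the pieces
mp3wrap combined.mp3 note-*.mp3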

View 2 Replies View Related

Slackware :: NFS - How To Speed Up Large File/folder Transfer - Write - Speeds?

Dec 13, 2010

About NFS.

Server:

Client(s):

Code:

I have followed Robbie Workman's HowTo [url]

Reading and writing work absolutely fine with small files, but writing large files to the server is tediously slow. The exported directories use the options (rw,no_subtree_check).

What is your experience with NFS, and how can I speed up large file/folder transfer (write) speeds?
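
Two things commonly dominate NFS write speed: synchronous exports (the default) and small transfer sizes. A hedged example of the usual tuning knobs; the export path, client network, and sizes are placeholders, and note that async trades safety for speed because the server acknowledges writes before they reach disk:

Code:
# /etc/exports on the server
/srv/share 192.168.1.0/24(rw,async,no_subtree_check)

# Client mount with larger read/write sizes over TCP
mount -t nfs -o rw,tcp,rsize=32768,wsize=32768 server:/srv/share /mnt/share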

View 5 Replies View Related

Ubuntu :: Error "File Too Large" Copying 7.3gb File To USB Stick

Nov 24, 2010

I am trying to copy a 7.3GB .iso file to an 8GB USB stick, and I get the following error when it hits 4.0GB:

Error while copying "xxxxxx.iso". There was an error copying the file into /media/6262-FDBB. Error splicing file: File too large. The file is to be used by a Windows user, and I'm just trying to do a simple copy, not a burn to USB or anything fancy. Using 10.04.1 LTS, AMD dual core, all latest patches.
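
As with the FAT32 threads above, "File too large" at exactly 4GB is the per-file limit of the FAT32 filesystem the stick ships with, not a space problem. If the stick only has to carry this one ISO to a Windows machine, reformatting it with a filesystem that allows large files is the simplest fix; a sketch, assuming the stick is /dev/sdX1 (double-check the device name first, this erases the stick):

Code:
# NTFS is readable and writable by Windows, and by Linux via ntfs-3g
sudo umount /dev/sdX1
sudo mkfs.ntfs -Q -L USBSTICK /dev/sdX1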

View 2 Replies View Related

Server :: Difference After Copying Large Directory To A New Directory?

Apr 4, 2010

I have a RHEL 5 server. The ABC directory is 57GB, but after taking a backup on the same disk under the name ABC.bkp it shows 56GB. I used the command below to copy/back up: # cp -r ABC ABC.bkp (the sizes differ after copying). I checked both directory sizes with # du -sh ABC and # du -ks ABC.bkp, and in both GB and KB there is a large difference (about 200MB). Why does this happen when copying? What is the solution? What is the correct way to copy one directory to a new directory exactly?
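
A difference of a couple of hundred MB after cp -r is usually not lost data: sparse files, hard links, and different block allocation all change what du reports, and du -sh versus du -ks round differently as well. A hedged way to copy the tree more faithfully and then compare contents rather than allocated size:

Code:
# Archive mode preserves permissions, ownership, timestamps and symlinks
cp -a ABC ABC.bkp

# or, also preserving hard links and sparse files:
rsync -aHS ABC/ ABC.bkp/

# Compare the actual contents of the two trees
diff -r ABC ABC.bkp && echo "trees are identical"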

View 4 Replies View Related

Server :: Scp Truncate Text File Busy - Copying File Is Not A Running Binary?

Jun 14, 2010

I am having problems with scp during a backup operation. I added a ps -ef before and after the scp operation used during the backup. The backup is a script that backs up a Zimbra server. I am including the code segment that I am having problems with:

Code:
# DRCP Section. To scp newly created archives to a remote system
if [ "$DRCP" = "yes" ]

[code]...

View 3 Replies View Related

General :: Copying Large Number Of Files From One Directory To Another

Feb 10, 2010

I have a directory containing around 2.8 lakh (280,000) files. I want to move them to another directory. If I use cp or mv I get the error 'argument list too long'. If I write a script like

for file in $(ls); do
    cp "$file" /path/to/destination/
done

then the performance degrades because of the ls command. How can I do this efficiently?
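
The 'argument list too long' error comes from the shell expanding * into more arguments than the kernel allows, and looping over ls output is both slow and fragile. Two common alternatives, sketched with placeholder paths, either hand the file list to cp in large batches or let a single process walk the tree:

Code:
# find feeds cp in batches; no shell expansion, no ls
find /source/dir -maxdepth 1 -type f -exec cp -t /destination/dir {} +

# or move everything in one pass
rsync -a --remove-source-files /source/dir/ /destination/dir/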

View 7 Replies View Related

General :: Copying Large Number Of Files In Windows?

Mar 15, 2011

I am facing a problem copying a large number of files, about 18 lakh (1,800,000), from my personal hard disk to another hard disk. Each file is very small and the folder is around 3.95 GB in total. Copying the files with the copy function provided by Windows is frustrating, and I am not even able to compress the folder; it gives me an error that it is not readable. The other problem is that I am not able to open this drive in Linux; it shows an error telling me to run chkdsk in Windows, and Windows disk check is also unable to repair the drive and goes into some unresolvable state. Is there any way to open a disk with errors, and if not, is there any way I can copy the data faster? The error reads roughly: "Disk labeled EDU is corrupt; go to Windows, run chkdsk /f there, and reboot into Windows 2 times."

View 3 Replies View Related

Ubuntu :: Everything Freezes When Copying Large Amount Of Data?

May 20, 2010

When I copy a large amount of data, applications other than Nautilus freeze until the copy is done...

What can I do? When backing up data this is really annoying.
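
A frequent cause of the whole desktop stalling during big copies is the kernel letting gigabytes of dirty pages pile up and then flushing them all at once, which starves everything else of I/O. A hedged experiment (the values are illustrative, and the settings revert at reboot unless added to /etc/sysctl.conf) is to cap the writeback buffers:

Code:
# Start background writeback early and cap total dirty memory
sudo sysctl vm.dirty_background_bytes=16777216   # 16 MB
sudo sysctl vm.dirty_bytes=50331648              # 48 MB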

View 6 Replies View Related

Server :: Remote MySQL Server Connection Dies After Wget Large File

Feb 3, 2011

We have 2 servers: one is the web server and the other is the MySQL server.

When transferring a 2GB file from the web server to the MySQL server, the web server's connection to the MySQL DB server dies completely.

We need to restart the MySQL process in order for it to come back online.

During this connection downtime, using phpMyAdmin on the MySQL server shows no problem running queries, etc.
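
If the 2GB transfer is saturating the link or the disk and starving MySQL's connections, one low-risk experiment (a workaround sketch, not a root-cause fix; the hostnames and filenames are placeholders) is to throttle the transfer so database traffic still gets through:

Code:
# Cap the download at roughly 10 MB/s and allow it to resume if interrupted
wget --limit-rate=10m -c http://webserver.example.com/dump.sql.gz

# rsync over SSH can be throttled the same way (value is in KB/s)
rsync -av --bwlimit=10000 dump.sql.gz mysqlhost:/var/backups/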

View 2 Replies View Related

Fedora Hardware :: Low Transfer Rate When Copying Large Files Over Wireless

Jan 11, 2010

I just bought an HP 3085dx laptop with an Intel 5100 AGN wireless card.
The problem: copying a big file over wireless to a computer hardwired to the router over gigabit only gives an average 3.5MB/second transfer rate. If I do the same copy from my wireless-N MacBook Pro to the same computer, I get a transfer rate of about 11MB/sec. Why the big difference? I noticed the HP always connects to the 2.4GHz band instead of the 5GHz band...

On the HP:
[jerry@bigbox ~]$ ifconfig wlan0
wlan0 Link encap:Ethernet HWaddr 00:24:D6:36:AC:C4
inet addr:192.168.1.75 Bcast:192.168.1.255 Mask:255.255.255.0
inet6 addr: fe80::224:d6ff:fe36:acc4/64 Scope:Link
UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1
RX packets:639243 errors:0 dropped:0 overruns:0 frame:0
TX packets:1293049 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:1000
RX bytes:53832795 (51.3 MiB) TX bytes:1888619922 (1.7 GiB)

[jerry@bigbox ~]$ iwconfig wlan0
wlan0 IEEE 802.11abgn ESSID:"<censored>"
Mode: Managed Frequency:2.412 GHz
Access Point: 00:24:36:A7:27:A3
Bit Rate=0 kb/s Tx-Power=15 dBm
Retry long limit: 7 RTS thr: off Fragment thr:off
Power Management: off
Link Quality=70/70 Signal level=-8 dBm Noise level=-87 dBm
Rx invalid nwid:0 Rx invalid crypt:0 Rx invalid frag:0
Tx excessive retries:0 Invalid misc:0 Missed beacon:0

I am not getting any errors, and I don't know why the bit rate is not known. My AirPort Extreme base station typically reports a 'rate' of 250-300 Mbit for the HP and about the same for the MacBook Pro. The HP is about 6 inches away from the base station. Is there any way to make it go faster? Is there any way to get it to use the 5GHz band?
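
The card will only use 5GHz if the access point advertises a network it can join there, and dual-band routers that broadcast the same SSID on both bands usually let the client pick 2.4GHz. Two things worth trying (hedged suggestions only; the SSID and frequency below are placeholders): give the 5GHz radio its own network name on the AirPort Extreme and join that, or pin the card to a 5GHz channel by hand:

Code:
# Check which frequencies the card reports it supports
iwlist wlan0 freq

# Associate with the 5GHz network explicitly (channel 36 = 5.18 GHz)
sudo iwconfig wlan0 essid "MyNetwork5G" freq 5.18G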

View 3 Replies View Related

General :: Cp Adds Exclamation Points When Copying Very Large Text Files?

Jul 13, 2009

For my research I have some very large files that are basically millions of lines of ten columns of numbers. These files can be up to 5 GB in size. Recently I noticed that when I made a copy of one of my files, some exclamation points appeared in it where there should not be any: in front of random numbers throughout the file. Making another copy of the file would result in exclamation points in front of different numbers in different parts of the file. Doing this many times has given me up to four exclamation points in different parts of the file. Sometimes the file copies just fine without producing any extraneous exclamation points. Additionally, I have occasionally seen a "^K" where there should be a newline (the data that should have been on the next line was instead on the previous line with a ^K in front of it) in copies I have made of my files. I don't know if this is related or not.
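
Corruption that moves around between copies of the same source usually points at hardware (RAM, cable, or controller) rather than cp itself. A quick way to demonstrate the problem and narrow down where it happens is to checksum the files and compare the copies byte by byte:

Code:
# Checksums should match if the copy is faithful
md5sum original.dat copy.dat

# List exactly which byte offsets differ between the two files
cmp -l original.dat copy.dat | head

# If repeated copies of the same source give different checksums,
# suspect memory and run memtest86+ from the boot menu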

View 7 Replies View Related

Server :: Large File Size Cause RPC Authentication Error?

Oct 6, 2009

I think I am having a problem due to an NFS server file-size limit. Is it possible I am missing a parameter in the RHEL NFS setup to handle large files? I am running an NFS server on a RHEL 5.1 machine, and an HP-UX 11.0 machine NFS-mounts that filesystem. The HP-UX machine executes a program that resides locally on it to process a large 35 GB data file that resides on the NFS server. The program on the HP-UX machine can only read/process the first portion of the file until an "RPC: Authentication error" is returned multiple times, until the program prematurely decides that it has reached the end of file.

I tried recompiling the same program to run on the RHEL 5.1 NFS server and access the 35 GB file locally (on the NFS server instead of HP-UX), and the program completed successfully, processing the whole file (about 7 hours of processing) with no "RPC: Authentication error." In addition, I have been running the NFS mount with the same machines for quite some time, but not with such large file sizes.

View 3 Replies View Related

Ubuntu :: 9.04 - Very Slow Copying Speed To Any Device / Drive

Jul 8, 2010

I have installed Ubuntu 9.04, but now it is too slow copying files to any device or even to other drives.

View 3 Replies View Related

Server :: Multiple File Pointers In A Single File?

Apr 20, 2011

I have files a, b, c and d. They're all relatively large and are served up by a static web server optimized for this purpose. I can get requests that look like this:

/abcd
/ad
/bacdac
...

Each request is basically a request for a concatenation of the files in the order of the letters. The list of possible requests is finite, but large enough that disk space will run out very quickly and be very expensive if I create all possible files via concatenation. Is there a way to create a pointer file like abcd that is essentially a multi-file symlink, pointing first to a, then to b, then to c, then to d? So if the contents of the files were as follows:

a: hello
b: there
c: whats

[code]....
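
There is no regular "multi-file symlink" in a POSIX filesystem, so the usual answers are either a FUSE filesystem that stitches the parts together on read, or generating each concatenation on demand and caching it until disk pressure forces eviction. A rough sketch of the second idea; the part and cache directories are assumptions:

Code:
#!/bin/sh
# build_concat.sh NAME  -- e.g. "build_concat.sh abcd" produces a+b+c+d
name="$1"
out="/srv/cache/$name"
if [ ! -f "$out" ]; then
    # split the request into letters and cat the matching part files in order
    echo "$name" | fold -w1 | while read part; do
        cat "/srv/parts/$part"
    done > "$out"
fi
echo "$out"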

View 3 Replies View Related

Ubuntu :: Copying File To A Remote Ftp Server With Bash

Jan 10, 2011

I'm hoping to set up a cron job that takes a file and copies it to a remote password-protected FTP server. I've got a command that formats the file with the correct name, and I've put it in the anacron file in /etc/cron.d (which I think is right, though I haven't tested it yet). I'm not sure how to copy the file to a remote server, though. I do actually have the FTP server bookmarked in my Places menu. So is there a simple way of supplying a file path that will put it straight into that folder? The only problem I can see with this is that the connection won't be open continuously, so it would need to be re-opened when needed (I could presumably save the password in the keyring so that I don't need to be there to type it in).

Or maybe set up a cron job that connects to and mounts the ftp server a minute before it has to copy the file over?
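
Bookmarks in the Places menu are GVFS mounts that won't exist for a cron job, so it is simpler to have the job talk to the FTP server directly and let the connection open and close each run. A minimal sketch using curl; the host, path, and credentials are placeholders, and keeping them in a file readable only by the cron user is better than putting them in the crontab:

Code:
# Upload the prepared file straight into the target folder on the FTP server
curl -T /path/to/report.csv ftp://ftp.example.com/uploads/ --user myuser:mypassword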

View 9 Replies View Related

Server :: NFS Large File Copies Fail - Error Writing To File: Input/output Error?

Jun 27, 2009

I recently upgraded my file/media server to Fedora 11. After doing so, I can no longer copy large files to the server. The files begin to transfer, but typically after about 1gb of the file has transferred, the transfer stalls and ultimately fails with the message:

"Error writing to file: Input/output error"

I've run out of ideas as to what could cause this problem. I have tried the following:

1. Different NFS versions: NFS3 and NFS4
2. Tried copying the files to different physical drives on the server.
3. Tried copying the files from different physical drives on the client.
4. Tried different rsize and wsize block sizes when mounting the NFS share
5. Tried copying the files via a different protocol. SSH in this case. The file transfers are always successful when I use SSH.

Regardless of what I do, the result is the same. The file transfers always fail after approximately 1gb.

Some other notes.

1. Both the client and the server are running Fedora 11 kernel 2.6.29.5-191.fc11.x86_64

I am out of ideas. Has anyone else experienced something similar?

View 13 Replies View Related

Server :: Large File Transfer Over Scp Or Samba Crashes Wireless Connection?

Feb 3, 2010

I'm setting up an HTPC system (Zotac IONITX-F based), built on a minimal install of Ubuntu 9.10 with no GUI other than XBMC. It's connected to my router (D-Link DIR-615) over a wifi connection configured for a static IP (ath9k driver), with the following /etc/network/interfaces:

Code:

auto lo
iface lo inet loopback
# The primary network interface
#auto eth0

[code]....

The network is fine, and the Samba share to the media directory works, until I try to upload a large file to it from my desktop system. Then a couple of percent transfers at a really nice speed, but it stalls and the box becomes unpingable (Destination Host Unreachable), even after canceling the transfer, requiring a restart of the network.

Same thing when I scp the file from my desktop system to the htpc, same thing when I ssh into the htpc, and scp the file from there. Occasionally (rarely) the file does pass through, but most of the time the problem repeats itself. Transfer of small text files causes no problems, and the same goes for the fanart downloads done by xbmc. I tried the solution proposed in this thread, and set mtu to 800 in the interfaces file, but the problem persists.

View 1 Replies View Related

Ubuntu Networking :: NFS - Slow Write Speed And Copying Files In Nautilus

Jul 6, 2010

I just tried NFS for the first time after reading that it's considerably faster than SSHFS, which I currently use, but I'm experiencing slow write speeds and problems while copying files in nautilus.

[Code]...

View 1 Replies View Related

General :: Rsync Speed For Single Files And Directory?

Mar 23, 2010

Recently I have been checking rsync speeds for a single file (a 2.4GB ISO) versus a directory (900MB with many files inside). When I run rsync for the single file, the speed averages 50MBps; for the directory, the average speed is 10MBps. Is there any reason behind this? I tried to Google it but couldn't find an explanation.
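
The gap is mostly per-file overhead: for every file rsync exchanges metadata, checks size and modification time, and round-trips with the far end, so a tree of small files spends more time on bookkeeping than on data. For a one-off copy of many small files, batching them into a single stream is often much faster; a hedged comparison with placeholder paths:

Code:
# Plain rsync of the directory: per-file bookkeeping for every small file
rsync -av smalldir/ remote:/data/smalldir/

# The same data as a single tar stream over SSH: behaves like one big file
tar cf - smalldir | ssh remote "tar xf - -C /data"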

View 2 Replies View Related

General :: Upload A Single File To FTP Server From Ubuntu?

Aug 15, 2011

I need to upload a single file to an FTP server from Ubuntu. This operation should be done in a script (non-interactive mode). What is the right syntax for ftp?

I'm trying this, to no avail:

$ ftp -u ftp://user:secret@ftp.example.com my-local-file.txt
ftp: Invalid URL `ftp://'

View 2 Replies View Related

Ubuntu Servers :: Raid1 Reads At Single Disk Speed?

May 3, 2011

I'm on 10.10 (desktop) and mdadm was v2.8.1 from 2008, very out of date, so I tried 3.2.1 with no change. mdadm RAID1 read speeds are the same as a single disk. Note I used the tests in the Disk Utility benchmarking tool at first; these showed RAID5 at least to be much better, but when I tried dd reads, RAID5 dropped off with larger data to almost the same (slow) speed as RAID1. Compare:

[code]....

Using two partitions is enough to show that RAID1 performs at single-disk speed. I don't really want to use a 4-disk RAID0 just to get the read speed I should be able to get with RAID1, as I don't care about the size loss. I would of course use RAID10, but I have found it suffers from the same problem (it achieves the same read speed as a 2-disk stripe). So while I'm surprised others aren't reporting this, unless there is some obscure reason why my system would give these results, I think RAID1 is not behaving as it should.

View 6 Replies View Related

CentOS 5 Hardware :: Limit Sata Speed For A Single Device

Dec 18, 2010

I have an external SATA dock for hard drives that gives me a lot of errors until Linux decides to lower its link speed to 1.5Gbps; then it starts working well.
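
On recent kernels the link speed can be pinned at boot instead of waiting for libata to downshift after errors, using the libata.force parameter. Note this option appeared in mainline well after the 2.6.18 kernel that CentOS 5 is based on, so it may simply not exist there; this is a hedged suggestion, and the port number is an example (check dmesg for which ataN the dock appears as):

Code:
# Appended to the kernel line in the boot loader configuration:
#   limit only ATA port 1 to 1.5Gbps
libata.force=1:1.5Gbps

#   or limit every port
libata.force=1.5Gbps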

View 2 Replies View Related

OpenSUSE :: Dolphin Losing Files When Copying Many Files Or Large Folders?

Feb 14, 2010

I've discovered that Dolphin seems to lose random files when copying many large folders.

I first noticed this a few months ago when I tried to copy my music library from one folder to another on the same HDD. It consisted of around 600 folders and 6500 files. During the copy there were no errors but after the copy I found that some of the newly copied folders were missing files. I put it down to human error or a glitch.

Yesterday I tried to copy 13 folders containing rips of some of my DVDs. Each folder basically had one film of either 700MB or 1.4GB. Again no errors showed up during the copy but I found 3 of the newly copied folders were empty.

It's not so critical with music or films but I can't afford to lose work data like this.

Has anyone experienced or seen a similar problem with Dolphin? I'm going to have to do some more extensive testing but this is not good.

The first time I noticed the problem I was running KDE4.3.4 (I think) and now the latest was with KDE4.4.0.
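
Until the cause is found, it is worth verifying every large copy with something that compares the trees rather than trusting the progress dialog. A checksum-based dry run with rsync lists anything missing or different without modifying either side; the paths are placeholders:

Code:
# -c compares checksums, -n makes it a dry run: nothing is changed,
# any file listed is missing or different in the destination
rsync -rcnv /source/music/ /backup/music/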

View 9 Replies View Related

Ubuntu Multimedia :: Large *.mp4 Files In Gtkpod - Hangs At "Copying Tracks" At 0%

Jul 27, 2010

I am dual booting XP and Ubuntu 10.04, but in the future I will be getting a new machine and I will only be running Ubuntu and won't have access to iTunes. Because I have an iPod Touch, I have been trying to find workarounds for syncing everything that iTunes took care of in the past. One problem I have is managing movies. I have looked through various media players/iPod management tools (Amarok, Rhythmbox, gtkpod) and I am using Rhythmbox to sync my music and and attempting to use gtkpod to sync my movies.

gtkpod is able to sync songs (tested with a few-minute test clip) and short *.mp4 files (15MB, which I know for sure from testing). I am unable, however, to get it to sync a movie (~700MB). I am able to drag it onto my iPod in gtkpod, but when I try to save the changes and write the files, it hangs at "Copying Tracks" at 0%. It eventually crashes during the couple of times I have tried to wait it out. So this being my situation, my question is: is there a size limit to the *.mp4 files I can sync to my iPod Touch via gtkpod? Are there any other tools that you know of that I can use to sync videos to my iPod?

View 9 Replies View Related

Server :: Slow Speed While Connecting To Samba File Shares From Vista

Mar 6, 2010

I have set up a file server (Ubuntu Server Edition) for our lab. People can connect to common Samba file shares from their personal laptops/desktops, which run either Windows Vista or Mac OSX. The guys with OSX have upload/download speeds of ~2 MB/s, while the Vista machines are slogging away at ~200kb/s for downloads and ~400kb/s for uploads. In both cases, the connection are through wired ethernet ports which should function identically. Since the Macs work fine on the same network, I assume this is a Vista issue.

I have tried troubleshooting one of the Vista machines by:
1. Turning off the Remote Differential Compression feature
2. Disabling autotuning following these instructions
3. Adding a registry key following the same link above.

But nothing has improved. Does anybody have any advice on additional tweaks to the Vista machine? Is there a chance that this is actually a server-side/Samba issue?
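
It can also be attacked from the server side: Samba's defaults for socket options and raw reads/writes are conservative on some releases, and these are the settings most often suggested for slow Vista clients. A hedged example for smb.conf (values are illustrative; change one thing at a time and retest):

Code:
[global]
    socket options = TCP_NODELAY SO_RCVBUF=65536 SO_SNDBUF=65536
    read raw = yes
    write raw = yes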

View 3 Replies View Related






