General :: Transferring Large Files Using Scp With CPU And Memory Considerations?

Oct 5, 2010

I want to transfer an arbitrarily large file (say >20GB) between 2 servers. I have several considerations:

Must use port 22 (ssh) because of firewall restrictions
Cannot tax the CPU (production server)
Memory efficiency
Would prefer a checksum check but that could be done manually
Time is not of the essence

There are two scenarios: (1) Server A and Server B are on the same private network (sharing a switch), so data security is not a concern; (2) Server A and Server B are not on the same network and the transfer will go over the public internet, so data security is a concern. My first thought was using nice on an scp command with a non-CPU-intensive cipher (blowfish?). But I thought I'd refer to the SU community for recommendations.
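
A minimal sketch of that first thought (cipher names vary by OpenSSH version; hosts and paths are hypothetical):

Code:
# Low local CPU priority, a cheap cipher, and a bandwidth cap (-l is in Kbit/s).
# blowfish-cbc and arcfour exist on older OpenSSH builds; newer ones may only ship AES/ChaCha.
nice -n 19 scp -c blowfish-cbc -l 8192 /data/bigfile.img user@serverB:/data/
# Verify end to end by comparing checksums on both sides:
sha1sum /data/bigfile.img
ssh user@serverB 'sha1sum /data/bigfile.img'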

View 1 Replies



Ubuntu :: Server Crashes When Transferring Large Files

Feb 6, 2011

Every time I attempt to transfer a large file (4 GB) via any protocol, my server restarts. On the rare occasion that it doesn't restart, it spits out a few error messages saying "local_softirq_pending 08" and then promptly freezes. Small files transfer fine.

Relevant information:

Ubuntu server 10.10
Four hard drives in RAID 5 configuration
CPU/HD temperatures are within normal range

View 7 Replies View Related

Ubuntu :: Data Loss When Transferring Large Number Of Files?

Jul 20, 2010

This problem is not exclusive to Ubuntu, I've experienced it in Windows and OSX as well, but it seems that almost every time I transfer a large number of files (i.e. my music collection) between my desktop computer and laptop via my external hard drive, I end up losing files for no reason. I usually don't notice the files are missing until later on, because I am never informed of any data loss. Now, every time I make a large transfer of files, I just do it two or three times to ensure that I don't lose any files.
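
One way to verify a copy instead of blindly re-running it is a checksum-based dry run with rsync; a sketch, with hypothetical paths:

Code:
# -r recurse, -n dry run, -c compare by full checksum rather than size/mtime.
# Anything rsync lists is missing or different on the destination.
rsync -rnc /home/me/music/ /media/external/music/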

View 2 Replies View Related

Ubuntu Networking :: When Transferring Large Files Using Scp Between Desktop Running Maverick And Other Servers?

Nov 12, 2010

Sometimes when transferring large files using scp between my desktop running Maverick and other servers running Ubuntu, Debian or CentOS, I get the following error message:

77% 258MB 11.3MB/s 00:06 ETA
Received disconnect from xxx.xxx.xxx.xxx: 2: Packet corrupt

I've found a seemingly related bug report on Launchpad here [URL], but the provided "ethtool" fix did not help. I'd be most grateful for any ideas on how to solve this issue. Some more info:

Linux lotus 2.6.35-22-generic-pae #35-Ubuntu SMP Sat Oct 16 22:16:51 UTC 2010 i686 GNU/Linux
lspci | grep eth -i
00:19.0 Ethernet controller: Intel Corporation 82567LM-3 Gigabit Network Connection (rev 02)

View 1 Replies View Related

General :: Transferring Photos From Mobile To Memory Stick And Whole Folder Disappeared?

Dec 27, 2010

I'm not familiar with Linux and I do not understand technical wording at all. I was given this ex-work Dell Latitude laptop as a favour, as a replacement when my Compaq running Windows Vista stopped working, and I do not understand the way Linux works much. I managed to upload photos from my Ericsson K850i to my memory stick before, but this time it kept telling me that the path was invalid, and other such things I did not understand. I created a folder for 2011 photos ready for next year, but the mouse pad is hard to control, and even though I did nothing the 2010 folder went into the 2011 one. Sorry if I am not being very clear.

All I want to know is where my whole year's worth of digital photos went from my memory stick, as I didn't delete anything. Would someone please help me understand all the different extensions, or tell me what I need to do to try to retrieve them? I would be so grateful. I was nearly crying when it happened, as when you right-click in Linux there is no Undo/Redo button.

I have searched the board and I did see some other questions but I don't think they will help in this instance so I have started one. I hope I can get my photos back and also hope to hear from someone soon.

View 14 Replies View Related

Ubuntu Servers :: Samba Drops Connection When Transferring Large File

Jul 6, 2011

I'm experiencing a connection problem when transferring a large file from Windows 7 (Home Premium) to my Ubuntu 11.04 machine. The transfer starts, but after a couple of seconds the connection drops and all the shares become unavailable. I'm also unable to connect to the server over ssh, and the only thing I can do to restore the connection is reboot the server. The strange part is that this was never a problem a couple of weeks ago, and I've not done anything to the setup on either machine besides installing security updates.

View 9 Replies View Related

General :: Transferring Files From Windows Machine?

Aug 17, 2010

I have a Linux box that I plan to use primarily as a server. I also have another machine that dual-boots Windows/Linux. I would like a way to back up the files on the Windows/Linux box onto the Linux server. In other words, I am assuming the hard drive on the Windows/Linux box could fail at any time, and I want a backup of the important files. Should I set up an FTP server to do this? Are there any security issues I need to be concerned about if the files contain sensitive information?
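
One option that sidesteps FTP's cleartext passwords is rsync over SSH; a sketch with hypothetical paths:

Code:
# Incremental, encrypted backup of the dual-boot box onto the Linux server.
# Only changed files are re-sent on subsequent runs.
rsync -av --delete /home/me/important/ backupuser@server:/backups/dualboot/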

View 14 Replies View Related

Slackware :: Is The Next 32B - 14 - Going To Have Large Memory Capability Enabled?

Jan 4, 2011

Just wondering, now that I see computers everywhere with 6 GB of RAM or more, whether the new 32-bit kernels are going to be set up for large amounts of RAM, or will they still need to be recompiled? Also, will there be better SSD support for things like TRIM? I know you edit fstab to get the Linux version of TRIM, but I am just wondering whether stuff like that will become automatic or whether I will continue to need to tweak. I like to tweak, but sometimes it's the lack of need for tweaking that makes me like Slack.

View 14 Replies View Related

Software :: Php - Transferring Files From Redhat 9 Web Files To Centos 5.5

Mar 28, 2011

I once tried transferring web files from Red Hat 9 into the /var/www/html/ directory on CentOS 5.5. LAMP is installed and working fine: httpd/Apache is running and mysqld is installed without a root password. Now I want to transfer it again, to secure the database, this time from CentOS 5 to Fedora 13. But when I browse the web to check the web files, I can only see these:

[Code]...

View 10 Replies View Related

General :: Downloading Very Large Files Via SFTP

Apr 1, 2011

I need to download some very large files (circa 75 GB) from a remote server via SFTP. I've been using SFTP via the command line on my Linux netbook. Around halfway through, the transfer stops and says "stalled." Can anybody recommend a reliable way to download these files?
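
If the server also allows shell access over SSH, a resumable rsync is one workaround sketch (the path is hypothetical); newer OpenSSH sftp clients can also resume an interrupted download with reget:

Code:
# --partial keeps the half-finished file so a rerun continues instead of restarting.
rsync -av --partial --progress user@remote:/path/huge-file.tar .
# In an sftp session on newer OpenSSH clients:
#   sftp> reget /path/huge-file.tar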

View 1 Replies View Related

General :: Creating Random Large Files?

Aug 27, 2010

How can I create a 1 GB file of random data in bash, to test disk/network I/O? I was told I could use the 'dd' command, but I don't know whether there are better ways, or what the 'dd' command looks like.
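
A few common sketches; /dev/urandom produces incompressible data but is CPU-bound, while /dev/zero is fast but compresses to nothing:

Code:
# 1 GiB of random data: 1024 blocks of 1 MiB each.
dd if=/dev/urandom of=testfile.bin bs=1M count=1024
# Much faster if randomness doesn't matter:
dd if=/dev/zero of=testfile.bin bs=1M count=1024
# Instant allocation without writing any data (no good for read-back tests):
fallocate -l 1G testfile.bin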

View 7 Replies View Related

General :: Tar Not Working With Large Number Of Files?

Dec 6, 2010

In the middle of my script I am using the tar command to archive some 1,000 images; each image is 5 MB.

All the images are passed as arguments: tar -cvf file.tar <all images as arguments>

But my tar file file.tar does not contain all of the images.
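
A likely cause is the shell's argument-length limit truncating or rejecting the list. One sketch that avoids it entirely is feeding tar the file names from a list file (the path and extension are assumptions):

Code:
# Build the list with find (no shell expansion involved), then let GNU tar read it.
find /path/to/images -name '*.jpg' > filelist.txt
tar -cvf file.tar -T filelist.txt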

View 4 Replies View Related

Ubuntu :: Can't Find Any Process That Consumes Such A Large Amount Of Memory

Feb 23, 2010

I used 9.04 for months and it worked fine until I restarted my PC. After the restart, memory consumption goes up to 4.2 GB right after login. However, I cannot find any process that consumes such a large amount of memory.

[code]....

View 3 Replies View Related

General :: Copying Large Number Of Files From One Directory To Another

Feb 10, 2010

I've a directory containing around 2.8 lakh (280,000) files. I want to move them to another directory. If I use cp or mv I get the error 'argument list too long'. If I write a script like

for file in *; do
    cp -- "$file" /path/to/destination/
done

then performance degrades because cp is forked once per file. How can I do this efficiently?
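
Two sketches that sidestep both the argument-list limit and the per-file fork cost (the destination path is hypothetical):

Code:
# GNU findutils/coreutils: batch the names through xargs, never through the shell.
find /source/dir -maxdepth 1 -type f -print0 | xargs -0 mv -t /destination/dir/
# Or let rsync do the whole move:
rsync -a --remove-source-files /source/dir/ /destination/dir/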

View 7 Replies View Related

General :: Copying Large Number Of Files In Windows?

Mar 15, 2011

I am facing a problem copying a very large number of files, about 18 lakh (1,800,000), from my personal hard disk to another hard disk. Each file is very small and the whole folder is around 3.95 GB. Copying with the copy dialog Windows provides is frustratingly slow, and I am not even able to compress the folder; it gives me an error that it is not readable. The other problem is that I am not able to open this drive in Linux: it shows an error telling me to run chkdsk in Windows, and Windows disk check is also not able to repair the drive; it goes into some unsolvable mode. The error reads: "Disk labled EDU is corrupt go to windows and chkdsk /f there and reboot into window 2 times." Is there any way to open the disk despite the error, and if not, any way I can copy the data faster?

View 3 Replies View Related

General :: Share A Large Number Of Files Into Chroot Env?

Aug 17, 2010

I understand that chroot is usually used to provide security; however, for my issue, security is a big don't-care. I am very new to using chroot and don't fully understand how the chroot'd env works.

Problem: I am trying to use a vendor-supplied cross-compile environment. The environment runs as a chroot'd env and works just fine. I have a large number of additional modules that I wish to compile in the chroot'd environment. FYI, these modules are also (successfully) compiled for other targets that don't use chroot'd envs. Copying the source files into the chroot environment is not an option (I don't have hours to wait for copies to finish, and it would break the make system). Having them live in the environment is also not an option (the chroot build is a tiny part of the build process and we cannot revamp our entire source tree to accommodate it).

I am looking for a way for the compiler in the chroot'd env to have access to a path that is outside of the env, typically higher up in the same path that holds the chroot'd env. I have tried soft links (they don't work as expected). Hard links only work for single files, and there are tens of thousands of files that would need to be linked. I am not sure how I would go about exporting the additional files and then mounting the export in the chroot'd env (or if that would even work).
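
A bind mount is the usual sketch for this: it makes an outside directory visible inside the chroot without copying or per-file links (all paths hypothetical):

Code:
# Expose the external source tree inside the chroot.
mkdir -p /opt/cross-env/mnt/src
mount --bind /home/build/src /opt/cross-env/mnt/src
chroot /opt/cross-env /bin/bash   # /mnt/src is now visible inside
# Detach it when the build is done:
umount /opt/cross-env/mnt/src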

View 2 Replies View Related

General :: Searching Text Files For Large Numbers?

Dec 31, 2010

I am looking for a way to search for large numbers in text files and print the nearby lines.

For example if I had a text file like:

Event: 11
blah: 3
blah: 41 bleh: 19
Event: 2
blah: 31

[Code].....
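
Given sample data like the above, two sketches (the digit count and the threshold are assumptions):

Code:
# Treat "large" as 3+ digits; -B1/-A1 print one line of context on either side.
grep -E -B1 -A1 '(^|[^0-9])[0-9]{3,}([^0-9]|$)' events.txt
# For a true numeric threshold, compare field values in awk:
awk '{for (i = 1; i <= NF; i++) if ($i + 0 > 100) {print NR ": " $0; next}}' events.txt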

View 1 Replies View Related

General :: Uploading And Downloading Large Files Between Clients?

Jun 5, 2011

I am looking for a file-sharing program to install on my dedicated server that will allow me to upload large MP3 files and allow my clients to download them. These files are recordings of counseling sessions for families who are seeking help for their children.

What I am looking for is similar to the system this company uses [URL].

View 4 Replies View Related

Ubuntu :: Understand A Large Amount Of Allocated Memory That Seems Not To Be Accounted For On System?

Mar 24, 2010

I am trying to understand a large amount of allocated memory that seems not to be accounted for on my system. I'll say up front that I am discussing memory usage without cache and buffers, because I know that misunderstanding comes up a lot. I am on a KDE 4.3 desktop (Kubuntu 9.10), using a number of Java apps like Eclipse that tend to eat up a lot of memory. After a few days, even if I quit most apps, 1 GB of RAM remains allocated (out of 2 GB). This appeared excessive, so I took the time to add up all values of the RES column in htop (for all users); the result was about 1/2 GB. Am I trying to match the wrong values? Or could some memory be allocated and not show up in the process list? This is the output of free:

Code:
             total       used       free     shared    buffers     cached
Mem:       2055456    1940264     115192          0     123864     702900

[code]...
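
For cross-checking, one sketch: sum the resident set sizes yourself and compare against free, bearing in mind that RSS double-counts shared pages while kernel-side allocations never appear in any process list:

Code:
# Total resident set size across every process, in kB.
ps -eo rss= | awk '{sum += $1} END {print sum " kB total RSS"}'
# Kernel allocations that htop never shows:
grep -E 'Slab|SUnreclaim|PageTables' /proc/meminfo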

View 6 Replies View Related

General :: Nas - Most Effective Backup Software -> When Dealing With Large Numbers Of Files?

Jul 18, 2010

I have two NASes. I work off of one, and the other is used as a backup. As I have it set up now, it's slow. Running a backup takes a week. Even for 7 TB, with 1,979,407 files, this seems a bit outlandish, particularly as both systems are RAID-5 and the network is all gigabit. I've been digging about in the rsync man pages, and I really don't understand what differentiates the various topologies. Right now, all the processing is being done on the backup NAS, which has the main volume from the main NAS mounted locally over SMB. I suspect that the SMB overhead is killing me, particularly when dealing with lots of files.

I think what I need is to set up rsync on the main nas as a daemon, and then run a local rsync client to connect to it, which would hopefully allow me to completely avoid the whole SMB-in-the-middle affair, but aside from mentioning that it's there, I can find very little information on why one would want to use the daemon mode for rsync.
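
For reference, a minimal daemon-mode sketch that takes SMB out of the loop entirely (the module name and exact paths are assumptions):

Code:
# /etc/rsyncd.conf on the main NAS:
#   [storage]
#   path = /raid/data/Storage
#   read only = yes
# Start it with: rsync --daemon
# Then pull from the backup NAS over the rsync protocol (TCP 873), no SMB mount:
rsync -a --progress --delete rsync://10.1.1.10/storage/ /mnt/Storage/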

Here's my current rsync command line: rsync -r --progress --delete /cifs/Thecus/ /mnt/Storage/. Is there a better way/tool to do this? Edit: OK, to address the additional questions: The "Main" NAS is a Thecus N7700. I have additional modules installed that give me SSH, and it has rsync, but it's not in the $PATH, and I haven't figured out how to edit the local $PATH in a way that persists between reboots. The "Backup" NAS is a DIY affair, built around a 1.6 GHz Via mobo with an Adaptec hardware RAID card. It's running CentOS 5 with a full desktop environment. It's the hardware I'm running rsync from. (Gigabit is through an additional PCI card.)

Further Edit: OK, got rsync over SSH working (thanks, lajuette!). I had to do a bit of tweaking on my command line; I'm running rsync with the args: rsync -rum --inplace --progress --delete --rsync-path=/opt/bin/rsync sys@10.1.1.10:/raid/data/Storage /mnt/Storage (Note: I'm specifically not using -a, because I want to change the ownership to the local account, so as not to freak out SELinux.)

View 5 Replies View Related

General :: Cp Adds Exclamation Points When Copying Very Large Text Files?

Jul 13, 2009

For my research I have some very large files that are basically millions of lines of ten columns of numbers. These files can be up to 5 GB in size. Recently I noticed that when I made a copy of one of my files, some exclamation points appeared in it where there should not be any: in front of random numbers throughout the file. Making another copy of the file would result in exclamation points in front of different numbers in different parts of the file. Doing this many times has given me up to four exclamation points in different parts of the file. Sometimes the file copies just fine without producing any extraneous exclamation points. Additionally, I have occasionally seen a "^K" where there should be a newline (the data that should have been on the next line was instead on the previous line with a ^K in front of it) in copies that I have made of my files. I don't know if this is related or not.

View 7 Replies View Related

OpenSUSE :: Transferring Files From One O/S To Another On The Second Partition?

Aug 22, 2011

I have two different Linux operating systems (on two separate partitions) on my laptop. One is openSUSE and the second is PCLinuxOS, and I frequently have to update files and folders from one to the other (the consequence of trying to keep both current!). When I am using openSUSE and want to access folders/files on the other OS, openSUSE asks for SU authorisation to access the other /home. Until recently I could open ALL of the folders under the other /home, but now I can open some folders and not others. The Dolphin screen shows a warning at the bottom: "Could not enter folder /media......". I can then only access the folder by going into the menu and using the 'File manager - Super User' mode AGAIN.

I suspect this is necessary because I previously had to open these specific folders in root mode to be able to 'paste' a moved file. I would like to avoid this (double) 'messy' action - how can I change/avoid this? My system is 32-bit openSUSE 11.4 with KDE 4.6.0, and the 2nd OS is also on KDE 4.6.

View 6 Replies View Related

Ubuntu :: Transferring Files From One OS To Another On The Same Machine?

Mar 20, 2010

I've resolved to move onto Ubuntu, but I haven't done so quite yet. One question that sprang to mind was the transferral of files from the Windows OS to the Linux OS on the same machine, if it's possible. I'm thinking something like the dual partition, though I suppose Copy/paste wouldn't work.

Another possible solution was using dual partition and then Windows SCP, but everything I've read on the topic indicates that SCP works by finding the (different) host address of the computer; thus it would read my machine as one, even if I'm running Ubuntu on it as well.

how to transfer these friggin' things.

Part of the reason for my query is that I am quite strapped financially and am unable to back up my files on an internet service or on discs.
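
For what it's worth, no network transfer (and no scp) is needed between two OSes on one machine: Ubuntu can mount the Windows partition directly and copy from it. A sketch, where the device name and folder path are assumptions (check the device with sudo fdisk -l):

Code:
sudo mkdir -p /mnt/windows
sudo mount -t ntfs-3g /dev/sda1 /mnt/windows   # read the NTFS partition in place
cp -r "/mnt/windows/Documents and Settings/me/My Documents" ~/from-windows/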

View 4 Replies View Related

Ubuntu :: Transferring Files From Windows?

Mar 13, 2011

I want to convert my wife's computer from MS XP 2003 to Ubuntu 10.10. I have saved "My Documents" to an external hard drive. How will I go about transferring these files to Ubuntu? Do I just pick individual files and go to "Save As"? Can I just transfer them en masse?

View 6 Replies View Related

Ubuntu :: Copying Hangs When Transferring Files?

Dec 10, 2010

When I try to transfer files between flash drives, the copy starts normally and then after a few seconds it begins to lag. The transfer rate drops by about 0.2 MB/s each second, down to 5 MB/s, where it completely hangs. This happens with every single flash drive I put in, which is annoying because I have an external hard drive connected over USB and I keep all my data there; when I try to transfer files from the hard drive to a flash drive, it lags and hangs. Same thing if I copy from the internal hard drive to a flash drive. Everything is fine, though, when I copy from a flash drive to the internal hard drive, and everything is fine when the flash drive or the external hard drive is connected on its own (no other flash drive plugged in).

I am also using Ubuntu 10.04. I tried transferring files on my netbook running Ubuntu, because I thought it might be a USB port issue, but the same thing happened on the netbook too...

View 1 Replies View Related

Ubuntu :: Transferring Files Between 2 Remote Servers?

Aug 13, 2011

I've tried FXP and rsync; neither worked, and I can't find much info on how to get them working, just a lot of Google results with people in the same situation as me!

I want to transfer files from 1 remote server to another remote server... a fairly easy task in which I'm sure there's an easy solution, I just can't find it.
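
Two sketches, assuming SSH access to at least one of the servers (hosts and paths hypothetical; scp -3 needs a reasonably recent OpenSSH):

Code:
# Relay the copy through your own machine:
scp -3 user@serverA:/data/file.tar user@serverB:/data/
# Or log in to server A and push straight to server B:
ssh user@serverA 'rsync -av /data/ user@serverB:/data/'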

View 4 Replies View Related

SUSE :: Transferring Files Using FTP In ASCII Mode?

Nov 5, 2010

I am trying to download an EBCDIC file from z/OS to UNIX using FTP in ASCII mode. The problem I am seeing is that when the UNIX FTP client issues a get, UNIX CPU utilization goes to 100% while transferring the file in ASCII mode. I have done packet traces using Wireshark and noted that the z/OS server is translating the file to ASCII before putting it on the wire; however, I believe the UNIX FTP client is then translating the file to ASCII again, causing the 100% CPU utilization. I need verification that this is what is happening, and is there a way around it other than transferring the file in binary mode and then doing an EBCDIC-to-ASCII conversion locally?
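
If it comes to the binary-then-convert route, the local conversion is cheap; a sketch (the z/OS code page IBM-1047 is an assumption, match your system's):

Code:
# After fetching the file with FTP in binary mode:
iconv -f IBM-1047 -t ASCII dataset.ebcdic > dataset.txt
# Coreutils dd has a simpler, fixed-table EBCDIC-to-ASCII converter:
dd conv=ascii if=dataset.ebcdic of=dataset.txt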

View 1 Replies View Related

Ubuntu Networking :: Home Network - Transferring Files

Mar 2, 2010

I am looking to set up a home network server on an old PC. I want to be able to access files on the network hard drive in the PC, stream video or music from the "server", and use the computer as a regular desktop connected to a TV as a monitor. I tried setting up an FTP server; that worked for transferring files, but I couldn't get a good enough connection for streaming or fast transfers.

It was less than 1 MB a second. I tried setting up a shared folder but ran into several permission errors between the host PC and the laptop that connected to the shared folder. Which way should I go with this to get what I want out of it? I know it can be done, just not sure how to do it.

View 9 Replies View Related

Ubuntu :: Bluetooth Not Transferring Files To Nokia N900?

Aug 11, 2010

The computer and phone are paired, and the transfer starts, but after 4096 bytes it says: Transfer finished with an error: Error sending object. I'm running 64-bit Lucid and Maemo 5 on the phone. Here are some of the things I've tried, following the docs, and the errors I get:

Code:
~$hcitool info 3D:F7:2A:60:7A:40
Requesting information ...

[code]....

View 3 Replies View Related

Ubuntu Networking :: Samba Crashes 11.04 When Transferring Files >1GB?

Aug 9, 2011

I've got a quad-core PC with 2 GB RAM running Ubuntu 11.04, with Samba set up on a shared folder - the plan is to share files and media between myself and my housemates. However, the server is not attached to the router by ethernet; I connect to the router (in another room) with a D-Link wireless dongle.

I can access the share without real issue from my Windows laptop and the other computers in the house share I'm part of, and I can happily transfer a text file here and there, but it seems anything over 20 MB will not go. All the files I am trying to transfer are over 1 GB, and it's driving me mad.

After some decent transfer rates of around 1 MB per second, it just freezes up. The screen attached to the actual server goes blank and the host becomes unreachable by ping. I'll be happy to provide any command results or anything to get this working; any assistance would be welcome!

View 2 Replies View Related