Ubuntu :: Everything Freezes When Copying Large Amount Of Data?

May 20, 2010

Well, when I copy a large amount of data, applications other than Nautilus freeze until the copy is done...

So, what can I do? This is really annoying when backing up data =/

Ubuntu :: Reserve Large Amount Of Space From New Partition

Mar 5, 2011

In gparted I have the following stats for my /home drive

size: 824 gb
used 75.51 gb
unused 748.59 gb

Now when I view this in Nautilus it shows something else: remaining free space of 709 GB. My question is, what happened to the 40 GB? The 75.51 GB are my files, but where did the 40 GB go? Because 709 (total remaining) + 75 (my files) + 40 (mysteriously lost GB) = 824 GB. When I first made the partition it was an 824 GB partition, and at that point Ubuntu had automatically reserved about 40 GB for something. Does anyone know why Ubuntu reserved this space?
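
By default ext3/ext4 reserves about 5% of the blocks for the root user, and 5% of 824 GB is roughly 41 GB, which would account for the "missing" space. A quick way to check (the device name is an assumption, substitute your /home partition):

Code:
sudo tune2fs -l /dev/sda5 | grep -i "reserved block count"
# on a pure data partition the reservation can safely be shrunk, e.g. to 1%:
sudo tune2fs -m 1 /dev/sda5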

Ubuntu :: Understand A Large Amount Of Allocated Memory That Seems Not To Be Accounted For On System?

Mar 24, 2010

I am trying to understand a large amount of allocated memory that seems not to be accounted for on my system. I'll say up front that I am discussing memory usage without cache and buffers, because I know that misunderstanding comes up a lot. I am on a KDE 4.3 desktop (Kubuntu 9.10), using a number of Java apps like Eclipse that tend to eat up a lot of memory. After a few days, even if I quit most apps, 1 GB of RAM remains allocated (out of 2 GB). This appeared excessive, and I took the time to add up all values of the RES column in htop (for all users); the result was about 1/2 GB. Am I trying to match the wrong values? Or could some memory be allocated and not show up in the process list? This is the output of free:

Code:
total used free shared buffers cached
Mem: 2055456 1940264 115192 0 123864 702900

[code]...
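
A way to cross-check the numbers (generic commands, not from the thread): sum the resident sizes of every process, then look at kernel-side allocations such as the slab caches, which never appear in any process's RES column. Note that summing RSS double-counts shared libraries, so it tends to overstate rather than understate per-process usage.

Code:
# total RSS of all processes, in kB (roughly what adding up htop's RES column gives)
ps -eo rss= | awk '{ sum += $1 } END { print sum " kB" }'
# kernel allocations that belong to no process
grep -E '^(Slab|SReclaimable|SUnreclaim)' /proc/meminfo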

Software :: Audacious Crashes When Adding A Large Amount Of Songs On Ubuntu

Dec 11, 2010

I use Audacious to play my music, but every time I try to add my music it crashes, even when I do it in batches. Is there somewhere I can find a log to see what the problem is?

Ubuntu :: Copying A Large File From The Network?

Feb 17, 2010

I am trying to copy a file of about 4.5 GB from a network resource on my Local Area Network. I copy this file through the GNOME copying utilities by first going to Places --> Network and then selecting the Windows share on another computer on my network. I open it and start copying the file to my FAT32 drive with Ctrl+C and Ctrl+V. It copies well up to 4 GB and then it hangs.

After trying it almost half a dozen times I got really annoyed and left it hung and went to bed. Next morning when I checked, a message box saying "file too large to write" had appeared.

I am very annoyed. I desperately need that file. It's an ISO image and it is not damaged at all; it copies well to any Windows system. Also, I have sufficient space on the drive onto which I am trying to copy the file.
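
The 4 GB mark is the giveaway: FAT32 cannot store a single file of 4 GiB or larger, no matter how much free space the drive has. If the drive must stay FAT32, one workaround is to split the image and reassemble it later (file names are just examples):

Code:
# split into 2 GB pieces that FAT32 can hold
split -b 2G image.iso image.iso.part_
# reassemble on the destination machine
cat image.iso.part_* > image.iso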

Ubuntu :: Root Filesystem Fills Up When Copying A Large File?

Mar 17, 2010

I was just copying a large (50GB) file from one mounted partition to another mounted partition (a USB drive), but before the operation completed, my root filesystem, on a separate partition, filled up. Because it filled up I also couldn't get past the login when I rebooted; I think this is because there is no room to load temporary files. I'm expanding the root partition to temporarily fix this. How can I avoid my root filesystem filling up when copying a massive file between mounted partitions? I suspect the file is being cached on the root partition during the transfer.
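
Before resizing anything, it is worth ruling out the classic cause of this symptom: the destination not actually being mounted, so the data lands in the mount-point directory on the root partition instead. A quick check (the mount point is an example):

Code:
df -h /media/usbdrive      # should show the USB drive's filesystem, not /
mount | grep usbdrive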

Ubuntu Servers :: System Crash When Copying Large File

Jun 15, 2010

I am having a bit of a problem with my Ubuntu Server 10.04 install. I think it might be a kernel problem. Basically, what happens is when I copy a large file (a 160GB disk image) to my drive (>60GB) the system consistently crashes after about 60GB of the file is transferred. It doesn't matter if I am sending the file using cifs, or over SSH. Checking syslog (paste dump here), it seems these flush errors always appear shortly before the crash occurs. The destination filesystem is a hardware RAID 10 array with 2TB of space. It is formatted as EXT4.

General :: Copying Large Number Of Files From One Directory To Another

Feb 10, 2010

I've a directory containing around 2.8 lakh (roughly 280,000) files. I want to move them to another directory. If I use cp or mv then I get the error 'argument list too long'. If I write a script like

for file in $(ls); do
    cp "$file" /path/to/destination/
done

then because of the ls command its performance degrades. How can I do this efficiently?
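
The "argument list too long" error comes from the shell expanding all ~280,000 names onto one command line; looping with one cp per file avoids that but forks a process per file, which is what makes it slow. A sketch of two common alternatives (paths are placeholders):

Code:
# let find batch the arguments itself, staying under the kernel's limit
find /path/to/source -maxdepth 1 -type f -exec mv -t /path/to/destination {} +
# or copy the whole directory in one pass
rsync -a /path/to/source/ /path/to/destination/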

General :: Copying Large Number Of Files In Windows?

Mar 15, 2011

I am facing a problem copying a large number of files - 18 lakh (about 1,800,000) of them - from my personal hard disk to another hard disk. Each file is very small and the size of the folder is around 3.95 GB. Copying the files with the copy operation provided by Windows is frustratingly slow, and I am not even able to compress them; it gives me an error that the data is not readable. The other problem is that I am not able to open this drive in Linux: it shows an error telling me to run chkdsk in Windows, and Windows disk check is also not able to repair the drive and ends up in some unsolvable state. The error is roughly: "Disk labeled EDU is corrupt, go to Windows and run chkdsk /f there and reboot into Windows 2 times." Is there any way to open a disk with errors, in Windows or otherwise, and if not, is there any way I can copy the data faster?
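
A hedged suggestion, not from the thread: if Linux refuses to mount the volume because Windows has marked it dirty, ntfsfix can clear the dirty flag (it is not a full repair), after which a read-only mount is often enough to copy the data off. The device name is an assumption:

Code:
sudo ntfsfix /dev/sdb1
sudo mkdir -p /mnt/edu
sudo mount -t ntfs-3g -o ro /dev/sdb1 /mnt/edu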

Server :: Speed Up Single Large File Copying?

Apr 22, 2010

I'm planning to copy a production MySQL InnoDB data file from one server to another, and the file size is around 300GB. As the file keeps changing all the time, I have to shut down the MySQL instance and copy the large data file to the other server as quickly as possible. I need to find a way to speed up the file copy... I'm wondering whether there's a way to copy the file block by block, and if a block on the destination side already has the same content, skip it.
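
rsync's delta-transfer algorithm does essentially this: it checksums blocks on both sides and only sends the ones that differ, and with --inplace it updates the destination file directly instead of rewriting it. A sketch (host and paths are placeholders):

Code:
rsync -av --inplace --partial /var/lib/mysql/ibdata1 user@newserver:/var/lib/mysql/

Note that the delta transfer only pays off on repeat runs against an existing copy; the first transfer still has to move all 300GB.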

Ubuntu :: Copying Large File Blocks Externally Causes Disk Full

Oct 15, 2010

This has happened several times now, with 9.10 and 10.04. I back up my photos periodically to external drives, using Nautilus. At the next attempted login GNOME won't start and sometimes reports a power-manager "incorrect installation" error.

First time this happened I was stumped and eventually did a clean install. Second time, I found advice elsewhere in this forum to solve this by emptying root trash, which did the trick. This time, however, root trash has nothing in it and 2 users trash were insignificant (I emptied them all anyway with rm -r). Tried looking for enormous directories but couldn't find a smoking gun. I would rather not end up doing another clean install - a painful and extreme solution. I'm continuing to look for solutions to the immediate problem, but my question really is, what causes this and how do I prevent it in the future? I've run Computer Janitor regularly and ran apt-get clean but no help. Should I do all my large scale copying from terminal? I'm not a total noob, but close.
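
Next time it happens, something like this narrows down where the space went without guessing (run from a console or recovery shell if GNOME will not start):

Code:
# largest directories on the root filesystem only (-x stays off other mounts)
sudo du -xh --max-depth=2 / | sort -h | tail -n 25
# trash locations worth checking, for root and for every user
ls -la /root/.local/share/Trash /home/*/.local/share/Trash 2>/dev/null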

General :: Measure The Amount Of Usb Data ?

Jan 21, 2010

I would like to measure the amount of traffic my webcam is sending. What is the best way to do this? I tried the iostat command, but I do not see the webcam traffic in its output.

Software :: See The Amount Of Data Downloaded?

Aug 3, 2010

I am on a limited broadband plan with 4GB download limit per month. Is there any software which can tell me how much data I have downloaded in a particular period of time?
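
One commonly used option (my suggestion, not something named in the thread) is vnstat, which keeps per-interface totals by day and month:

Code:
sudo apt-get install vnstat
sudo vnstat -u -i eth0   # create the database for your interface (older vnstat versions)
vnstat -m                # monthly totals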

Ubuntu Networking :: Limit Amount Of Data To Be Received?

Jun 30, 2010

I was wondering if there is a Windows or Ubuntu way to limit the amount of data that can be sent over the internet between certain times, e.g. between 7am and 7pm only 300 MB may be downloaded from the web, and when this limit is reached the connection is either cut off or slowed down.
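
A very rough sketch of the Linux side using the iptables quota match (interface and byte count are placeholders; a cron job would add the rules at 7am and flush them at 7pm). Throttling instead of cutting off completely would need traffic shaping with tc, which is considerably more involved.

Code:
# accept roughly 300 MB of inbound traffic on the WAN interface, then drop the rest
sudo iptables -A INPUT -i eth0 -m quota --quota 314572800 -j ACCEPT
sudo iptables -A INPUT -i eth0 -j DROP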

Fedora Hardware :: Low Transfer Rate When Copying Large Files Over Wireless

Jan 11, 2010

I just bought an HP 3085dx laptop with an Intel 5100 AGN wireless card.
The problem: copying a big file over wireless to a computer that is hardwired to the router over gigabit Ethernet only gives an average transfer rate of 3.5MB/second. If I do the same copy from my wireless-n MacBook Pro to the same computer, I get a transfer rate of about 11MB/sec. Why the big difference? I noticed the HP always connects to the 2.4 GHz band instead of the 5 GHz band...

On the HP:
[jerry@bigbox ~]$ ifconfig wlan0
wlan0 Link encap:Ethernet HWaddr 00:24:D6:36:AC:C4
inet addr:192.168.1.75 Bcast:192.168.1.255 Mask:255.255.255.0
inet6 addr: fe80::224:d6ff:fe36:acc4/64 Scope:Link
UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1
RX packets:639243 errors:0 dropped:0 overruns:0 frame:0
TX packets:1293049 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:1000
RX bytes:53832795 (51.3 MiB) TX bytes:1888619922 (1.7 GiB)

[jerry@bigbox ~]$ iwconfig wlan0
wlan0 IEEE 802.11abgn ESSID:"<censored>"
Mode: Managed Frequency:2.412 GHz
Access Point: 00:24:36:A7:27:A3
Bit Rate=0 kb/s Tx-Power=15 dBm
Retry long limit: 7 RTS thr: off Fragment thr:off
Power Management: off
Link Quality=70/70 Signal level=-8 dBm Noise level=-87 dBm
Rx invalid nwid:0 Rx invalid crypt:0 Rx invalid frag:0
Tx excessive retries:0 Invalid misc:0 Missed beacon:0

I am not getting any errors. I don't know why the bit rate is not known. My AirPort Extreme base station typically reports the 'rate' for the HP as 250~300 Mbit/s, and about the same for the MacBook Pro. The HP is about 6 inches away from the base station. Is there any way to get the rascal to go faster? Is there any way to get the rascal to use the 5 GHz band?
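
One experiment worth trying (a guess, not a confirmed fix): list the channels the driver reports and ask for a 5 GHz channel explicitly, e.g. channel 36 at 5.18 GHz, assuming the base station broadcasts the same SSID on both bands:

Code:
iwlist wlan0 frequency          # shows which channels/bands the card offers
sudo iwconfig wlan0 freq 5.18G  # request a 5 GHz channel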

General :: Cp Adds Exclamation Points When Copying Very Large Text Files?

Jul 13, 2009

For my research I have some very large files that are basically millions of lines of ten columns of numbers. These files can be up to 5 GB in size. Recently I noticed that when I made a copy of one of my files, some exclamation points appeared in it where there should not be any: in front of random numbers throughout the file. Making another copy of the file would result in exclamation points in front of different numbers in different parts of the file. Doing this many times has given me up to four exclamation points in different parts of the file. Sometimes the file copies just fine without producing any extraneous exclamation points. Additionally, I have occasionally seen a "^K" where there should be a newline (the data that should have been on the next line was instead on the previous line with a ^K in front of it) in copies that I have made of my files. I don't know if this is related or not.
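
Worth noting: '!' (0x21) is a space (0x20) with one bit flipped, and '^K' (0x0B) is a newline (0x0A) with one bit flipped, so this pattern looks like single-bit corruption somewhere in the RAM/controller/cable path rather than a cp bug. A quick way to pin down exactly what changed between copies (file names are examples):

Code:
# list every byte that differs between the original and a copy
cmp -l original.dat copy.dat | head
# checksum after each copy to catch silent corruption
md5sum original.dat copy.dat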

Programming :: Calculate The Amount Of Data Had Been Download?

May 3, 2011

I'm trying to write a program with C socket programming. What I am trying to achieve is a program that will calculate the amount of data a computer has downloaded from the internet, just to know how much the user has downloaded.
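
Whatever the program ends up looking like, the kernel already keeps per-interface byte counters, so there is no need to capture every packet; the program can simply read and diff these counters periodically (interface name is a placeholder):

Code:
# cumulative bytes received on eth0 since boot
cat /sys/class/net/eth0/statistics/rx_bytes
# the same counters for all interfaces
cat /proc/net/dev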

Hardware :: Disks Speed Varies When They Have Same Amount Of Data?

Oct 30, 2010

My machine has 4 SATA II Western Digital 1TB disks. I made 2 partitions on each of them, 500GB per partition. When I started using them I checked their I/O with iozone. The first partition gave 100MB/s for reads and 70MB/s for writes, and the second partition gave 80MB/s for reads and 55MB/s for writes. All 4 disks had the same result.

As I kept using them, the I/O speed on each partition decreased, to differing extents. For example, for the four first partitions, the write speed now varies from 69MB/s to 56MB/s. And I have the same amount of data on each of them, all 11% used.

My guess is that this is down to the disk block allocation policy: some disks start writing at an inner location while others write at the outer edge, even though the amount of data on each disk is the same.
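
The outer-versus-inner-track theory is easy to test directly with raw sequential reads near the start and near the end of one of the disks (device name is an assumption; this only reads, it does not write):

Code:
sudo dd if=/dev/sda of=/dev/null bs=1M count=512 skip=0        # start of the disk (outer edge)
sudo dd if=/dev/sda of=/dev/null bs=1M count=512 skip=900000   # roughly 880 GB in, near the end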

Software :: Amount Of Data Passed Through A USB Device (from USB Stick)?

Aug 17, 2010

I have a probably somewhat unusual problem: when a USB stick is connected to the PC and data is copied from/to that stick, I need to know how much data has been copied. The data itself is not interesting, just how many bytes. I unfortunately don't have access to the program that does the copying, and most of the data doesn't end up on any drive (it just gets read and discarded), so I can't simply check the size of a target directory or something like that. I have had a look at usbmon, but that seems to produce way too much data - the normal case would be around 10 GB of data being read, and I can't have that blown up by a factor of 10 and lying around on the hard drive.
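
Since a USB stick shows up as an ordinary block device, the kernel's per-device I/O counters may already be enough, and they cost nothing to collect. A sketch, assuming the stick appears as /dev/sdb:

Code:
# fields 3 and 7 are sectors read and sectors written (multiply by 512 for bytes);
# note the values before and after the copy and take the difference
cat /sys/block/sdb/stat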

Software :: Command To Display Amount Of Data Read And Written?

Apr 3, 2011

I know there's a command to display the live amount of data being read from and written to the disk - something that tells you how many blocks have been read/written so far for a device.
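
Two commands that show exactly this, if I am reading the question right:

Code:
vmstat -d     # cumulative sectors read/written per block device
iostat -dk 1  # kB read/written per device, updated every second (sysstat package)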

General :: Calculate Minimum Ext3 Partition Size For Certain Amount Of Data?

Aug 5, 2011

The following ext3 partitions contain identical data. As we can see, the larger the partition, the more space is required for the same files:

Filesystem 1K-blocks Used Available Use% Mounted on
/dev/loop11 3965777 561064 3199964 15% [...]
/dev/loop19 573029 543843 29186 95% [...]

[code]....
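
Much of the fixed overhead comes from the journal, the inode tables and the reserved blocks, all of which mke2fs scales with the filesystem size by default. dumpe2fs shows the actual figures, and the overhead can be trimmed at creation time; a sketch with placeholder devices and sizes:

Code:
# inspect where the space goes on an existing filesystem
sudo dumpe2fs -h /dev/loop11 | grep -iE 'inode count|journal|reserved'
# create a filesystem with no reserved blocks, fewer inodes and a small journal
sudo mkfs.ext3 -m 0 -N 20000 -J size=4 /dev/loopX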

Ubuntu Networking :: Gui Bandwidth Monitor - Set Up And Just Shows The Amount Of Data Downloaded And Uploaded Per Month

Feb 16, 2010

My ISP is introducing a bandwidth cap in my area and I need to monitor my downloads and uploads per month. Is there anything with a GUI that is easy to set up and just shows the amount of data downloaded and uploaded per month? If possible it should also pop up a warning when a set maximum is reached.

Server :: RHEL Rel 5.4 Freezes With Large Jobs?

Feb 25, 2010

We're running
$ cat /etc/redhat-release
Red Hat Enterprise Linux Server release 5.4 (Tikanga)
$ uname -r
2.6.18-164.11.1.el5

It hosts an Apache/2.2.3 web server. We also run apache-tomcat-5.5.23. Most of our programs are mod_perl. Sometimes our users input over-sized data sets, or queries that generate too much output. (I realize that we should try to prevent them from doing that, but right now I'm looking for a more general solution.)

When a large job runs it can 'freeze' our system. The system becomes unresponsive to everything, including command line commands. Sometimes it unfreezes after a while. Once, in this situation I was able to create a high-priority shell. ps reported:

[Code]...

Observing the machine, I see at least one very busy disk. I suspect that some high-priority system process (perhaps kswapd) is using all the CPUs, preventing anything else from running. Unfortunately, I cannot find much info on kswapd, or on debugging this problem.
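
A mitigation sketch rather than a diagnosis (values are placeholders): if the box is thrashing in swap when an oversized job runs, making the kernel less eager to swap and capping per-process memory at least keeps the machine reachable.

Code:
# make the kernel less inclined to push working memory out to swap
sudo sysctl -w vm.swappiness=10
# cap each httpd child's address space to ~1.5 GB (add to the init script before httpd starts)
ulimit -v 1572864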

Ubuntu :: File Copying Freezes - Flash Memory Card

Jul 3, 2010

Using the 10.04 Netbook version. I am finding on my Asus EEE 901 that sometimes file copying just seems to freeze - it usually happens when copying from the built-in SSD to the plug-in SDHC memory card. I have tried reformatting the card and using a different card. It is not just this computer, since I found the same thing on my last Asus, which was the 900 model.

I am told that there are issues with Nautilus. Is there anything which can be done to improve this or is there anything else which I can install besides Nautilus? I am assuming that there is some issue related to Ubuntu's handling of SDHC memory cards.

It is becoming annoying because it works sometimes and then not. When it happens the only option seems to be to turn the netbook off and on again. Even if the file copy is cancelled, the card seems to be inaccessible until the machine is rebooted.

Also after a certain point it seems that when I try and copy new files to the card, they appear to copy ok but obviously are corrupt in some way - when you try to play videos for instance they are faulty.
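
Two things that may help narrow it down (suggestions, not a known fix): copy from a terminal so errors are visible, and force a flush before removing the card, since files that look fine after a copy but turn out corrupt often mean the write cache never made it to the card.

Code:
rsync -rv --progress /home/user/videos/ /media/sdcard/videos/
sync                 # block until everything is flushed to the card
dmesg | tail -n 30   # check for I/O errors right after a freeze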

Ubuntu Multimedia :: Large *.mp4 Files In Gtkpod - Hangs At "Copying Tracks" At 0%

Jul 27, 2010

I am dual booting XP and Ubuntu 10.04, but in the future I will be getting a new machine and I will only be running Ubuntu and won't have access to iTunes. Because I have an iPod Touch, I have been trying to find workarounds for syncing everything that iTunes took care of in the past. One problem I have is managing movies. I have looked through various media players/iPod management tools (Amarok, Rhythmbox, gtkpod) and I am using Rhythmbox to sync my music and and attempting to use gtkpod to sync my movies.

gtkpod is able to sync songs (tested with a few-minute clip) and short *.mp4 files (15 MB, which I know for sure from testing). I am unable, however, to get it to sync a movie (~700 MB). I am able to drag it onto my iPod in gtkpod, but when I try to save the changes and write the files, it hangs at "Copying Tracks" at 0%. It eventually crashes during the couple of times I have tried to wait it out. So, given this situation, my question is: is there a size limit on the *.mp4 files I can sync to my iPod Touch via gtkpod? Are there any other tools I could use to sync videos to my iPod?

Fedora :: Random Crashes & Freezes Copying From NTFS?

Dec 22, 2009

I am in the process of trying to move my files from a windows 2003 install over to fedora 12 using the Fedora 12 x86_64 Live image.

After some initial problems with burning the ISO image (I had to enable disc-at-once to get rid of I/O buffer errors), and a GPF in the xor_sse2 module while building my software RAID 5 arrays (forced to reboot, unable to stop or restart the array), I finally managed to create a single LVM volume group on top of it all.

Initially I ran some tests on the RAID volume and wrote a large 100GB file filled with zeroes all over it to be sure that it was functioning normally. After that I proceeded to copy files over from the NTFS volume to the LVM volume, which resulted in lots of errors on my terminal and the complete loss of the entire filesystem. Even a simple ls -l / resulted in "Bus error", although the system kept running for a few seconds until it finally froze and spontaneously rebooted. Since then I have run memtest to be sure my memory is fine, and tried alternative ways of mounting the volumes and different copy methods. Each attempt resulted in either a complete freeze or an instant reboot. After 6 reboots, I have managed to copy 20GB out of 490GB. Obviously this will take forever this way.

/var/log/messages is unhelpful. There are a bunch of cryptic messages regarding ATA bus errors (mostly CRC errors, it seems) and soft resets when the LVM volume gets mounted, but everything functions fine until I touch the NTFS volume.

Here are some of them:

Code:
Dec 22 02:32:15 localhost kernel: ata6.00: exception Emask 0x0 SAct 0x0 SErr 0x0 action 0x6
Dec 22 02:32:15 localhost kernel: ata6.00: BMDMA stat 0x4
Dec 22 02:32:15 localhost kernel: ata6.00: cmd ca/00:80:02:a5:fb/00:00:00:00:00/e7 tag 0 dma 65536 out

[Code].....
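
For what it is worth, ATA CRC errors usually point at the cable or controller link rather than the platters, and the drive keeps its own counter for them; checking it before and after a copy attempt shows whether the link is actually degrading (device name is an assumption):

Code:
sudo smartctl -a /dev/sda | grep -i crc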

OpenSUSE Network :: Large File Download Will Start And Then Freezes

Mar 26, 2010

I've started having problems with large file downloads. A download will start and after a while freeze. The downloads window reports the correct connection speed and gives an estimated time to complete, but it stays frozen. Small downloads, torrents and surfing are not affected. I can do everything else normally even when the download is frozen. I've checked with my ISP and everything with my equipment checks out.

OpenSUSE :: Dolphin Losing Files When Copying Many Files Or Large Folders?

Feb 14, 2010

I've discovered that Dolphin seems to lose random files when copying many large folders.

I first noticed this a few months ago when I tried to copy my music library from one folder to another on the same HDD. It consisted of around 600 folders and 6500 files. During the copy there were no errors but after the copy I found that some of the newly copied folders were missing files. I put it down to human error or a glitch.

Yesterday I tried to copy 13 folders containing rips of some of my DVDs. Each folder basically had one film of either 700MB or 1.4GB. Again no errors showed up during the copy but I found 3 of the newly copied folders were empty.

It's not so critical with music or films but I can't afford to lose work data like this.

Has anyone experienced or seen a similar problem with Dolphin? I'm going to have to do some more extensive testing but this is not good.

The first time I noticed the problem I was running KDE 4.3.4 (I think), and the latest occurrence was with KDE 4.4.0.
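
Until the cause is found, a dry-run comparison after each big copy at least makes missing files obvious immediately (paths are examples):

Code:
# report anything missing or different without copying
rsync -rvn --checksum /source/music/ /destination/music/
# or a plain recursive comparison
diff -rq /source/music /destination/music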

Ubuntu / Apple :: Data Size Too Large For Disk?

Apr 28, 2011

I am trying to burn a Mac OS X 10.5 install disc from a 6.7GB DMG disk image. I thought I would be using two 4.7GB DVD-R discs for this burn, hoping that when the first was full it would ask for another to finish the burn. Instead I get a message that the DVD will not hold the chosen DMG file.

Can I do anything besides buy a dual layer DVD that would hold the whole file?

Ubuntu :: How To Copy And Paste Large Chunks Of Data

Jun 16, 2011

I want to copy about 40GB to a partition. There are two hard drives in my box; one won't boot, but I can access it and mount its partitions, and I aim to move data from it to a new bootable hard drive. A simple cp command may not be the best way to copy and paste such a large chunk? Also, I want to back up the data I plan to copy using a USB hard drive. I could then paste the data from the backup to the new drive instead of from the old internal drive to the new one - that's another option.
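
For a chunk this size, rsync is a reasonable alternative to cp: it shows progress, preserves permissions, and can be re-run to pick up where it left off if anything is interrupted. A sketch with placeholder mount points:

Code:
rsync -avh --progress /mnt/old_home/ /mnt/new_drive/home_backup/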
