General :: Ubuntu Slows To Crawl When Un-archiving Big File

Aug 4, 2010

I started unarchiving a RAR file that is several gigabytes big. The computer is now running really slowly; it's almost frozen. Sometimes I can move the mouse a little, but that's it. The unarchiving process seems to have halted, so all I can do now is restart the system. I don't think I can unarchive this file in Linux.
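If the extraction is starving the rest of the system of CPU and disk time, running it from a terminal at idle priority may keep the desktop usable while it works. A sketch, assuming the command-line unrar is installed (big.rar is a placeholder name):

Code:

# run the extraction at the lowest CPU and I/O priority
# (ionice -c3 is the "idle" scheduling class)
nice -n 19 ionice -c3 unrar x big.rar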

View 5 Replies



Software :: System Randomly Locks Up And Slows To A Crawl?

Jun 5, 2011

I noticed that sometimes my Linux server will randomly start to lag really badly, to the point where even an HTTP request takes forever. It is an Intel Core2Quad with 8GB of RAM running FC9. This is my "everything" server, so it does file, email, DNS, web (for local stuff only), VMs and so on. There are about 5-6 VMs running on it at any given time. I manage it through a VNC session and have some SSH consoles within that session; this way, if I reboot my PC, I don't lose all my SSH consoles. If I need to SSH to any server I do it from there. I treat it somewhat like a terminal server.

When this slowdown happens, top is not really useful, because I also run F@H, so that will always be at the top, but it's low priority. The VMs are also always near the top. This does not change whether it's slow or not, so when it's slow I have nothing to go by for troubleshooting. The load does seem to skyrocket, though. Right now it's doing the slowdown thing and the load is at 8.09. Normally it's around 3, which in my opinion is good since it's under 4: I have 4 cores, so anything more than 4 means processes are queuing. At least that's how I understand it. This is the output of top:

Code:

[root@borg ~]#
[root@borg ~]#

[code]....
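When the load average skyrockets but no process shows high CPU in top, the time is often being spent in I/O wait, which top only reveals in the small "wa" field. A diagnostic sketch (iostat requires the sysstat package):

Code:

# one line of CPU/memory/disk counters per second; a high "wa" column
# while "us" and "sy" stay low points at disk contention (VM images,
# F@H checkpoints, mail spools) rather than CPU starvation
vmstat 1

# per-device utilisation, to see which disk is saturated
iostat -x 1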

View 9 Replies View Related

Fedora :: System's Desktop Slows To A Crawl When Kwin's Compositing Is Enabled

Jan 10, 2011

I was just curious whether any ATI users out there running KDE on Fedora 14 have experienced performance issues. My system's desktop slows to a crawl when Kwin's compositing is enabled, yet in KDE 4.4 there were no issues at all. The desktop I built is a rather modest machine, just for development, nothing too high-end, but KDE and its effects surely should run. I also use Xfce, so I switched to that and gave up on KDE 4.5.

View 14 Replies View Related

General :: Get Mail And File Archiving Software?

Jun 23, 2010

Which open source "file and email archiving" packages for both Linux and Windows are equivalent to Symantec's Enterprise Vault?

View 2 Replies View Related

General :: Remote File Copy Through Archiving Software?

Oct 6, 2010

I need to back up a folder on a remote machine to my local box; the remote HD does not have enough space to archive it, and neither does my local box. I know there's a cantrip to pipe scp through gzip (or similar), but I don't remember the syntax.
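The usual trick is to have tar on the remote side write to stdout and pipe the compressed stream home over ssh, so nothing is staged on either disk. A sketch, with placeholder host and path names:

Code:

# stream the remote folder as a gzipped tarball straight into a local file;
# user@remote and /path/to/folder are placeholders
ssh user@remote 'tar czf - /path/to/folder' > folder-backup.tar.gz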

View 1 Replies View Related

Ubuntu :: Archiving Many (>100000) Files Into A Single File Via CLI

Mar 4, 2011

I'm trying to zip or rar more than 100,000 files into a single file so that I can upload it to my server much faster than FTP can transfer the files individually. In total they're only 4GB, but because of the number of files, Nautilus freezes just opening the folder they're in. They're all .jpgs, all in the same folder, and I've tried a few commands but I keep getting error messages.

Does anyone have a command that will archive all the files from a folder into a single zip (or rar, tar, etc.)? I can't just archive the folder, because then I would have to move all the files out of it first, and just opening the folder to move them would crash it; and I don't have SSH access to that server.
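tar's -C flag can archive a directory's contents without opening it in a file manager and without passing 100,000 names on the command line (which would overflow the argument-list limit if a shell glob were used). A sketch, with a placeholder path:

Code:

# archive everything inside the folder into one file;
# /path/to/jpegs is a placeholder
tar czf photos.tar.gz -C /path/to/jpegs .

Since JPEGs are already compressed, dropping the z (a plain tar) gives nearly the same size and runs much faster.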

View 3 Replies View Related

Networking :: Large File Transfers Start Fast Then Drop To A Crawl?

Jul 19, 2010

I need to transfer 330G of data from a hard drive in my workstation to my NAS device. The entire network is gigabit, run on new HP ProCurve switches. All machines have static IP addresses. The NAS is a Buffalo Terastation Pro with the latest firmware, set to jumbo frames, and just upgraded with 4 brand new 500G drives, giving us a 1.4TB RAID 5 setup. My workstation is a dual quad-core Xeon box on an Intel S5000XVN board with 8G of RAM. My OS is Ubuntu 10.04 x64, running on a pair of Intel X25 SSDs in a RAID mirror. The data drive is a 500G SATA drive on the onboard controller, formatted with XFS. This problem was ongoing before I got my new workstation, before we had the gigabit switches, and before the NAS got new drives. When I transfer a small file or folder (less than 500M), it reaches speeds of 10-11 MB/sec. When I transfer a file or folder larger than that, the speed slows to a crawl (less than 2MB/sec). It has always been this way with this NAS. Switching to jumbo frames speeds up the small transfers but makes little difference on the big ones. I verified with HP that the switches are jumbo-frame capable.
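One way to narrow this down is to measure the wire separately from the NAS: if a raw network test holds near gigabit speed for 30 seconds, the drop-off is on the NAS side (typically its write cache absorbing small transfers, then sustained RAID 5 write speed taking over). A sketch, assuming iperf can be run on two machines on the same switch (192.168.1.10 is a placeholder):

Code:

# on one machine:
iperf -s
# on the workstation:
iperf -c 192.168.1.10 -t 30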

View 2 Replies View Related

Software :: Mysql Full Backup Script Not Archiving The Dump File?

Mar 20, 2011

I have a problem with a script I wrote: it runs fine when executed manually, but it doesn't run *fully* when executed via cron.

Here's the script:

Code:

#!/bin/bash
FILENAME=mysql_full_dump_`date '+%m.%d.%y'`.sql
`which mysqldump` --all-databases -uroot -p************ -h127.0.0.1 > /root/$FILENAME
RETVAL=$?

[code]....

The script resides in /root/bin, and the cron entry is as follows:

Code:

0 0 * * * root "/root/bin/mysql_daily.sh"

The result is the .sql file, but the script never archives it.
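A frequent culprit is that cron runs with a minimal PATH, so `which mysqldump` (and whatever archiver the truncated part of the script calls) can fail silently. Using absolute paths and logging stderr usually settles it; a sketch of the relevant changes, with assumed binary locations and the archive step guessed at, since the original was cut off:

Code:

#!/bin/bash
# cron's PATH is usually just /usr/bin:/bin, so spell paths out
FILENAME=mysql_full_dump_$(date '+%m.%d.%y').sql
/usr/bin/mysqldump --all-databases -uroot -p************ -h127.0.0.1 > "/root/$FILENAME"
# archive step (assumed; the original script was truncated)
/bin/gzip "/root/$FILENAME"

Redirecting the cron job's output also makes any failure visible, e.g. 0 0 * * * root /root/bin/mysql_daily.sh >> /var/log/mysql_daily.log 2>&1.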

View 2 Replies View Related

General :: Using Wget To Recursively Crawl A Site And Download Images?

Mar 29, 2011

How do you instruct wget to recursively crawl a website and only download certain types of images? I tried using this to crawl a site and download only JPEG images:

wget --no-parent --wait=10 --limit-rate=100K --recursive --accept=jpg,jpeg --no-directories http://somedomain/images/page1.html

However, even though page1.html contains hundreds of links to subpages, which themselves have direct links to images, wget reports things like "Removing subpage13.html since it should be rejected", and never downloads any images, since none are directly linked from the starting page. I'm assuming this is because my --accept is being used both to direct the crawl and to filter what gets downloaded, whereas I want it used only to decide what to download. How can I make wget crawl all links, but only download files with certain extensions like *.jpeg?

EDIT: Also, some pages are dynamic and are generated via a CGI script (e.g. img.cgi?fo9s0f989wefw90e). Even if I add cgi to my accept list (e.g. --accept=jpg,jpeg,html,cgi), these still always get rejected. Is there a way around this?
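A common workaround is to accept html during the crawl, so wget keeps fetching and parsing the subpages, then delete the leftover pages afterwards. A sketch (the CGI-generated pages would need their own cleanup pattern):

Code:

# keep html so recursion can follow it, then sweep the html away
wget --recursive --level=inf --no-parent --wait=10 --limit-rate=100K \
     --accept=jpg,jpeg,html http://somedomain/images/page1.html
find . -name '*.html' -delete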

View 3 Replies View Related

General :: Archiving Bash - Script In VIM ?

Dec 14, 2010

I'm relatively new to Linux in general, but have learned to do the basics with the CLI.

My main problem is writing my first "real" script in Vim. I just have no idea where to start, and I was hoping you guys could point me in the right direction.

This is what the script needs to do:

"As the IT administrator for a large manufacturing company you have been tasked with producing a script for archiving the daily log files. Each week the daily log files are placed in a directory named weekXX, where XX is the number of the week. Each week directory should be archived and compressed and stored in a folder called log_directories. When the script has completed the task it should display on the monitor exactly what files and directories it has archived.

The script should be started by a user with the required week numbers added as arguments (e.g. prog 13 14 15 should start the program and archive the daily log files in week13, week14 and week15).

A basic manual is required, showing how to start the program, the hardware and / or software requirements, a hard copy of the script and a brief description of the test strategy and test data."
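A minimal sketch of such a script, assuming the weekXX directories sit in the current directory and log_directories is created beside them:

Code:

#!/bin/bash
# usage: ./archive_weeks.sh 13 14 15
mkdir -p log_directories
for num in "$@"; do
    dir="week${num}"
    if [ -d "$dir" ]; then
        # the v flag lists every file as it is archived, which satisfies
        # the "display what was archived" requirement
        tar czvf "log_directories/${dir}.tar.gz" "$dir"
    else
        echo "skipping ${dir}: no such directory" >&2
    fi
done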

View 14 Replies View Related

General :: Automate Archiving Of Log Files Using Tar?

Mar 29, 2011

I need to tar these logs, but I don't know how to make the job simpler for myself. The same five logs are created every day, and at the end of the month I need to make five tar files for each day from these files.

For example

Until now I have tarred them manually (copying every file).
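Without the exact file names only a sketch is possible; assuming the five logs carry a date suffix like name.log.2011-03-29 and one archive per day is wanted, a loop over the days does it (the path and naming scheme are hypothetical):

Code:

#!/bin/bash
# hypothetical layout: the five daily logs end in .log.YYYY-MM-DD
for day in $(seq -w 1 31); do
    files=( /var/log/myapp/*.log.2011-03-"${day}" )
    # skip days with no matching logs (an unmatched glob stays literal)
    [ -e "${files[0]}" ] && tar czf "logs-2011-03-${day}.tar.gz" "${files[@]}"
done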

View 2 Replies View Related

General :: Difference Between Archiving And Backup Of Data?

Feb 24, 2010

I am a little confused about distinguishing between backing up, compressing, and archiving data. Can someone help me figure out how each of these is useful?

View 2 Replies View Related

General :: Control Gzip Compression Rates While Archiving Files With Tar?

Jun 4, 2011

I have a script which periodically backs up a directory using the command "tar -czvf [name] [directory]", but the script has recently been putting a lot of stress on the server (a Minecraft SMP server) and tends to lag players while it backs up, which recently has been taking nearly 5 minutes. So I need to know: is there a way to control the gzip compression level while tar archives and backs up the files? I understand that I could tar the files first and then gzip them separately with a different compression level afterwards, but that would not work, because the script names the files with the current server time, which sometimes changes between commands.
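tar itself has no compression-level switch, but piping it through gzip keeps everything in a single command, so the timestamped name is computed only once. A sketch, with the directory name as a placeholder:

Code:

# compute the name once, compress at the fastest/lightest level (-1),
# and run the whole pipeline at low CPU priority
NAME="backup_$(date '+%Y-%m-%d_%H-%M-%S').tar.gz"
nice -n 19 tar cf - world/ | gzip -1 > "$NAME"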

View 1 Replies View Related

General :: Difference In Size Of The Tarred Files After Archiving Using Tar Command?

May 17, 2011

I am using Red Hat Linux 2.4. I have 3 folders, dir1, dir2 and dir3, which I have tarred like this:

1. tar cvfz tarball_1.tgz dir1 dir2 dir3

2. tar cvfz tarball_2.tgz dir1 dir2 dir3 2> /dev/null (so that it does not display any error messages or operation details to the user)

[usr@machine]$ ls -lrt
-rw-r--r-- 1 usr grp 199843988 May 17 13:39 tarball_1.tgz
-rw-r--r-- 1 usr grp 199837488 May 17 13:53 tarball_2.tgz

Can anyone explain the size difference seen in the ls output?
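The redirection only changes what is printed, never what tar writes, so the difference almost certainly comes from files changing between the two runs (13:39 vs 13:53). A quick check:

Code:

# compare the raw tar streams; if they are identical, the few kilobytes
# live in the gzip layer, otherwise the contents changed between runs
zcat tarball_1.tgz > t1.tar
zcat tarball_2.tgz > t2.tar
cmp t1.tar t2.tar && echo "tar contents identical"

# or diff the file lists directly
tar tzf tarball_1.tgz | sort > list1
tar tzf tarball_2.tgz | sort > list2
diff list1 list2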

View 4 Replies View Related

General :: Archiving Log Files - Script That Accepts Two Parameters InputDir And OutputDir

Feb 28, 2011

I need a script that accepts two parameters inputDir and outputDir.

This script should copy all the log files in the inputDir to a folder like <BackupLogs-currentDaysDate>

The new folder with the log files should be tarred and gzipped <BackupLogs-currentDaysDate>.tgz

And this new <BackupLogs-currentDaysDate>.tgz file should be copied to the outputDir.

Also all the log files in the inputDir should be deleted.
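A minimal sketch, assuming the logs are identified by a .log suffix and both directories already exist:

Code:

#!/bin/bash
# usage: ./backup_logs.sh <inputDir> <outputDir>
inputDir="$1"
outputDir="$2"
work="BackupLogs-$(date '+%Y-%m-%d')"

mkdir -p "$work"
cp "$inputDir"/*.log "$work"/        # collect the logs
tar czf "${work}.tgz" "$work"        # tar and gzip the folder
cp "${work}.tgz" "$outputDir"/       # deliver the archive
rm -f "$inputDir"/*.log              # delete the originals
rm -rf "$work" "${work}.tgz"         # tidy the workspace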

View 16 Replies View Related

Software :: File Archiving Software - For Backups ?

Jul 29, 2010

I'm looking for ... or will develop if I cannot find any ... software that can keep archives of a tree of files. The tree of files is on a computer dedicated to backups. Various servers and desktop users periodically run rsync to update their slice of that tree of files. But that means old files are replaced. What I want to do is keep the old files around, with some limitations (such as a finite number of versions, finite age, etc). This would be similar to a repository like Subversion, except that data only comes in (there are no checkouts, though obviously a restore mechanism is needed). It would be like rolling backups, except the "rollover" would be on a per-file basis.

What I currently do, but want to get away from, is making a hardlink replica tree every few days with a command like "cp -al synctree/. archtree-for-today". But that eats up inodes very rapidly and is much harder to find things in. Making a file listing is very much slower this way, too.
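rsync can already do per-file rollover: --backup together with --backup-dir moves every file it would overwrite or delete into a dated side directory instead of discarding it, with no hardlink replica trees involved. A sketch, with placeholder paths:

Code:

# files replaced or deleted by this sync are salted away under a
# per-day directory (all paths are placeholders)
rsync -a --delete --backup \
      --backup-dir="/backups/archtree/$(date +%F)" \
      /incoming/slice/ /backups/synctree/slice/

rdiff-backup implements much the same model out of the box (per-file increments, plus --remove-older-than for the finite-age limit).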

View 3 Replies View Related

General :: Command Line - Unix - Program That Can Handle All Popular Compression/archiving Formats - E.g.tar, Gzip, Bzip2, Zip?

Jun 22, 2011

I sometimes get confused by the varying command-line options I need to run common Unix archiving and compression software (e.g. gzip, bzip2, zip, tar).

Is there a program out there that can just Do What I Mean for common cases? For example:
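One existing tool along these lines is atool, a set of wrappers that infer the format from the file name:

Code:

apack archive.tar.gz somedir/    # create, backend chosen by extension
aunpack archive.zip              # extract
als archive.tar.bz2              # list contents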

View 2 Replies View Related

General :: Kubuntu Slows During DVD Playback?

Mar 7, 2011

I have an old laptop that used to run Vista. It got so bogged down with excess software, etc., that it was barely running: there was no sound and it was unbelievably slow.

I know Linux is supposed to be more stable (almost uncrashable!) and generally faster, so I've done a complete reinstall with Kubuntu (I tried a few different distributions with live discs and this was my favourite). All I need it to do is play DVDs; I have a netbook for everything else, but its screen is too small for watching movies.

The reinstall worked perfectly and I love Kubuntu. The sound now works fine and it's MUCH faster.

Except ...

Halfway through watching a DVD, the system seems to suddenly slow down. The DVD playback becomes choppy and unwatchable, and even when I close the player the whole OS is painfully slow. Programs take ages to open. Even selecting shut down or restart takes over a minute.

And here's the weird part: a restart doesn't fix it. If I restart the system and try to watch the DVD again, it's instantly choppy, and the whole OS is slow. I actually have to leave the system off for a while (I usually give up for the night!) before it calms down.

The laptop is not noticeably hot, and leaving the machine on for a long while doesn't cause problems; it seems to happen only when watching DVDs. The spec is not especially low (1GB of RAM and a reasonable processor), and it doesn't matter whether I use VLC, Dragon or Movie Player.

View 8 Replies View Related

Ubuntu :: Internet Slow - Web Surfing At Crawl Speed

Jun 16, 2010

I have been having some problems: my surfing speed has gone to zero. I can download at full speed, but web surfing happens at a crawl or just times out. I am pretty sure it's because Firefox is clogged up. Is there a way to clear the clog without losing my settings? I fixed this before by making a new user, but I don't want to lose all my saved forms and bookmarks.

View 8 Replies View Related

Ubuntu :: 10.4 Slowed Down To Crawl - Actions Taking Forever

Oct 5, 2010

I am a newbie to Linux and just installed Ubuntu. For the first week all was fantastic, but now everything has slowed down. It seems to run fine for a few minutes and then everything crawls: I try to drag a window, open a menu, or type, and it takes forever! The internet also seems to stop working on a regular basis. I have checked the process list to see if anything unusual was happening; one of the processors was sitting at 100%, but no actual process in the list seemed to be using any processing power at all!

View 8 Replies View Related

Ubuntu :: Noticeable Sluggishness In Miro - Slowed To Crawl

Sep 1, 2011

Has anyone else who uses Miro noticed it being very sluggish lately? It's been a while since I've used it, but I opened it up today and it's performing at a crawl, not to mention sucking up a goodly amount of CPU cycles (process "miro.real"). I've tried the repo version 3.5.x as well as the PPA version 4.0.3; both are just as slow and choppy as can be on 11.04. Any thoughts, or does anyone just want to share my misery? It loves company...

View 1 Replies View Related

Networking :: Does Connection To Box Slow To A Crawl?

May 2, 2011

I have a Linux server with seemingly random network slowdowns. The server is mainly my DVR. I'm starting to think it's a hardware problem, but that's just a gut feeling; I don't really know how to determine whether it is.

Slowdown summary:
SSH, HTTP and VNC incoming traffic are all affected.
Outgoing traffic seems OK (I haven't tested this as much).
Rebooting mostly helps.
Stopping/starting the network doesn't help.
Load average is below 1.0.
Updating the kernel made no change.

[Code]...

View 6 Replies View Related

Ubuntu :: Evolution 2.28 - Archiving And Backup

Jan 31, 2011

I want to create retrievable archives of my old emails, say monthly, to avoid the old emails running me out of storage. If I use the 'Backup Settings' procedure as explained in Evolution Help, what happens when I later wish to consult the /home/dbus/evolution-backup.tar.gz archive file?

1- Will it simply over-write my current Evolution data? [in which case its not what I need]
2- If not, how do I return the archive file to dead storage and resuscitate my current data?
3- If it will overwrite using the 'Restore Evolution' procedure given in Evolution Help is there a workaround ... perhaps by ...
3a- renaming the archive file,
3b- or 'restoring' it in another version of Evolution,
3c- or archiving CURRENT data as a 2nd backup with a different name [eg: /home/dbus/evolution-backupJan11.tar.gz?] then restoring that?
4- Will I be able to retrieve successive archives if I rename them, say '/home/dbus/evolution-backup.tarDec10.gz' etc once Evolution's saved them?

Alternatively, the following came from a dead thread [from commonlyUNIQU3]... is it still valid? Does it avoid the problem of potentially running out of storage?
A. Make a subfolder to the "Inbox" under the "On This Computer" tree (I call mine "Archive")
B. Drag and drop the emails you want to archive into this folder.*

This will move the selected emails off your Exchange server/account and into this folder (and into local storage), unless you do a copy & paste instead of drag & drop. You may need the setting for downloading emails for offline access enabled for this to work as desired. If I recall correctly, the new integrated backup feature creates a compressed (.zip) archive from which you can later restore the email (I haven't tried that just yet).

View 5 Replies View Related

Ubuntu Networking :: WUSB54G V4 Wireless Adapter - Continues To Slow Down To Crawl

Sep 6, 2010

I have a number of computers working just fine on my wireless network, so I don't believe it is a router issue. I am running 10.04 on a Dell GX280. I plug in the WUSB54G, and Ubuntu sees it and connects to the internet just as it should. I can surf the internet through different browsers and, although a little sluggish, it works just fine. The problem is when I download a file: it starts out working as it should but then slows down to a crawl. Update Manager had a 54MB update; it started downloading, but then said it would take 2 hours, so I just cancelled. I have several WUSB54Gs and they all have the same issue. I changed to a different type of adapter, and the update took only a couple of minutes.

View 3 Replies View Related

Software :: PDF Editing And Archiving

Jul 22, 2010

I'm looking for an application that will give me some advanced tools for editing PDFs.

Here are the features I'm looking for:
Editing metadata (tagging with keywords)
Merge multiple PDFs
Rearrange page order
PDF bookmarking
Optical Character Recognition

I can give further clarification on these items if needed. My goal is to convert all of my paper files into digital files that I can store on my server. In order to effectively do this, I need the tools listed above.

Is there anything in the Linux world that will give me these PDF editing abilities?

View 4 Replies View Related

Slackware :: High HD Activity Bringing System To Crawl When Browser Is Open

Apr 24, 2010

I'm running Slackware 13 with a custom kernel based on 2.6.32.3. I tend to leave my system on 24/7, along with my web browser; originally that was Firefox, and now it is Google's Chrome. Usually after about a day of leaving the browser open, my HD activity spikes so high that I can barely do anything on the system until I kill the browser. This has been happening with both Firefox AND Chrome! As soon as the browser processes are killed, the system returns to normal.

View 10 Replies View Related

Ubuntu :: Performance Slows Down After 2-3 Days?

Jul 7, 2010

I hope somebody has a solution to this annoying situation: right after a restart my system is very fast (at least for me), but after a few days it slows down to the point where using it becomes a waste of time. As soon as I restart, everything is OK again. Usually I keep open 1-2 OpenOffice files, a couple of large PDFs, and some Firefox tabs at the same time, nothing special. No gaming/video on this rig, just basic internet and text processing. Here are my specs:

9.10 Karmic
Kernel Linux 2.6.31-22-generic
GNOME 2.28.1
Memory: 433.1 MiB
Processor: AMD Athlon 64 3200+

Am I wrong in thinking that I have enough juice to keep these windows open? Maybe there is a way to tweak something?

View 4 Replies View Related

Ubuntu :: 10.10 Randomly Slows Down After Login

Jan 13, 2011

Using Ubuntu 10.10 64-bit; over the last few days my desktop has not been stable. Every time, my system becomes very slow after login (after 5 minutes, after 2 minutes, it's very random), and sometimes it takes a while even to get to the login screen. I checked System Monitor when this happens, and I don't see high CPU or memory usage. The only thing I see is that when I shut down my machine, it gives a few errors about VirtualBox.

View 1 Replies View Related

Ubuntu :: Computer Locks Up Slows Way Down?

Jul 9, 2011

About a week after installing 11.04, I started noticing that my computer basically locks up after leaving it for many hours. When I come back, I can't get any programs to respond. I click on Firefox and it takes about one minute to finally respond. If I try to launch Konsole, it locks up too. But suddenly all programs start responding at the same time, after about one minute.

This is very frustrating, and I have a feeling the problem is Zeitgeist. Of course I am not sure, because I can't even run System Monitor to check: I can't run any apps for about a minute, and by the time System Monitor comes up, everything is working again.

View 6 Replies View Related

Applications :: Ubuntu In Virtual Box Slows Way Down

May 15, 2010

I'm running Ubuntu in VirtualBox on a Windows 7 host. It works great, except that after I start up the machine, the response to keystrokes and the updating of the screen slow way down. It gets slower and slower until it is unbearable after about an hour. If I save the machine state and reload it, things work great again, so it sounds like a VirtualBox problem, not an Ubuntu problem. Do other people who run Ubuntu in VirtualBox see this behavior? Do other people who run VirtualBox on Windows 7 see this?

View 8 Replies View Related






