General :: Huge CPU Consumption And Crash Afterwards?
Mar 4, 2011
For a couple of months now, the machine has been hanging. When it happens, I can't connect to it in any way. It sits in a datacenter far from my office, so I have to ask the provider to reboot it manually, and since it always happens during working hours, my clients get angry at those moments. It happens roughly every 15 days. It is very difficult to get the process status, but today it hung again and I managed to connect (over a very, very slow connection) and save a "ps aux".
[Code]...
There are some scripts you can see as "php /var/www/html/call_engine.php ..." that are consuming more CPU than I expect, and I will fix those, but I would like to know about the other processes, such as the many crond instances and "/usr/sbin/sendmail -FCronDaemon -i -odi -oem -oi -t". Or could the whole problem be the php script?
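Since the hang makes live inspection nearly impossible, one hedged approach is a small watchdog that logs the top CPU consumers periodically, so there is evidence to read after the next forced reboot (the log path and interval here are arbitrary choices):
Code:
#!/bin/bash
# Hypothetical watchdog: record a timestamp and the 15 hungriest
# processes every 60 seconds so a post-mortem is possible after a hang.
LOG=/var/log/cpu-watch.log
while true; do
    date >> "$LOG"
    ps aux --sort=-%cpu | head -n 15 >> "$LOG"
    sleep 60
done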
View 2 Replies
Jun 19, 2011
I'm new to netbooks and Linux. I have an Asus 1215B (AMD Fusion E-350) and I want to install an OS only for writing offline and browsing the web (including Google Docs and my email account), while making my battery last as long as possible.
View 14 Replies
View Related
Jul 18, 2011
I'm using Ubuntu 11.04, Firefox 4, and the latest version of Flash. My other machine specs are:
Code:
processor: 1
vendor_id: AuthenticAMD
cpu family: 16
model: 8
model name: Six-Core AMD Opteron(tm) Processor 2439 SE
stepping: 0
cpu MHz: 2800.112
cache size: 512 KB
fpu: yes
fpu_exception: yes
cpuid level: 5
wp: yes .....
Since this is a virtual machine, it is causing concern in our environment because other virtual machines are impacted. How can I coax this plugin to use less CPU?
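One hedged option is to cap the plugin's CPU share with cpulimit. The process name is an assumption: depending on the setup, Flash may run inside Firefox's plugin-container or under nspluginwrapper, so check with ps first:
Code:
# Find the process actually hosting Flash, then throttle it to ~30% CPU.
ps aux | grep -i -e flash -e plugin-container
cpulimit -e plugin-container -l 30 &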
View 2 Replies
View Related
Oct 7, 2010
I want to create the smallest server I possibly can. I have a small server I use as a print server and for file sharing; the hard drive is about 80 GB. Since I only run CUPS and Samba, I see no reason to take up an additional few GB with a major distro just to run the OS. I know it is possible to get a small distro around 50 MB or so, and I would even be happy with an OS of around 200 MB. What would be the best way to go about doing this? LFS is just too complicated and time-consuming for the end result. Would something like Gentoo be better? Is there anything else I may not know about?
View 11 Replies
View Related
Dec 26, 2010
Please suggest or advise on the best practice: bz2 or tar.gz? I have a directory, /var/opt/axigen, which is 33 GB in size and which we need to back up daily as per the schedule. I want to know the pros and cons of the commands below with respect to the best compression and decompression:
Code:
tar cvzf /var/opt/bkup_axigen/axigen_bkup_1.tar.gz /var/opt/axigen
or
Code:
tar cvjf /var/opt/bkup_axigen/axigen_bkup_1.tar.bz2 /var/opt/axigen
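A hedged way to decide is to measure both on a representative subdirectory before committing to 33 GB runs (the sample path is hypothetical), comparing wall-clock time against the resulting size:
Code:
time tar czf /tmp/sample.tar.gz /var/opt/axigen/some-subdir
time tar cjf /tmp/sample.tar.bz2 /var/opt/axigen/some-subdir
ls -lh /tmp/sample.tar.gz /tmp/sample.tar.bz2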
View 2 Replies
View Related
Jan 2, 2011
EDIT, 3 January 2011: SOLVED: In fact, it was the fonts that were being rendered larger and pushing everything out. Altering the DPI in xorg.conf solved the problem; see details at the bottom.
ORIGINAL POST: I used jockey-kde to activate the nVidia (closed-source) driver in order to fix some full-screen problems I was having. It has fixed that, but now everything is really big. I have my resolution set to 1680x1050, but it appears to be lower than it was before; the K menu, for instance, takes up about 1/6 of the screen when I open it. I took a screenshot here: http://i.imgur.com/94lyN.jpg
I know this isn't much information, but can anyone tell me why, while the desktop appears to be larger with a higher resolution, applications are actually appearing as though the resolution is lower?
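A minimal sketch of the kind of xorg.conf change involved, assuming a target of 96 DPI (both option names are real nVidia driver options, but the values and section layout here are illustrative):
Code:
Section "Device"
    Identifier "Videocard0"
    Driver     "nvidia"
    # Stop the driver from deriving an inflated DPI from the monitor's EDID.
    Option     "UseEdidDpi" "false"
    Option     "DPI" "96 x 96"
EndSection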
View 7 Replies
View Related
Oct 31, 2010
I have an Apple PowerBook 3400c with upgraded RAM. I would like to boot Linux on it, but the RAM is still not huge (96 MB). I burned PowerPup to a CD, but it would not boot and kept booting Mac OS 9.
View 2 Replies
View Related
Jul 8, 2011
I want to extract a huge .tar.gz file, but when I do, the extraction stalls the server. The server is write-heavy, and extracting seems to choke the disk. Is there a nice way to extract without stopping the world? I've tried the 'nice' and 'cpulimit' commands, but they don't seem to do the trick.
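Since the bottleneck is the disk rather than the CPU, nice and cpulimit can't help much. A hedged alternative is to demote the extraction's I/O priority with ionice (the idle class assumes the CFQ I/O scheduler):
Code:
# Run the extraction in the idle I/O class so it only uses disk
# bandwidth nothing else wants; nice still covers the CPU side.
ionice -c 3 nice -n 19 tar xzf huge.tar.gz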
View 2 Replies
View Related
Jul 13, 2011
I am running the command on a Mac, but since it is a generic Unix command and a command-line query, I thought I could ask on this forum. I am running the command:
Code:
df -h | grep '/dev/'
I get
Code:
/dev/disk0s2 389Gi 62Gi 327Gi 16% /
/dev/disk0s3 76Gi 24Gi 52Gi 32% /Volumes/Backup
/dev/disk3s2 500Gi 47Gi 453Gi 10% /Volumes/Misc
Note the huge space between the 1st and 2nd columns.
This is because I currently have some NAS drives mounted that are not shown due to the grep; when they are not mounted, the output is fine, with equal spacing between each column (like between columns 2 and 3, or 3 and 4). I want to use (dare I say) sed or awk or something to reduce the space between the 1st and 2nd columns so that it matches the spacing between the other columns. I am displaying this output somewhere, and because of the space it is not showing up correctly. I also hope the command will still work when the NAS drives (AFP) are not mounted; basically, consistency. The spaces were not showing properly in the quote tag, so I changed it to a CODE tag.
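A hedged one-liner: assigning to a field makes awk rebuild the record, collapsing runs of spaces to single spaces, and column -t re-aligns everything evenly regardless of which drives happen to be mounted:
Code:
# Collapse the whitespace to single spaces...
df -h | grep '/dev/' | awk '{$1=$1; print}'
# ...or re-align all columns into an even table:
df -h | grep '/dev/' | column -t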
View 12 Replies
View Related
Apr 21, 2011
I noticed Xorg consuming over 30% of my CPU cycles. Does anyone know what is going on here?
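Xorg's CPU time is almost always driven by one misbehaving client. A hedged way to find it is xrestop (usually a separate package), which lists per-client X resource consumption:
Code:
# The client whose numbers climb in step with Xorg's CPU usage
# is usually the culprit.
xrestop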
View 2 Replies
View Related
Apr 23, 2011
I have a huge log file of around 3.5 GB and would like to sample random sections in the middle of say 10 MB for the purpose of debugging what my application is doing.
I could use the head or tail commands to get the beginning or end of the file, but how can I grab an arbitrary portion from the middle? I guess I could do something like head -n 1.75GB | tail -n 10MB, but that seems clumsy, and I'd need to determine line numbers for the midpoint of the file to convert 1.75 GB and 10 MB into line counts.
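dd sidesteps the line-counting problem entirely by seeking to a byte offset; the filenames below are placeholders, and 1.75 GB works out to 1792 MB:
Code:
# Copy 10 MB starting 1792 MB into the file -- no line numbers needed.
dd if=application.log of=sample.log bs=1M skip=1792 count=10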
View 2 Replies
View Related
May 19, 2010
Is there any software in Linux to view huge .txt files, say over 10 MB? I'm now using the default gedit, version 2.28.0, which does not seem to be able to open huge .txt files. It's the same with the default Windows .txt viewer, although Word seems to work fine on Windows. Is there software under Linux to browse huge .txt files?
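The classic answer is less, which pages through the file on demand instead of loading it all into memory, so even multi-gigabyte files open instantly:
Code:
# -N adds line numbers; / searches forward without reading
# the whole file first.
less -N huge.txt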
View 5 Replies
View Related
May 30, 2011
I have a file with a size of 6 GB. I would like to compress this file and split it into smaller files. I was thinking of using bzip2 to compress it, because it offers a good compression ratio. How can I split this file into smaller ones while compressing it?
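A hedged one-pass approach: bzip2 writes the compressed stream to stdout, and split chops it into fixed-size pieces, avoiding a 6 GB intermediate file (the 650 MB piece size and the filenames are arbitrary examples):
Code:
bzip2 -c bigfile | split -b 650m - bigfile.bz2.part_
# Reassemble and decompress later:
cat bigfile.bz2.part_* | bunzip2 -c > bigfile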
View 5 Replies
View Related
Nov 30, 2010
This doesn't happen all the time, but right now I can't kill Skype! I noticed that it was consuming 100% of the CPU, so I closed it on the desktop, but it's still there consuming 100% of the CPU. So I sent the kill signal with "killall skype", and nothing happened. Then I got the process ID with "pgrep skype" and ran "kill process_id", but Skype is still consuming 100% of the CPU. What the hell?
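killall and plain kill both send SIGTERM, which a wedged process can ignore. SIGKILL cannot be caught or ignored; the one case even this won't end is a process stuck in uninterruptible disk wait (state D in ps):
Code:
# SIGKILL (-9) is handled by the kernel, not by the process itself.
pkill -9 skype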
View 7 Replies
View Related
Mar 19, 2010
I'm trying to check how much RAM and CPU a particular process is consuming. I have checked free -m and top, but the output is not easy to understand.
What I need is to check the consumption of one particular process. With free -m I can only see the overall available memory, and I want to know how much a particular process is using; it's the same with top.
The output from top is hard to understand: too many columns and too many processes, so it's not easy to pick out the process I want to monitor.
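ps can restrict the output to a single command and to just the columns of interest ("firefox" here is only an example process name):
Code:
# CPU%, memory%, and resident set size (in KB) for one command...
ps -o pid,%cpu,%mem,rss,comm -C firefox
# ...or watch a single PID continuously (replace 1234 with the real PID):
top -p 1234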
View 5 Replies
View Related
Jan 9, 2011
Why is my scanning always creating huge 50 MB to 100 MB PDF files? Each A4 PNM file is about 6.5 MB at a resolution of 150 DPI. If I decrease the resolution below 100, my text starts to become unreadable.
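PNM pages are uncompressed, so their full size goes straight into the PDF. A hedged fix is to compress each page while assembling the PDF, for example with ImageMagick (the quality value is a starting point, not a recommendation):
Code:
# Convert the raw PNM pages into one JPEG-compressed PDF.
convert page-*.pnm -quality 75 scan.pdf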
View 11 Replies
View Related
Aug 13, 2010
I am using openSUSE 11.2 x64 with KDE 4.3.5. I am experiencing sudden, sporadic drops in performance: the kwin and Xorg processes suddenly consume large amounts of CPU for a few minutes (maybe 30-40% each) before dropping back to normal (1-3% each). I am seeing this using top.
View 8 Replies
View Related
May 4, 2010
I've just upgraded to 10.04. I love it. I've been running dark themes for a long time, and this upgrade merged seamlessly with my setup. There is a nasty problem that I've noticed only on my desktop box (the laptop, also upgraded to 10.04, is fine): dbus-daemon is hogging approximately 50% of my CPU cycles. Is there a way to limit this? I'm watching this via the % CPU column in System Monitor. My laptop's dbus-daemon registers 0% CPU.
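dbus-daemon itself is rarely at fault; it burns CPU relaying a flood of messages from some client. A hedged way to identify the sender is dbus-monitor:
Code:
# Print every message on the session bus; the application whose
# messages scroll past continuously is the one hammering dbus-daemon.
dbus-monitor --session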
View 3 Replies
View Related
Nov 13, 2010
bind9 is taking up a lot of RAM.
process info:
ID Owner Size Command
17559 root 290396 kB /usr/sbin/named -c /etc/bind/named.conf
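If the growth is mostly resolver cache, named can be bounded explicitly. A hedged named.conf fragment (the 100M figure is an arbitrary example; around 290 MB resident is not unusual for a busy resolver):
Code:
options {
    // Cap the resolver cache so named stops growing past this point.
    max-cache-size 100M;
};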
View 9 Replies
View Related
Dec 23, 2010
After I reboot, my Java process consumes 100% of the CPU, then settles down to about 40% CPU and 12% memory, status sleeping (4-core AMD). I've removed OpenJDK and installed the Sun JRE, but there is no difference. By comparison, Firefox with a lot of tabs is at 3% CPU and 5% memory. Would it be better if I went to 10.04 LTS, 32-bit?
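Before changing the OS, it may be worth identifying which threads inside the JVM are hot. This sketch assumes a single java process, and jstack ships with the JDK rather than the bare JRE:
Code:
# Show per-thread CPU for the java process, then dump its stacks; match
# top's busy thread IDs (converted to hex) against the nid= values.
top -H -p "$(pgrep -f java | head -n 1)"
jstack "$(pgrep -f java | head -n 1)" > /tmp/java-threads.txt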
View 3 Replies
View Related
Jul 11, 2011
With everything visible that I can see closed, my laptop is consuming half of its available RAM (and that does not include caches). At an eyeball, I can only account for a quarter of it: [URL]. Where is the rest going?
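A hedged first step is to rank processes by resident memory and compare the sum against what free reports as used minus buffers/cache:
Code:
# Largest resident sets first; the RSS column is in KB.
ps aux --sort=-rss | head -n 15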
View 2 Replies
View Related
Jul 19, 2011
How do I identify which processes (or PIDs) are consuming swap? On my RHEL box, swap is nearly 100% utilized.
Code:
$ free -m
total used free shared buffers cached
Mem: 144967 143212 1754 0 166 135259
-/+ buffers/cache: 7787 137180
Swap: 22367 21733 634
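A hedged per-process breakdown, assuming a kernel that exposes VmSwap in /proc/<pid>/status (mainline 2.6.34 and later; also backported to RHEL 6's kernel):
Code:
# List swap usage per process, biggest consumers first.
for pid in /proc/[0-9]*; do
    swap=$(awk '/^VmSwap/ {print $2}' "$pid/status" 2>/dev/null)
    if [ -n "$swap" ] && [ "$swap" -gt 0 ]; then
        echo "$swap kB  $(cat "$pid/comm")  (pid ${pid#/proc/})"
    fi
done | sort -rn | head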
View 11 Replies
View Related
Feb 18, 2011
Has anyone used this script recently (or at all)? [URL]... The full instructions are a bit intimidating and seem very time-consuming: [URL]...
View 4 Replies
View Related
Jul 9, 2011
I have 3.5 GB of usable RAM as reported by System Monitor. After a couple of hours, System Monitor shows that around 52% of my RAM is being consumed. Switching to Processes (with All Processes selected), I see firefox-bin consuming around 200-250 MB of RAM while Xorg consumes around 100-140 MB. The rest of the list adds up to another 200 MB.
So it doesn't add up. Sure, I could grab another RAM stick and be done with it (I have another slot, giving a projected total of 7.0 GB of usable RAM). I just want to know where the rest of the consumed RAM is going.
View 8 Replies
View Related
Feb 7, 2011
There is Squid 3.1.8 on a Fedora 12 server with 2 GB of RAM. It is used to share the Internet connection among approximately 80 PCs. The problem is that it is a real memory hog when delay pools are enabled. I am using the following configuration for the delay pools:
Code:
delay_pools 1
delay_class 1 2
delay_access 1 allow drumuri
[code]....
There are moments when the squid process uses approximately all the RAM and goes into swapping. After I restart it, it runs well for a while and then again eats up all the memory. The Internet says that Squid uses a lot of memory, but should it use 2 GB, even if all 80 people are online at the same time?
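Squid's total footprint is cache_mem plus per-object and in-transit overhead, so a hedged mitigation is to pin the memory cache well below physical RAM (the values below are starting points, not recommendations):
Code:
# cache_mem caps only the in-memory object cache, not total process
# size, so leave generous headroom under the 2 GB of physical RAM.
cache_mem 256 MB
maximum_object_size_in_memory 512 KB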
View 2 Replies
View Related
Jan 24, 2011
I use a program which makes a large image which I have to scroll to view. The program has no way to save the image, and I have no access to the source to modify it. The only way I have to get the image from the program is by screenshot. My goal is to save the full size image without having to piece together individual screenshots. I'm using this script to try taking a screenshot:
#!/bin/bash
window=$(wmctrl -l | grep "Program$" | awk '{print $1}')
wmctrl -v -i -r $window -e '0,0,0,6030,5828'
wmctrl -i -a $window
import -window $window ~/Desktop/screenshot.png
This uses wmctrl to get the window id ($window) for a window named "Program". It then tries to resize the window to the desired dimensions. It uses imagemagick (import) to save a screenshot.png on the user's Desktop. All of this works except the resize step. I can resize the window using wmctrl -r -e, but sizes greater than the screen size don't work. I'm using Ubuntu 10.04 and the Gnome Desktop. I run two monitors, but I've tried this with one of them disabled. Is there a way to resize the window larger than my screen to get a huge screenshot?
Part II: I tried using xrandr to set up screen panning, so as to have a bigger desktop than my monitor:
Code:
xrandr --output LVDS --panning 2600x2500
This command makes the laptop screen pan over a 2600x2500 desktop, even though it can only show 1440x900 at one time. To turn off the panning, I can use a similar command that sets the total size with zeroes for the panning section, which gives me back my original laptop display behavior:
Code:
xrandr --fb 1440x900 --output LVDS --panning 0x0
This is all done with xrandr and does not require any xorg.conf changes (my Ubuntu system doesn't even have an xorg.conf).
My video card seems to only allow about 6.5 million pixels, even though the maximum dimensions are 8192x8192. That maximum seems to be the maximum for either dimension, but there is a limit to how many pixels can be drawn, which is the width multiplied by the height. Once I did the screen resize, I tried my script again and got a screenshot. The screenshot however is totally scrambled. I'm not sure if it's unable to take a screenshot of an off-screen window or if it is unable to handle the large dimensions of the window. With the panning display, the window should think it is visible, and the window manager should think it is on-screen. So there is a pixel buffer somewhere with those pixels in it, so there should be a way to get a screenshot.
View 1 Replies
View Related
Dec 15, 2009
I use both RHEL 5 and Fedora. I cannot configure my Samsung SyncMaster 632NW monitor to display full screen at 1360x768; there is a huge black space on both the left and right sides of the monitor. I have tried many times to solve it but have been unable to. The maximum screen resolution offered is 1024x768 and the minimum is 640x480.
Here is the xorg.conf file:
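Whatever the xorg.conf contains, a hedged runtime workaround is to inject the missing mode with xrandr. The modeline below is what cvt 1360 768 generates, and the VGA-0 output name is an assumption (check xrandr -q):
Code:
cvt 1360 768
xrandr --newmode "1360x768_60.00" 84.75 1360 1432 1568 1776 768 771 781 798 -hsync +vsync
xrandr --addmode VGA-0 "1360x768_60.00"
xrandr --output VGA-0 --mode "1360x768_60.00"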
View 2 Replies
View Related
Dec 5, 2009
I just noticed this in my "df" output:
[Code]...
There's an entry in /etc/fstab which mounts this automatically. If I interpret that correctly, it's a 224 MB ramdisk that consumes 224 MB of system memory whether or not I use it. Is that correct? If so, will anything break if I unmount it and delete it from /etc/fstab? Do programs typically depend on it? I'd like to reclaim the system memory (it's a low-DRAM machine) for other uses.
View 5 Replies
View Related
May 20, 2010
After successfully configuring the DWA-552 to work in master mode in Ubuntu 10.04 (ath9k driver), I ran some file transfer tests. The download speed is very good (~50 Mbps), but the upload speed spikes at about 10-20 Mbps for the first few KB and then is nonexistent (0-1 Kbps). This only affects file transfers and other bandwidth-consuming processes; normal web browsing and ssh are not affected. After running a speed test of my Internet connection, which is routed through the AP, I could upload to the Internet at 1 Mbps, which is my connection's maximum, so apparently that is not affected. I tried the same file transfers with netcat to eliminate any other factors and had the same problem. dmesg and the hostapd debug output did not report anything unusual.
View 2 Replies
View Related
Jun 14, 2011
I'm running Fedora 15 with current updates and kernel. I do not have anything special or non-standard about my configuration or setup. I use grsync to sync my home folder files to a remote rsync server on my network. I've checked my hard drives, my memory, and everything else I can think of. Here is the problem:
grsync will run for some time, and once it nears completion it will crash. This, however, is no standard crash: it literally shuts my computer OFF. I have also shared the remote rsync folder through CIFS, and I can copy those exact same files through Nautilus with drag and drop without issue. On a few occasions the rsync process has completed without issue, but that is rare. Since it powers my computer completely off, I do not enjoy the luxury of having any log files or messages to attempt a diagnosis from.
View 9 Replies
View Related