Red Hat / Fedora :: Get The Value Of CPU Usage - Memory Usage ?
Jan 13, 2009
I am sure all of us know the output of the top command in Linux. I want to get the values that top reports for CPU usage and memory usage. How do I do that from a program?
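One common answer: top itself just reads the /proc filesystem, so a program can do the same. A minimal shell sketch (Linux-specific; the field layout of /proc/stat and /proc/meminfo is assumed, and the 1-second sampling window is arbitrary):

```shell
#!/bin/sh
# Memory usage straight from /proc/meminfo (values are in kB).
mem_total=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
mem_free=$(awk '/^MemFree:/ {print $2}' /proc/meminfo)
echo "MemTotal: ${mem_total} kB, MemFree: ${mem_free} kB"

# CPU usage: the aggregate "cpu" line in /proc/stat holds cumulative tick
# counters, so take two snapshots and compare idle time to total time.
read -r _ user nice system idle _ < /proc/stat
t1=$((user + nice + system + idle)); i1=$idle
sleep 1
read -r _ user nice system idle _ < /proc/stat
t2=$((user + nice + system + idle)); i2=$idle
echo "CPU busy over 1s: $(( 100 * ((t2 - t1) - (i2 - i1)) / (t2 - t1) ))%"
```

The same two files can be parsed from C or Java just as easily, which is all that libraries like SIGAR do under the hood on Linux.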
I was trying to get the status of memory usage and disk usage using SIGAR on Windows and Ubuntu. I got it working on Windows by just copying the SIGAR library into the JDK library directory, but I was unable to do the same on Ubuntu. I copied the library into the java-6-sun library directory but still can't run the program.
I've come across a really strange issue with one of my RHEL servers. The "free" command shows that 7019 MB of memory are actually in use by my system, but when I sum up the actual usage (or even the virtual usage, as in the example below) it doesn't add up - the sum is far less than what "free" reports:
I'm running into a problem where my system is running out of disk space on the root partition, but I can't figure out where the runaway usage is. The system had been stable for a couple of years and then just ran out of space. I cleaned up some files to get it workable again, but I can't find where the bulk of the space is going, and I'm getting conflicting results. For example, when I run df it says I'm using 44 GB out of 58 GB:
Code:
[root@Zion ~]# df -h
Filesystem            Size  Used Avail Use% Mounted on
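When df and du disagree like this, the two usual suspects are space hidden in a subdirectory you haven't summed, and deleted files still held open by a running process (df counts them, du can't see them). A quick sketch to narrow it down (flags assume GNU coreutils, as shipped on RHEL; lsof may need installing):

```shell
# Biggest directories on the root filesystem; -x stays on this filesystem
# so /proc, /sys and other mounts don't distort the totals.
du -xh --max-depth=2 / 2>/dev/null | sort -rh | head -20

# Deleted-but-open files that still hold disk space:
lsof +L1 2>/dev/null | head -20
```

If lsof shows large deleted files, restarting the process that holds them (often a logger whose file was rotated away) releases the space.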
I'm trying to understand the performance of my machine, and memory usage just isn't adding up. When I run top it will typically show 301M of 308M used, but the total of everything in the RES column is nowhere near 300M, and the %MEM column doesn't total more than 20-30%. So how do I figure out what is using all the memory? And is there some way to control it to optimize performance?
Memory on my Linux database server is all used up. I first noticed this morning and rebooted the box; five hours later it was used up again. I want to find out which process is responsible for using most of the memory. Which Red Hat Linux utility can list processes sorted by their memory usage, like the Windows Task Manager? "free" and "vmstat" give a summary, but nothing per process. "top" appears to be informative, but the sum of the non-zero %MEM values never adds up to 100.
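On Red Hat's procps, ps can sort by memory directly, which gives exactly the "Task Manager sorted by memory" view (the column selection here is just one sensible choice):

```shell
# Top 10 memory consumers by resident set size (RSS, in kB):
ps -eo pid,user,rss,%mem,comm --sort=-rss | head -11
```

Note the %MEM figures still won't sum to 100: shared libraries and shared memory are counted once per process, and the page cache belongs to no process at all.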
My manager has asked me to look at a memory drop on the system while I was running tests on a Linux machine. There is a big dip in the memory graph produced by another tool, and I do not know which processes were responsible for it. Is there any way to find the memory utilization of each process during the last week?
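Historical data only exists if something was already sampling it. On Red Hat systems the sysstat package's cron job records system-wide memory statistics that sar can replay; per-process history needs your own sampler (e.g. pidstat in cron). A sketch, assuming sysstat is installed and has been collecting, with the RHEL default data-file path:

```shell
#!/bin/sh
# Replay yesterday's system-wide memory samples, if the data file exists.
if command -v sar >/dev/null 2>&1; then
    sar -r -f "/var/log/sa/sa$(date -d yesterday +%d)" 2>/dev/null \
        || echo "no sysstat data file for yesterday"
else
    echo "sysstat (sar) is not installed"
fi
```

If sysstat wasn't running when the dip happened, the per-process information is simply gone; the best you can do is start collecting now.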
I upgraded from Fedora 13 to 14 over the network. Everything seems to have worked. The one problem after my install is that I have noticed that setroubleshootd consumes a lot of memory.
[Code]
It doesn't take long for setroubleshootd's memory usage to jump. I can kill the process, but it starts up again. I have tried disabling the service, but it doesn't show up in /etc/init.d:
Code:
# service setroubledshootd stop
setroubledshootd: unrecognized service
So I am not sure what I can do to resolve the issue with setroubleshootd besides killing it off every 15 minutes.
On our database server, when checking memory with the top command, we always see 32 GB of RAM utilized. We have set sga_max_size to 8 GB and the PGA to 3 GB. We tried shutting down the Oracle DB and memory went down to 24 GB according to top. After a cold reboot of the DB server, it went down to 1.5 GB.
But once the users started working again at the end of the day, memory went back up to 32 GB.
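This pattern is usually the Linux page cache rather than a leak: the kernel deliberately fills otherwise-idle RAM with cached file data (datafile reads, in an Oracle workload) and gives it back on demand, so top's "used" climbs back to 32 GB as soon as users generate I/O again. free and /proc/meminfo separate the two; on RHEL-era procps the "-/+ buffers/cache" line of free shows usage after discounting cache (newer versions show an "available" column instead):

```shell
# "free" summarizes the kernel counters below; Buffers and Cached are
# reclaimable, so "used" minus cache is what programs really hold.
free -m
grep -E '^(MemTotal|MemFree|Buffers|Cached):' /proc/meminfo
```

If the cache-discounted figure stays near the SGA + PGA sizes, nothing is wrong; the kernel is just not wasting idle RAM.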
My server keeps hanging, so I have rebooted several times in the last couple of weeks. The system eats more and more memory, usage keeps increasing, and at a certain point it becomes saturated and the server hangs. I could not find which process is eating the memory. I used the commands below to check whether any process is using a lot of memory, but no luck - no process shows unusually high usage.
I have a computer with 16 GB of RAM. At the moment, top shows all the RAM is taken (NOT by cache), but the RAM used by the various processes adds up to far less than 16 GB. I have seen this problem several times, but I don't understand what is happening. My only remedy so far has been to reboot the machine.
I am looking for a free database with low memory usage that offers engines like InnoDB and MEMORY, has a C API, supports triggers, and offers client/server operation, for use on embedded Linux systems.
I was browsing a folder with lots of images. After I finished, I closed Nautilus and noticed that my computer had become slow, so I checked with System Monitor and found that Nautilus was using almost 100 MB of RAM (with 4 tabs open). I'm not sure whether this is normal, because when I reopen the same folder with PCManFM it consumes less than 20 MB of RAM (also with 4 tabs open). Here's the screenshot from System Monitor.
Is there any way to monitor one process's CPU usage and RAM usage over time on Linux? I am trying to move to a cheaper VPS and need to work out what level of CPU and RAM I need!
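In the absence of a monitoring agent, a small loop over ps gives a usable time series for a single process. A sketch - the PID, interval, and sample count below are placeholders to adjust:

```shell
#!/bin/sh
# Log %CPU and resident memory (kB) for one PID at a fixed interval.
PID=$$        # placeholder: put the PID you want to watch here
INTERVAL=1    # seconds between samples
SAMPLES=3     # how many samples to take
i=0
while [ "$i" -lt "$SAMPLES" ]; do
    printf '%s ' "$(date +%H:%M:%S)"
    ps -o %cpu=,rss= -p "$PID"
    i=$((i + 1))
    sleep "$INTERVAL"
done
```

Redirect the output to a file and graph it later; pidstat (from sysstat) and collectd give the same data with less plumbing, where available.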
Is this normal? I checked System Monitor and the top entry is Xorg, followed by Compiz. When I first started with 10.04 a few days ago it was around 300-500 MB.
I am having a few problems with a Red Hat box involving memory usage. I have 64 GB of memory and 'top' tells me I'm using 60 GB of it, but if I add up all the '%MEM' figures I get no more than 20%. Where is the other 80%?
We have an Oracle instance that uses shared memory, but that is capped at 45 GB, which means there are about 15 GB unaccounted for. What utilities can I use on Red Hat to ascertain memory usage other than 'top'? Are there better, more detailed ones that look at shared memory, and at swap?
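For the shared-memory side specifically, ipcs and /proc/meminfo go further than top. A sketch (which /proc/meminfo lines exist depends on kernel version; an SGA configured with hugepages shows up under HugePages, not Shmem):

```shell
# System V shared memory segments -- an Oracle SGA is typically one of these:
ipcs -m
# Kernel-wide accounting for shared memory and hugepages:
grep -E '^(Shmem|HugePages_Total|HugePages_Free|Hugepagesize)' /proc/meminfo
# Per-process breakdown for one PID (procps pmap, smaps-based):
# pmap -x <pid>
```

The smem tool, where available, also splits each process's RSS into unique and shared portions, so the per-process numbers actually sum correctly instead of double-counting the SGA.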
I am a bit worried about my Linux vserver box. No memory is left. To investigate, I was looking at "top", but it deeply confuses me: it seems that no memory is left, although the process list in top never adds up to 100%.
My problem seems very simple: high memory usage. I occasionally use Movie Player to watch a few shows, and I use Firefox as well. My memory usage starts out small, about 500 MB, but after using Firefox lightly along with Movie Player it jumps to almost 2 GB, and that's after they've been closed. What gives? I've attached an image so you can see what I'm talking about.
I've been having some problems with Lucid; all my applications seem to be hogging memory like there's no tomorrow. Within about 15 minutes of booting the system, processes like Google Chrome, Nautilus, Python and Pidgin all start to take what seem like excessively large amounts of memory.
Chrome is the worst one, easily shooting past 200-300 MB of my 2 GB of RAM. I would have reported this as a bug in Chrome itself, but my other applications seem to share the problem to some extent. Also, my colleague has identical hardware and identical versions of Ubuntu and Chrome, yet he has no memory problems whatsoever. Currently I am running Chrome, Geany, Pidgin, Thunderbird and FileZilla. For just these, Ubuntu now consumes 1.8 GB of RAM (including 500 MB of cache).
I installed Debian sid about one month ago (first Xfce, then GNOME) but noticed that it's really slow. Upgrades take ages, launching (and using) Firefox takes a long time... In comparison to my Ubuntu and Arch Linux installs (on the same computer), or my previous installation of Debian, there is clearly a problem somewhere. Today I ran "top" sorted by memory usage: 3.5% xulrunner-stub, 2.1% dropbox, 1.4% aptitude (running an upgrade), 1.4% clementine... nothing terrible, but I still have 2.7 GB of RAM used (more than 50%):
$ free -m
             total       used       free     shared    buffers     cached
Mem:          3967       2685       1282          0         79       1938
I use a Debian Squeeze system running off a flash drive, i.e. based on a custom live image running in persistent mode. It runs great and I am grateful for the existence of Debian. However, I have a question. A lot of the machines I use this pen drive on are quite old, often with 512 MB RAM and old processors. I specifically built my system with XFCE and lightweight apps from an initial live image using the standard-x11 package list (basically just Xorg with drivers and the base system). At first things ran very well, blazing fast even on the oldest systems, and I could comfortably run Firefox alongside LibreOffice (I need LO because all of my colleagues use Word docs, often with track changes, which AbiWord can't handle properly). However, over time I've found that memory usage has risen, to the point where Firefox is now automatically killed on the older systems every time I start LibreOffice. How does one figure out why memory usage is going up? I've checked for inessential services and turned them off with "insserv -r". I've used only lightweight apps, as mentioned. Are there other general tips on reducing memory usage?
My memory usage keeps building up as time goes by, and when it hits 100% it logs me out. I have a tad over 9 GiB of swap; that is how the auto install set up the partitions. I have 4 GiB of memory. When I first log in it shows about 800+ MB being used, and it keeps adding more from there. I can run applications for about 6-8 hours before it hits the 100% mark. At first I blamed the video player, but when I put on a playlist, each song seems to get stuck in memory too, adding about 0.02% per song toward maximum memory.
This seems to happen with every application I use. The attached pic shows my computer settings and usage through System Monitor and a Conky script. On another note, I never see my swap partition being used at all. I am really green with Linux and not sure whether I was supposed to turn something on. The partitioner shows it mounted as swap.
In Linux, how can I display the memory usage of each process when I do a 'ps -ef'? I would like to see the virtual memory, resident memory, and shared memory of each process. I can get that via 'top', but I want the same info from 'ps' so that I can pipe the output to 'grep {my process name}'.