Fedora :: Slow Down High Nautilus Memory Usage After Loading Folder With Lots Of Images?
Nov 22, 2010
I was browsing a folder with lots of images. After I finished, I closed Nautilus and noticed my computer had become slow, so I checked it with System Monitor and found that Nautilus was using almost 100 MB of RAM (with 4 tabs open). I'm not sure whether this is normal, because when I reopened the same folder with PCManFM it consumed less than 20 MB of RAM (also with 4 tabs open).
Here's the screenshot from System Monitor.
I upgraded from Fedora 13 to 14 over the network. Everything seems to have worked. The one problem after my install is that setroubleshootd consumes a lot of memory.
[Code]
It doesn't take long for setroubleshootd to jump in memory usage. I can kill the process, but it starts up again. I have tried disabling the service, but it doesn't show up in /etc/init.d:
# service setroubledshootd stop
setroubledshootd: unrecognized service
So I am not sure what I can do to resolve the issue with setroubleshootd besides killing it off every 15 minutes.
On our database server, when checking memory with the top command, we always see 32 GB of RAM utilized. We have set sga_max_size to 8 GB and the PGA to 3 GB. We tried shutting down the Oracle DB, and memory went down to 24 GB according to top. After a cold reboot of the DB server, it went down to 1.5 GB.
But once users started working again at the end of the day, memory went back up to 32 GB.
My problem seems very simple: it's high memory usage. I occasionally use Movie Player to watch a few shows, and I use Firefox as well. My memory usage starts out quite small, about 500 MB, but after using Firefox lightly and Movie Player it jumps to almost 2 GB, and this is after they've been closed. What gives? I've attached an image so you can see what I'm talking about.
I installed Debian sid about one month ago (first Xfce, then GNOME) but noticed that it's really slow. Upgrades take ages, launching (and using) Firefox takes a long time, and so on. Compared to my Ubuntu and Arch Linux installs (on the same computer), or a previous installation of Debian, there is clearly a problem somewhere. Today I ran top sorted by memory usage: 3.5% xulrunner-stub, 2.1% dropbox, 1.4% aptitude (doing the upgrade), 1.4% clementine... nothing terrible, but I still have 2.7 GB of RAM used (more than 50%):
$ free -m
             total       used       free     shared    buffers     cached
Mem:          3967       2685       1282          0         79       1938
I am having a problem with my server that I would call a bit "important". For the last 3 weeks, the used space on my hard disk (RAID 1) has been growing. I have 2 x 1 TB HDDs in RAID 1 and I did not install anything during those weeks; used space just grew from 90 GB to 580 GB. The situation is stable now, but I don't think it's normal.
Bandwidth usage is low (about 120 GB in 2 months) and I am running 6 Counter-Strike game servers, a forum, a very small website and some local stuff. A friend of mine told me that my server could have been hacked, and I am afraid it has been. Some useful information: when I reboot the server, the used space drops back to ~100 GB and then starts growing again. I can't find where all those files are located:
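Not part of the original post, but a common way to chase this down with stock tools, sketched under the assumption of a standard Linux layout: du shows which top-level directory is growing, and lsof can reveal deleted-but-still-open files, a classic reason why space comes back only after a reboot.

```shell
# Summarize disk usage per top-level directory, largest first.
# -x stays on one filesystem so mounts and /proc don't distort the picture.
du -xh --max-depth=1 / 2>/dev/null | sort -rh | head -n 15

# Files deleted while a process still holds them open keep consuming
# space until that process exits (or the box reboots):
lsof +L1 2>/dev/null | head -n 20
```

If the growth turns out to be in a log or game-server directory, repeating du one level deeper narrows it down.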
I've been using Ubuntu consistently for about three days now. I really, really love the interface and how everything works and all that, but I've been having a couple of weird problems with speed.
Graphics things seem to work really well. When I go into the overview of all my workspaces, it's instant and looks great. The problem is when I open and use some applications.
For example, when I open up the software center, it takes longer than it did the first time to start up. Also, when I drag windows off from being maximized, it takes literally about five seconds for it to show up as being dragged around by my mouse.
When I look at the system monitor, about 20% of my CPU cores are constantly being used. That's 20% each. I have a 3-core CPU, could that be the problem?
Another example: when I went to ..... just now, it would take a second for any volume changes in the video to register.
I also have smooth scrolling enabled in Firefox, but it's very unresponsive now. It's slow as all hell. Even notifications are showing up more slowly.
So, what's the deal? What could I have done wrong?
CPU: AMD Phenom II X3 2.8 GHz; GPU: ATI Radeon HD 4860 1 GB; RAM: 4 GB
One more thing: I have really bad screen tearing when I try to move windows around, as if there isn't any vsync on. Where can I turn it on or fix this?
I'm still pretty new to Ubuntu and Linux, but I'm an advanced computer user. Since it feels like Linux is killing my CPU, I'm going to boot back into Windows 7 for now. But if I can fix this problem, I'd like to keep Ubuntu as my main OS, with Windows just for games, unless I can get Wine to work right with them.
Basically, I have a machine with 16 GB of RAM and have just discovered that a single process using all of it can crash the whole system. How could I run a process such that if it tries to use more than 90% of system memory, the process immediately crashes?
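One stock-tool sketch (not from the post itself): ulimit -v caps a process's virtual address space, so allocations beyond the cap fail and the process dies instead of dragging the whole system into swap. The 90%-of-16 GB figure below is just this poster's numbers; the `sleep 1` stands in for the real workload.

```shell
# Cap a single process at ~90% of 16 GB (ulimit -v takes KiB).
LIMIT_KB=$((16 * 1024 * 1024 * 90 / 100))

# Run in a subshell so only this process (and its children) are capped;
# replace "sleep 1" with the actual command.
( ulimit -v "$LIMIT_KB"; exec sleep 1 )
```

On systems with cgroups, a memory controller limit achieves the same thing more precisely (resident rather than virtual memory), but ulimit works everywhere.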
I recently upgraded to Ubuntu 10.10 and I am experiencing ultra-high memory usage by gnome-settings-daemon, 2 GB after suspend! Killing and restarting the daemon solves the issue. Anybody else seeing this behavior?
My problem is extremely slow writes to the hard disk and 100% CPU usage, and it happens when I write to the internal hard drive, not to any external drive.
I tried a fresh Ubuntu install. No change. I am not even sure whether it is a software or hardware problem.
I am sure all of us know the output of the top command on Linux. I want to obtain, programmatically, the values that top reports for CPU usage and memory usage. How do I do that?
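top itself reads these numbers from /proc, so a program can do the same. A minimal shell sketch, assuming a standard Linux /proc:

```shell
# Memory: MemTotal and MemFree are reported in kB in /proc/meminfo.
awk '/^MemTotal:/ {t = $2} /^MemFree:/ {f = $2}
     END {printf "used: %d kB of %d kB\n", t - f, t}' /proc/meminfo

# CPU: the first line of /proc/stat holds cumulative jiffies
# (user nice system idle iowait ...); sample it twice and diff the
# values to get a usage percentage over that interval.
head -n 1 /proc/stat
```

From C or any other language, the same files can simply be opened and parsed line by line.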
I have been using Ubuntu for a while and I like it a lot. I'm a web developer and I have Windows XP installed in VirtualBox; I moved completely to Linux and just use Windows to test in IE. It had been a while since I last used Windows, but I had to use it in the last few days and noticed how much faster it is. The thing that bothered me most is opening folders on the desktop or the recycle bin: in Windows it's instant, while in Ubuntu opening a folder takes a long time for Nautilus to appear. Is this normal, or is my installation bad? Any comments are appreciated. I don't want to abandon Ubuntu, I really like it, but it bothers me that Nautilus is so slow to open.
top says there's only 12 MB free (out of 1 GB), but I can't figure out what's using all the RAM. rtorrent is using 13 MB, and the rest are in the byte range. (I ran top as root.)
So my server is doing fine, but there is one odd thing I would like to fix. When I start or stop services, the CPU maxes out for about five seconds each time. The services start at the same speed as before, but it still does this. Small things like lm_sensors don't do it, just things like httpd and sendmail. This server was upgraded to Fedora 11 with a netinstall CD a few months ago.
I upgraded to 10.10 a few months ago. All was well until a week or so ago, when my PC became incredibly slow to respond to anything - mouse clicks, etc. After investigating, I found Evolution using 90% of my RAM - I have 4 GB. I restarted the PC, started Evolution, and everything was better for a very short while. Later that day I found Evolution using 2 GB+ again. I have noticed that immediately upon startup, watching top, Evolution uses 900 MB+ VIRT and 450 MB+ RES. If I open a single email, it jumps to 1300+ VIRT, 500+ RES. The more emails I read, the higher it climbs.
I have updated everything possible via Synaptic. I only have a handful of accounts - probably 3 POP3 that download to local folders, and about 5 IMAP (GMail) based ones. Nothing excessive IMHO. Now, Evolution is basically useless. All I can do is open it, check email, and close it as soon as possible. I'd rather not switch to another email client - I did that back in 10.04 when I switched off Thunderbird in favor of Evolution because of all the Evolution integration in Ubuntu/Gnome.
I was trying to get memory usage and disk usage using SIGAR on Windows and Ubuntu. I did this on Windows by copying the SIGAR library into the JDK library directory, but I was unable to do the same on Ubuntu. I've copied the library into the java-6-sun library directory but still can't run the program.
I use Nautilus and it's awful. The only good things are the tabs and the bookmarks. On the cons side:
* It freezes now and then when moving lots of files.
* It is slow to navigate while large amounts of data are being moved or displayed.
* It is slow at presenting files when there are a lot of them. It should only render whatever is on screen, plus a properly sized scroll bar; nothing else should freeze the client. It can render at its own pace, and if I don't move the scroll bar it shouldn't freeze.
* It doesn't queue transfers: it moves everything at the same time instead of queueing, which makes moving things take longer.
I am basically hoping for a line of shell script I can put into "Open With" for folders, so I can get a terminal at the right path. I hate manually typing paths to places when I am looking right at them in Nautilus. "gnome-terminal" doesn't work - it just opens a terminal at ~.
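One approach, sketched as a small wrapper script: gnome-terminal's --working-directory option does the actual work, and Nautilus passes the selected folder as the first argument. The name ~/bin/term-here is made up for the example.

```shell
# Create the wrapper (the path and name are just examples).
mkdir -p ~/bin
cat > ~/bin/term-here <<'EOF'
#!/bin/sh
# Nautilus hands the selected folder path to "$1"; fall back to $PWD.
exec gnome-terminal --working-directory="${1:-$PWD}"
EOF
chmod +x ~/bin/term-here
```

Point "Open With" at ~/bin/term-here. The nautilus-open-terminal package provides the same thing as a right-click menu entry, if it is packaged for your distribution.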
I'm trying to understand the performance of my machine, and memory usage just isn't adding up. When I run top, it typically shows 301 MB of 308 MB used, but the total of everything in the RES column is nowhere near 300 MB, and the %MEM column sums to no more than 20-30%. So how do I figure out what is using all the memory? And is there some way to control it to optimize performance?
The memory of my Linux database server is all used up. I first noticed this morning and rebooted the box. 5 hours later, it was all used up again. I want to find out which process is responsible for using most of the memory. What Red Hat Linux utility can list processes sorted by their memory usage, like the Windows Task Manager? free and vmstat give a summary, but not per process; top appears informative, but the sum of the non-zero %MEM values never adds up to 100.
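A sketch with stock tools (not from the post): ps can sort by resident memory, which is the closest analogue to the Task Manager view.

```shell
# All processes sorted by resident set size, largest first.
ps aux --sort=-rss | head -n 10

# Or a narrower view: RSS in kB plus the command name.
ps -eo rss,comm --sort=-rss | head -n 10
```

Note that %MEM summing to well under 100 is normal: the remainder is usually page cache and kernel memory, which belong to no process.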
I have a computer with 16 GB of RAM. At the moment, top shows all the RAM as taken (NOT by cache), but the RAM used by the various processes is very far from 16 GB. I have seen this problem several times, but I don't understand what is happening. My only remedy so far has been to reboot the machine.
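A quick way to see the gap this and similar posts describe, sketched with stock tools: sum every process's RSS and compare it with what free reports. The difference is typically page cache, kernel slab, and shared pages (which RSS counts in every process, so the sum can also overshoot).

```shell
free -m

# Total resident memory across all processes, in MB.
ps -eo rss --no-headers | awk '{sum += $1} END {printf "total RSS: %.0f MB\n", sum / 1024}'
```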
I am looking for a free database with low memory usage, InnoDB-like and memory-like engines, a C API, trigger support, and client/server support, for use on embedded Linux systems.
My manager has asked me to look at a memory drop on a system while I was running tests on a Linux machine. There is a big dip in the memory graph produced by another tool, and I do not know which processes were responsible for those dips. Is there any way I can find the memory utilization of processes during the last week?
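If the sysstat package was installed and collecting at the time, sar already has the history; the file name below (sa15, for the 15th of the month) and the /var/log/sa path are assumptions that vary by distribution. If sar wasn't running, per-process history cannot be recovered after the fact, but a cron entry can log it going forward.

```shell
# System-wide memory statistics recorded for a given day (path varies:
# /var/log/sa/saDD on Red Hat, /var/log/sysstat/saDD on Debian).
sar -r -f /var/log/sa/sa15 2>/dev/null || echo "no sysstat history for that day"

# Going forward, snapshot the top consumers every 5 minutes (crontab line):
# */5 * * * * ps -eo rss,comm --sort=-rss | head -n 5 >> /var/log/memtop.log
```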
I've just noticed that unrar is suddenly taking minutes to extract instead of seconds.
I can't remember whether it has recently been updated, but I've uninstalled and reinstalled it, and the version is: unrar.x86_64 0:3.7.8-3.fc10
I've found a few Ubuntu posts about it on Google, but in true Ubuntu fashion nobody has any answers!
What's odd is that when I unrar a file (from Nautilus or with "unrar x *.rar") it takes, say, 4 minutes; then if I do it again it takes 15 seconds, as if it's being cached somewhere.
My server keeps hanging, so I have rebooted several times in the last couple of weeks. The system is eating more and more memory; usage keeps increasing until, at some point, it becomes saturated and the server hangs. I could not find which process is eating the memory. I have used the commands below to check whether any process is eating memory, but no luck; no process is using a lot of memory.
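When no user process accounts for ever-growing usage, the kernel itself may be the consumer. A sketch of where to look, using standard /proc fields:

```shell
# Kernel-side memory that top's per-process view never shows:
grep -E '^(Slab|SReclaimable|SUnreclaim|PageTables|Buffers|Cached)' /proc/meminfo

# slabtop (one-shot with -o) lists the largest kernel slab caches;
# reading /proc/slabinfo may require root.
slabtop -o 2>/dev/null | head -n 15
```

A steadily growing SUnreclaim or a single huge slab cache points at a kernel or driver leak rather than a runaway process.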
I'm having issues playing HD videos (some at 720p; it's much more apparent with 1080p). The video appears to slow down, then becomes choppy, drifting out of sync with the audio, which carries on as normal. This occurs with both VLC and XBMC, though the latter is better. My specs: Fedora 12, 3.2 GHz P4 processor, 1 GB RAM, Radeon 9800 XT graphics card (I couldn't get the 9.3 ATI driver to work).
I've come across a really strange issue with one of my RHEL servers. The "free" command shows that 7019 MB of memory is actually in use, but when I sum up the actual usage (or even the virtual usage, as in the example below) it doesn't add up - the sum is far less than what "free" reports:
I use SFTP in Nautilus to transfer files to my server, but it's very slow. For example, for the same file to the same IP, with Nautilus I upload at 1.8 MB/s and with FileZilla I upload at 8.0 MB/s.