I wrote a program that multiplies two matrices using multiple threads, and another one that uses multiple processes and shared memory. Both are written in C. I need to find the total memory usage of these programs. I know about the top command, but when my matrices are relatively small the programs don't even show up in top because they complete so fast. How can I find the memory usage in these cases? Also, how can I find the total turnaround time of my programs?
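One rough way to capture both numbers for short-lived runs (a sketch on my part, not from the post) is GNU time, which prints peak memory and wall-clock time after the program exits; the program name and arguments below are only placeholders.
Code:
/usr/bin/time -v ./matmul_threads 512
# look for "Maximum resident set size (kbytes)" and "Elapsed (wall clock) time" in the output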
I am able to build a shared library under Solaris with /usr/local/bin/g++ -G -o output.so file1.o file2.o file3.o. How do I build the shared library under Linux using the same files? I have tried the same command, /usr/local/bin/g++ -g -o, but I got some undefined references, even though those references are defined in one of the object files.
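For what it's worth, a hedged sketch of the usual Linux equivalent: the GNU toolchain uses -shared rather than Solaris's -G (lowercase -g only adds debug information), and the object files generally need to be built as position-independent code.
Code:
# assumes file1.o, file2.o, file3.o were compiled with -fPIC
g++ -shared -o output.so file1.o file2.o file3.o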
I think the solution is very simple, but I cannot find it. I'm trying to build a B.so that uses an A.so.
A.so is compiled from C; B.so is compiled from C++.
Inside the "Aso.h" file I had declared:
Code:
#ifdef __cplusplus
extern "C" {
#endif
[code]....
There are no errors when compiling, and the library seems to build correctly, but using the "nm" command the Aso.so functions appear with a "U" (undefined). Trying to build an executable that uses the Bso.so library, I get this error: /lib/../lib/libBso.so: undefined reference to `foo(int, int, int)'. I think that to solve this problem I only need to link Aso.so with the .o files generated during the compilation of my Bso. Using the "ldd" command I can see that Bso.so depends on Aso.so, so what am I missing?
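If it helps, this is a rough sketch of the kind of link line the last sentence describes; the file names and paths are placeholders, and I am assuming the C library is installed as libAso.so so that -lAso can find it.
Code:
g++ -fPIC -c Bso.cpp
g++ -shared -o libBso.so Bso.o -L/path/to/aso -lAso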
I am trying to install the WebSphere MQ Client on a Red Hat 5 server (x86_64). When I try the first step of their process, it fails while trying to find shared libraries: ERROR: Installation will not succeed unless the license agreement can be accepted. The MQ Client is 32-bit, but I am told it should work on a 64-bit server...
I was trying to get memory usage and disk usage with Sigar on Windows and on Ubuntu. I managed it on Windows by just copying the Sigar library into the JDK's library directory, but I was unable to do the same on Ubuntu. I copied the library into the java-6-sun library directory, but I still can't run the program.
I have a Java program that runs on Debian as a background process. Yesterday the Java program stopped running. I looked at the memory usage: the system had only 5MB of memory left, so my guess is that the Java program ran out of memory.
However, after we restarted the Java program, we could see the free memory count start to go up. It kept climbing from 5MB to over 400MB. The increase happened slowly; when I measured it, I could see that with each passing minute a bit more memory was added to the free memory pool, while the Java background process was running.
I wonder why this would happen. It's as if our Java program first brought the machine down because it consumed all the memory, and then, after the restart, it started giving memory back.
I recently installed a 32-bit Slackware 13.1 system and ran into an odd problem. I had a texlive-2010 package from slackbuilds.org, previously compiled on another 32-bit Slackware 13.1 system, and I simply installed that precompiled package on the new system. However, whenever I tried to issue a latex command, kpathsea complained that it could not find the shared library file libkpathsea.so.6. I googled a bit and found that this could be worked around by setting the environment variable LD_LIBRARY_PATH to "/usr/share/texmf/lib", which is where the library in question actually lives.
This solved the problem. The weird thing is that on the other machines I have installed, kpathsea had no issues whatsoever and I did not have to set LD_LIBRARY_PATH. The only difference is that on those systems I had compiled and installed texlive myself rather than installing a precompiled package. Could that be causing the issue?
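For reference, the workaround described above amounts to a single shell line before running latex (the path is the one from the post; the .tex file name is just a placeholder).
Code:
export LD_LIBRARY_PATH=/usr/share/texmf/lib
latex document.tex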
Top only shows the memory usage of individual processes. Apache often runs hundreds of processes, each of which may use only a small amount of memory, but the total memory consumed by all Apache processes can be fairly large. Is there a way to see the total memory usage for all Apache processes?
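A rough sketch of one way to approximate the total (the process name apache2 is an assumption; on some systems it is httpd), bearing in mind that summing RSS counts pages shared between the workers more than once:
Code:
ps -C apache2 -o rss= | awk '{sum += $1} END {print sum " KiB"}'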
I have a computer with 16GB of RAM. At the moment, top shows that all the RAM is taken (NOT by cache), but the RAM used by the various processes adds up to far less than 16GB. I have seen this problem several times, but I don't understand what is happening. My only remedy so far has been to reboot the machine.
I have just started having a problem with Xorg: it is always using at least 30% of my CPU, and the whole system does not run smoothly. If I play a video it judders, and even an icon I drag judders across the screen. I'm running Ubuntu 10.10, 2.6.35-25-generic x86_64. VGA compatible controller: nVidia Corporation G98M [GeForce G105M] (rev a2).
I use a Debian Squeeze system running off a flash drive, i.e. based on a custom Live image running in persistent mode. It runs great and I am grateful for the existence of Debian. However, I have a question. A lot of the machines I use this pen drive on are quite old, often with 512 MB RAM and old processors. I specifically built my system using XFCE and lightweight apps off an initial live image using the standard-x11 package list (basically just Xorg with drivers and the base system). At first things ran very well, blazingly fast even on the oldest systems, and I could comfortably run Firefox and LibreOffice side by side (I need LO as all of my colleagues use Word docs, often with track changes, which AbiWord can't handle properly). However, over time I've found that memory usage has risen, to the point where Firefox is now automatically killed on the older systems every time I start LibreOffice. How does one figure out why memory usage is going up? I've checked for inessential services and turned them off with "insserv -r". I've used only lightweight apps, as mentioned before. Are there other general tips on reducing memory usage?
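One rough way to see where the memory is actually going (a sketch, not a definitive diagnosis): check the overall picture with free, then list the largest resident consumers with ps.
Code:
free -m
# 15 largest processes by resident set size
ps -eo pid,rss,comm --sort=-rss | head -15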
I have a program that creates and uses a shared memory segment. I am trying to find out how to detach and delete this shared memory segment when I hit Ctrl-C, while still letting the process terminate. shmdt() and shmctl() take arguments (shared and shmid) that are local to main.
Code:
// Prototype
void leave(int sig);

// part of code trying to use signal handling
if (signal(SIGINT, leave))
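A minimal sketch of one common approach (my own illustration, not the poster's code): keep shmid and the attached pointer at file scope so the SIGINT handler can reach them, then detach, remove the segment, and exit from the handler.
Code:
#include <signal.h>
#include <stdio.h>
#include <sys/ipc.h>
#include <sys/shm.h>
#include <unistd.h>

static int   shmid  = -1;      /* assumed names, for illustration only */
static void *shared = NULL;

void leave(int sig)
{
    (void)sig;
    if (shared != NULL)
        shmdt(shared);                    /* detach the segment */
    if (shmid != -1)
        shmctl(shmid, IPC_RMID, NULL);    /* mark it for removal */
    _exit(0);                             /* terminate the process */
}

int main(void)
{
    signal(SIGINT, leave);

    shmid = shmget(IPC_PRIVATE, 4096, IPC_CREAT | 0600);
    if (shmid == -1) { perror("shmget"); return 1; }

    shared = shmat(shmid, NULL, 0);
    if (shared == (void *)-1) { perror("shmat"); return 1; }

    pause();                              /* wait here until Ctrl-C arrives */
    return 0;
}
Strictly speaking, only a limited set of functions is guaranteed to be safe inside a signal handler, so another common pattern is to just set a flag in the handler and do the cleanup back in main; the version above is only the shortest illustration.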
In this example, of my 993.4 MiB of memory, 575.9 MiB is reported as used, along with 163.4 MiB of my 2.8 GiB of swap. But in my processes tab, the most memory-hogging program uses 98.3 MiB, then Pidgin at 25.9 MiB, then 18.9, 14.9, 6.2, 6.1, 5.2, 3.4, 3.3, 1.8, 1.8, 1.7 MiB, etc. I'm certain these don't add up to 575.9 MiB, so where is all this extra memory usage coming from?
My server keeps hanging, so I have rebooted it several times in the last couple of weeks. The system is eating more and more memory; the usage keeps increasing until at some point it saturates and my server hangs. I cannot find out which process is eating the memory. I have used the commands below to check whether any process is eating too much memory, but no luck; no process is using an unusually high amount of memory.
Basically I have a machine with 16GB of RAM and have just discovered that one process using all of it can crash the whole system. How could I run a process on the system in such a way that the process immediately crashes if more than 90% of system memory is used?
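A rough sketch of one approach (my assumption, not something from the post): cap the process's own address space with ulimit before launching it, so allocations beyond the cap fail instead of exhausting the machine. This limits the one process rather than watching total system usage; 90% of 16 GiB is roughly 15,099,494 KiB.
Code:
# limit is in KiB; ./big_job is a placeholder name
ulimit -v 15099494
./big_job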
I'm reading about shared, static, and dynamic libraries. What is SDL? Is it static, shared, or dynamic?
I always thought a library was a lot of .h and .cpp files compiled separately into .o files, and then when you compiled your own program you could use the -l parameter to link the library and it was all compiled together. Now I'm not so sure.
I don't even see any SDL .cpp files anywhere on my system. All I have are lots of SDL .h files in /usr/include/SDL, and I don't really understand the code in them.
I'm making a wild guess here: SDL is a shared library. SDL itself is NOT compiled into my program, therefore SDL must be present on any system my program tries to run on. When I compile and link my program, all it needs is the header files to know which SDL functions and objects it can use. And then on every system it uses an already compiled SDL shared library thingy somewhere.
So... where is that part of SDL? All I can find are header files.
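If it helps, a hedged way to see where the compiled part actually lives (./mygame is only a placeholder binary):
Code:
# shared libraries the dynamic loader knows about
ldconfig -p | grep -i sdl
# or the shared objects a finished program actually pulls in
ldd ./mygame | grep -i sdl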
I'm thinking the advantage of shared libraries is that someone could, say, update SDL on their own system and take advantage of the new features without having to download new executables, with the new version of SDL compiled into them, for every program that uses SDL.
So if I'm making an editor and a game engine, and they both use a lot of the same .cpp and .h files that I wrote, and I'm tired of updating one and then the other and need to turn them into a library, then a shared library might be kind of a silly solution. I could just make a static library, right? Because it's not SDL; nobody else is ever going to use this library.
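For what it's worth, a rough sketch of that static-library route; all the file names here are hypothetical.
Code:
g++ -c engine.cpp util.cpp
ar rcs libmystuff.a engine.o util.o     # bundle the objects into a static archive
g++ -c editor.cpp
g++ editor.o -L. -lmystuff -o editor    # the library code is copied into the final binary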
Is there any way to monitor one process's CPU usage and RAM usage over time on Linux? I am trying to move to a cheaper VPS and need to work out what level of CPU and RAM I need.
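One possible approach (an assumption on my part, not something from the post) is pidstat from the sysstat package, which samples a single PID at a fixed interval:
Code:
# CPU (-u) and memory (-r) for one PID, every 5 seconds
pidstat -u -r -p <PID> 5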
After a ton of research I believe I now understand the output of ps, top, and free better than ever, and I also have a reasonably decent grasp of memory management (virtual address space, etc.). With that said, my server is very low on available memory and I can't make 1+1=2 on why that is. I suspect it's Tomcat/JVM (which I admittedly know precious little about). I am rebuilding this server (for a number of reasons) and plan to install 8GB, but solving this mystery is key to supporting/promoting my design plans.
Relevant info: I have very little memory left, I am swapping pretty hardcore, and even though I suspect the Tomcat/JVM stuff, it sure doesn't look like it from the memory tools. For that matter, it looks like "nothing" is using memory, or certainly not enough to cause such a low-memory problem. The server was rebooted about 24 days ago because it actually ran out of all virtual memory. How do I solve this mystery? Am I using the wrong tools? Am I misunderstanding my tools? What can I do to track down the processes depleting my memory?
I am looking for a free database with low memory usage that offers both InnoDB-like and memory-like engines, a C API, trigger support, and client/server support, for use on embedded Linux systems.
I am a bit worried about my Linux vserver box: no memory is left. To investigate this, I was looking at "top", but it deeply confuses me. It seems that no memory is left, although the process list in top never adds up to 100%.
I have a similar question: how do I make a VirtualBox shared folder when Ubuntu 10.10 is installed inside the VirtualBox machine? VirtualBox runs my guest machine and Linux Mint is my host machine. I have VirtualBox OSE installed in Linux Mint, and I have installed Windows XP/7 and made a shared folder from the Windows XP/7 guest. My host machine is Linux Mint/Ubuntu, i.e. it runs on my PC. How do I make a shared folder between an Ubuntu 10.10 LTS virtual machine in VirtualBox OSE and the Linux Mint 11 Katya host?
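In case it helps, a rough sketch of the guest-side half, assuming the Guest Additions are installed in the Ubuntu guest and the folder was named "share" in the VM's settings:
Code:
sudo mkdir -p /mnt/share
sudo mount -t vboxsf share /mnt/share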
I want to create a "shared memory" region in Linux, then create multiple "shared objects" that can access a table in it, for example: one of them can write something into the table and another can access and read it, so that these operations are handled by the programmer. I'm using Ubuntu 9.04 and I've set its runlevel to 3 (I have a command-line environment now). I've searched the Internet a lot but couldn't find a good sample code for this. I have no experience with it; can you point me to a sample and advise me how to compile and use it with GCC?
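Here is a minimal sketch of the kind of sample being asked for (my own illustration, with made-up names): a parent and a child process share a small "table" through a System V shared memory segment; the parent writes, the child reads.
Code:
#include <stdio.h>
#include <string.h>
#include <sys/ipc.h>
#include <sys/shm.h>
#include <sys/wait.h>
#include <unistd.h>

struct table {                 /* the shared "table" */
    int  rows;
    char note[64];
};

int main(void)
{
    int shmid = shmget(IPC_PRIVATE, sizeof(struct table), IPC_CREAT | 0600);
    if (shmid == -1) { perror("shmget"); return 1; }

    struct table *t = shmat(shmid, NULL, 0);
    if (t == (void *)-1) { perror("shmat"); return 1; }

    if (fork() == 0) {                   /* child: the reader */
        sleep(1);                        /* crude ordering; real code would use a semaphore */
        printf("child read: rows=%d note=%s\n", t->rows, t->note);
        shmdt(t);
        return 0;
    }

    /* parent: the writer */
    t->rows = 3;
    strcpy(t->note, "hello from the writer");
    wait(NULL);                          /* let the child finish */

    shmdt(t);
    shmctl(shmid, IPC_RMID, NULL);       /* remove the segment */
    return 0;
}
Compile and run with something like gcc -o shmdemo shmdemo.c && ./shmdemo (the file name is arbitrary).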
In Linux, how can I display the memory usage of each process when I do a 'ps -ef'? I would like to see the 'virtual memory', 'resident memory', and 'shared memory' of each process. I can get that via 'top', but I want the same info from 'ps' so that I can pipe the output to 'grep {my process name}'.
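A hedged sketch of the ps side of this (myprocess is just a placeholder): ps -eo lets you pick the columns, and vsz/rss cover the virtual and resident figures; as far as I know there is no standard ps column for the shared figure that top labels SHR.
Code:
ps -eo pid,vsz,rss,comm | grep myprocess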