Programming :: Resources Locking In C?
May 9, 2011

I am going to write two or more programs that will take control of the same resources on Linux. What are the common methods/functions for doing locking/synchronization in C?
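For separate programs (as opposed to threads in one process) the usual options are advisory file locks with flock() or fcntl(), POSIX named semaphores (sem_open()), or System V semaphores. Below is a minimal flock() sketch, assuming both programs agree on a lock-file path (the /tmp path here is only an illustration, not anything from the original post):

Code:
/* Minimal sketch: two independent programs serializing access to a shared
 * resource with an advisory file lock.  The lock-file path is an example. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/file.h>
#include <unistd.h>

int main(void)
{
    /* Any agreed-upon path works; both programs must use the same one. */
    int fd = open("/tmp/myresource.lock", O_CREAT | O_RDWR, 0666);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    flock(fd, LOCK_EX);          /* blocks until the other program releases it */

    /* ... exclusive access to the shared resource here ... */
    printf("holding the lock (pid %d)\n", (int)getpid());
    sleep(5);

    flock(fd, LOCK_UN);          /* release; closing the fd would also release it */
    close(fd);
    return 0;
}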
View 1 Replies

I am starting a new job and need to brush up on my expect scripting. Does anyone know of any online resources, e.g. telnet sites that I can play about with?
View 4 Replies View Related

Thanks to a recommendation and extensive help at phpbuilder.com, I have a multithreaded PHP script running in the cloud which fetches images and stores them in the cloud. It's fast and seems quite stable. However, I believe I'm faced with a situation where I need to introduce an additional lock to my code, so I'm faced with the task of managing multiple locks while avoiding deadlock. I was wondering if anyone had experience managing multiple locks in an MT environment and could recommend specific functions and data structures. I've been reading up on resource management in MT code and understand a few things:
1) Any shared resource (global and static vars, memory, file descriptors, etc.) generally needs to be protected by some kind of mutex, lock, or other sync var.
2) There are certain necessary conditions for deadlock to occur.
3) You generally need a sequence or hierarchy of your resources so that all threads request them in the same order. This sequence must be the same and immutable for all participating threads, whether local or remote.
4) Recursion is a very common cause of deadlock, so a process should know what locks it has acquired previously in order to avoid blocking in the attempt to re-acquire those same locks again.
5) There are a variety of algorithms described such as the Banker's Algorithm, the Chandy/Misra solution, etc. to help avoid deadlock.
I'm hoping to come up with a technique for properly handling multiple locks/mutexes/sync vars that I can re-use in the future, but I'm still coming to grips with the algorithm descriptions and am unsure precisely what sorts of data structures or functions I'll need.
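On point (3), the usual concrete technique is to give every lock a fixed rank and always acquire locks in rank order, releasing in the reverse order; with a single global order the circular-wait condition for deadlock can never arise. A minimal pthread sketch of that idea (the lock names and ranks here are illustrative, not taken from the original script):

Code:
/* Minimal lock-ordering sketch: every thread that needs both locks must
 * take lock_a (rank 0) before lock_b (rank 1), never the reverse. */
#include <pthread.h>

static pthread_mutex_t lock_a = PTHREAD_MUTEX_INITIALIZER;  /* rank 0 */
static pthread_mutex_t lock_b = PTHREAD_MUTEX_INITIALIZER;  /* rank 1 */

void update_both_resources(void)
{
    pthread_mutex_lock(&lock_a);     /* always first  */
    pthread_mutex_lock(&lock_b);     /* always second */

    /* ... touch both shared resources here ... */

    pthread_mutex_unlock(&lock_b);   /* release in reverse order */
    pthread_mutex_unlock(&lock_a);
}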
If I have two threads working on the same array but on different sections of it, can they both read and write without locks and not have any sort of problem, provided each thread only ever touches its own part and never affects the other's?
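In practice yes: if the element ranges are strictly disjoint and the threads are joined before anyone reads the combined result, no locking is needed (though elements that share a cache line can cost performance through false sharing). A small sketch under those assumptions:

Code:
/* Sketch: two threads filling disjoint halves of one array, no locks.
 * Safe because the index ranges never overlap and pthread_join()
 * synchronizes memory before the main thread reads the results. */
#include <pthread.h>
#include <stdio.h>

#define N 1000
static int data[N];

struct range { int start, end; };

static void *fill(void *arg)
{
    struct range *r = arg;
    for (int i = r->start; i < r->end; i++)
        data[i] = i * 2;                 /* only this thread touches [start,end) */
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    struct range lo = { 0, N / 2 }, hi = { N / 2, N };

    pthread_create(&t1, NULL, fill, &lo);
    pthread_create(&t2, NULL, fill, &hi);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    printf("data[0]=%d data[%d]=%d\n", data[0], N - 1, data[N - 1]);
    return 0;
}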
View 3 Replies View Related

I have a question about locking a mutex (pthread_mutex_t) in a signal handler function (installed by signal()) on Linux. It seems that if the mutex has been previously locked by another thread outside the signal handler function and then the signal handler tries to lock it, the whole process hangs.
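That hang is expected: pthread_mutex_lock() is not async-signal-safe, so a handler that tries to take a mutex already held by the interrupted thread (or any other thread) can deadlock the process. One common workaround is to keep signal context out of it entirely: block the signal in every thread and wait for it with sigwait() in a dedicated thread, where taking a mutex is perfectly legal. A minimal sketch of that pattern (SIGUSR1 is just an example):

Code:
/* Sketch: handle SIGUSR1 in a dedicated thread with sigwait() instead of a
 * signal handler, so the mutex is only ever taken in normal thread context. */
#include <pthread.h>
#include <signal.h>
#include <stdio.h>
#include <unistd.h>

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *signal_thread(void *arg)
{
    sigset_t *set = arg;
    int sig;

    for (;;) {
        sigwait(set, &sig);              /* blocks until SIGUSR1 arrives */
        pthread_mutex_lock(&lock);       /* safe: ordinary thread context */
        printf("got signal %d\n", sig);
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    sigset_t set;
    pthread_t tid;

    sigemptyset(&set);
    sigaddset(&set, SIGUSR1);
    pthread_sigmask(SIG_BLOCK, &set, NULL);   /* block it in every thread */

    pthread_create(&tid, NULL, signal_thread, &set);

    /* ... rest of the program; the mutex can be used normally here ... */
    pthread_kill(tid, SIGUSR1);               /* deliver one signal to demonstrate */
    sleep(1);                                 /* give the thread time to print */
    return 0;
}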
View 5 Replies View Related

I have a linked list that two threads work on simultaneously. The first thread is adding elements to the end of the linked list while the second is removing them from the front. Can this be done without a lock on the linked list head when elements are being added/removed?
I think this lock is causing a performance hit in my application. If there isn't any safe way without it then that's fine, but I thought I would check. The first thread uses this function to add elements to the list. Full source here. [URL]
Code:
/* Lets add the new packet to the queue. */
pthread_mutex_lock(&workers[queuenum].queue.lock); // Grab lock on queue.
if (workers[queuenum].queue.qlen == 0){ // Check if any packets are in the queue.
[code]...
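One well-known way to cut the contention without going fully lock-free is the two-lock queue (Michael & Scott): the producer only ever takes a tail lock and the consumer only a head lock, and a permanent dummy node keeps them from ever touching the same pointer. A stripped-down sketch of the idea (the names are illustrative, not from the linked source):

Code:
/* Two-lock queue sketch: enqueue and dequeue never contend on the same
 * mutex because a dummy node separates head and tail. */
#include <pthread.h>
#include <stdlib.h>

struct node { struct node *next; void *data; };

struct queue {
    struct node *head, *tail;          /* head always points at a dummy node */
    pthread_mutex_t head_lock, tail_lock;
};

void queue_init(struct queue *q)
{
    struct node *dummy = calloc(1, sizeof *dummy);
    q->head = q->tail = dummy;
    pthread_mutex_init(&q->head_lock, NULL);
    pthread_mutex_init(&q->tail_lock, NULL);
}

void enqueue(struct queue *q, void *data)              /* producer thread */
{
    struct node *n = calloc(1, sizeof *n);
    n->data = data;
    pthread_mutex_lock(&q->tail_lock);
    q->tail->next = n;
    q->tail = n;
    pthread_mutex_unlock(&q->tail_lock);
}

void *dequeue(struct queue *q)                         /* consumer thread */
{
    pthread_mutex_lock(&q->head_lock);
    struct node *dummy = q->head;
    struct node *first = dummy->next;
    if (first == NULL) {                               /* queue empty */
        pthread_mutex_unlock(&q->head_lock);
        return NULL;
    }
    void *data = first->data;
    q->head = first;                                   /* first becomes the new dummy */
    pthread_mutex_unlock(&q->head_lock);
    free(dummy);
    return data;
}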
I have an OpenSuSE 11.1 box that is running MySQL and Apache. My database is only about a megabyte in size and I only have a few users per day on my site. How is it that, with 8GB of RAM, over 5GB is being used?
View 5 Replies View Relatedthere slackers, Can you give me some resources on using TOR? [URL]... I have used TOR and privoxy using a tutorial online, however, if you have some good links, I will study them.
View 2 Replies View RelatedDoes anyone know of any decent web guides that will help me set up a DNS service running on 127.0.0.1 that will automatically forward requests like [url] to httpd running on 127.0.0.1 yet forwards requests to [url]to a 'proper' DNS service?
View 5 Replies View Related

Fedora 15 Alpha
 PID  USER  PR  NI  VIRT   RES   SHR  S  %CPU  %MEM   TIME+    COMMAND
3861  user  20   0  904m  128m   33m  S   0.7   6.4   1:11.52  xulrunner-bin
1323  user  20   0 1555m   95m   31m  S  13.5   4.8   4:06.87  gnome-shell
3494  user  20   0 1028m   50m   21m  S  12.8   2.5   1:43.32  evolution
I was just wondering what the difference is between RES, SHR, and VIRT.
1) VIRT always seems to be the highest. Does this include the paging/swap space (virtual memory on the hard disk)?
2) Is RES the actual physical RAM the process is using?
3) Is SHR memory that is shared with other processes?
4) Just a final question. As I am running on an HP Mini 210, memory and CPU are resources I don't have in abundance. So if I were to compare, for example, two different browsers, i.e. Firefox and Midori, what should I benchmark between the two to find which one uses fewer resources?
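Roughly: VIRT is the total address space the process has mapped (including file mappings and memory that may never touch RAM), RES is the portion currently resident in physical RAM, and SHR is the resident part shared with other processes (shared libraries, shared memory). top reads these from /proc/<pid>/statm; a tiny C sketch that prints the same three numbers for the current process (page-size arithmetic assumed, nothing here is from the original post):

Code:
/* Sketch: read VIRT/RES/SHR for the current process from /proc/self/statm.
 * statm reports sizes in pages; multiply by the page size to get bytes. */
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    long size_pages, resident_pages, shared_pages;
    FILE *f = fopen("/proc/self/statm", "r");
    if (!f || fscanf(f, "%ld %ld %ld",
                     &size_pages, &resident_pages, &shared_pages) != 3) {
        perror("statm");
        return 1;
    }
    fclose(f);

    long page_kb = sysconf(_SC_PAGESIZE) / 1024;
    printf("VIRT %ld KiB  RES %ld KiB  SHR %ld KiB\n",
           size_pages * page_kb, resident_pages * page_kb, shared_pages * page_kb);
    return 0;
}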
I have a fairly powerful media system standing in the living room. Most of the time it is idle, maybe playing some music. At the same time I have a PC in my office that is less powerful. Sometimes I'll run programs over ssh with the -X and -C options. That way a program appears to run on my office PC while it is actually running on the media system. However:
- 1 Sound doesn't transfer to the office pc.
- 2 When using a program I need to constantly be aware where I save my files. Sometimes it is located on the remote computer, sometimes it isn't.
Is there an alternative way to use the resources on the media system? E.g. run a program that is stored on the office PC but have it execute on the media system?
Both systems run Ubuntu (office 10.04 - media system 10.10).
I just recently upgraded to 10.04 (LTS), and I have been using the StarDict app for a long time with nothing but success on each of the laptops and computers I have had over the past 5 or so years. However, now on one laptop, after starting StarDict, using the applet I create on my xfce4-panel (as per normal), and looking up a word (highlight and press the Shift key, which I use for quick word search in StarDict), the app begins using 100% of the CPU and I am forced to kill the process.
The problem is intermittent, meaning it only occurs half of the time I am using StarDict, while the other half of the time it operates normally. I have been trying to see if there is some sort of pattern, perhaps other apps running at the time the problem occurs, but I have not noticed anything that looks suspect, since I generally fire up StarDict right after booting and have had the problem happen without even any other apps open (just testing to try to find some sort of pattern to locate the potential issue).
I use a standard set of dictionary files for each of my comps, and currently have stardict operating fine on 2 other laptops, both running 10.04, with the same set-up (xubu, same apps, and settings pretty much) and no indications of this problem on either (other) laptop.
I installed Ubuntu 10.10 a few days ago, which ran very well from the live CD, but after I upgraded to 11.04, Ubuntu uses just under 100% of my CPU after a few minutes logged in, and my RAM usage increases by about 5MB per second starting the second I log in. I am using Classic GNOME and it seems to do this whether I have Metacity or Compiz turned on. Does anyone know what is going on, or know of a way to lower either my CPU or RAM usage?
View 3 Replies View Related

I was thinking of converting from OpenProj to KPlato, but I can't edit the resource I've added. Even when adding the resource, all I could do was add the name and work type.
I didn't find anything in a Google search, at koffice.org/kplato, or in the desktop help.
How do I edit the initials, email, available... all the other resource fields?
I have a problem with kmix, which uses 100% of my processor without stopping, and this is causing the temperature to rise above 80 degrees Celsius. The problem appeared after one of the 11.4-series updates.
View 9 Replies View Related

Does Ubuntu use fewer resources than Windows 7/Vista? What exactly does this mean? When I'm using Ubuntu, my laptop gets somewhat hot, but when I'm on Windows 7, it doesn't.
View 9 Replies View Related

I don't know if anyone has used Damn Small Linux (DSL), but they've got a cool little system resource monitor built into the desktop in the latest version (not the old wmnet and wmcpu they had before, and not the ones that come with Ubuntu). Attached is a screenshot; it's in the upper right corner, outlined in red.
Was wondering if anyone knew what app this is. I checked their package list and didn't see it; I don't know if they modified wmnet/wmcpu or something, but I think it'd be cool to have this on my desktop. Especially if it's from DSL, because that is a tiny distro that runs fast, so it can't take up too much room.
I've been using 11.04 for a few weeks now on my laptop. All of a sudden this morning, when I boot, it tells me that my laptop doesn't have enough resources to run Unity, although I've been using it the whole time. It's dropped me back to the old interface. Does anyone know why it did this?
How can I set it back?
I have 2 front ends that receive traffic (HTTP server) and should run some scripts from crontab. Some of the scripts should be run by only one server at a time (the active one) and others should run on both. Since the HTTP traffic is load-shared, I think I can't use heartbeat for that, right? Is heartbeat only for active-standby, or can it be used as a watchdog in an active-active setup? I have a Cisco CSS to load-share the HTTP traffic, and I can write a watchdog script for Apache. For the cron control, I was thinking of making a script that replaces the crontab file with whichever one is correct.
When heartbeat starts, what parameter is passed to the scripts listed as resources? A "start" on the active node and nothing on the standby, or always "start"? How should I configure haresources to do this? What is the best way? I have another situation: I'm building an NFS server on Solaris 10, with 2 servers and shared disks (a Sun array). Can I use heartbeat for this too? Is it possible to set it up so that if the NFS server fails over, the clients don't need to reconnect?
For some reason, after the computer has been on and idle for a little while, Xorg starts using all available CPU resources. When I come back to use the computer, the screensaver will either be running nice and smooth, or the screen will be black. Either way, I won't be able to get back into X. I'll Ctrl+Alt+F1 into the console (or ssh in if it's not responsive) and "top" shows me Xorg is hogging all the available CPU. I'll kill -3 the Xorg process and the computer will come back to the KDM login screen. What the %^&*??? It's a little irritating. I'm using Bouncing Cows and SETI@home when idle, but both do their job and close when they detect a key press. Even via ssh, after I've already tried waking the computer with a key press, it shows the screensaver closed and SETI idle.
I have Debian/Lenny installed with all the latest updates and ATI's driver installed with direct rendering enabled (350fps for the cows, >2000fps for glxgears). It's a Dell Inspiron 8600, 1.8GHz, 2GB RAM, 80GB HD, ATI 9600 Pro/128MB.
How would I demonstrate that sensitive application resources are not shared across processes that are owned by different users?
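One straightforward demonstration is to try, as one user, to read the address space of a process owned by a different user: the kernel refuses, because /proc/<pid>/mem (like ptrace) is only accessible to the owning UID or root. A small sketch of that check (the PID is supplied on the command line; this is just one possible demonstration, not the only one):

Code:
/* Sketch: attempt to open another process's memory via /proc/<pid>/mem.
 * Run it against a PID owned by a different (non-root) user and the open
 * fails with EACCES/EPERM, showing that one user's process memory is not
 * readable by another user's processes. */
#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <pid>\n", argv[0]);
        return 2;
    }

    char path[64];
    snprintf(path, sizeof path, "/proc/%s/mem", argv[1]);

    int fd = open(path, O_RDONLY);
    if (fd < 0) {
        printf("cannot read %s: %s\n", path, strerror(errno));
        return 1;
    }
    printf("opened %s (same user or running as root)\n", path);
    close(fd);
    return 0;
}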
View 2 Replies View Related

On an HA cluster, the 2nd node decided that the 1st node was down, but the first node wasn't actually down. As a result, the 2nd node tried to take over the resources but failed, because the resources were still in use by the first node. This left the first node behind in a fuzzy state. I had no choice but to kill the heartbeat service and reboot the server to solve the issue. There were no network issues, and all hardware is OK. Are there any known bugs? Is there a way to avoid this happening again?
View 1 Replies View Related

A few months ago I had a problem with Apache on my server. Some requests required a few gigabytes of memory, much more than was available. I set ulimit -v XXX and this fixed the problem for a single request.
I still have a problem with multiple requests. I'm using mod_perl, which runs a few processes to serve requests; we can assume there is one process per request. ulimit cannot handle this, because it can only limit virtual memory per process, not for a group of processes.
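That per-process behaviour is easy to see in a toy example: each forked worker inherits the same RLIMIT_AS, so every one of them can allocate up to the limit independently and the group as a whole blows well past it. A small sketch (the sizes and worker count are made up; capping a whole group of processes is what the cgroups memory controller is designed for):

Code:
/* Sketch illustrating why a per-process limit (ulimit -v / RLIMIT_AS) does
 * not cap a group of processes: each forked child inherits the same limit
 * and can allocate up to it independently, so the group total can be N
 * times the limit.  Numbers here are illustrative only. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/resource.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    /* ~512 MiB of address space per process, like "ulimit -v 524288" */
    struct rlimit rl = { 512UL << 20, 512UL << 20 };
    setrlimit(RLIMIT_AS, &rl);

    for (int i = 0; i < 4; i++) {                  /* pretend mod_perl workers */
        if (fork() == 0) {
            size_t sz = 400UL << 20;               /* well under the per-process cap */
            char *p = malloc(sz);
            if (p) memset(p, 1, sz);               /* touch it so it really counts */
            printf("child %d allocated %s\n", (int)getpid(),
                   p ? "400 MiB" : "nothing");
            _exit(0);
        }
    }
    while (wait(NULL) > 0)                         /* group total: ~1.6 GiB, not 512 MiB */
        ;
    return 0;
}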
Is it possible to prioritize internet resources for a particular application or a set of applications and control their maximum bandwidth, etc ?
I have a download manager and a BitTorrent client, and I want to prioritize resources for the BitTorrent client followed by the download manager, but I want them to collectively stay below a manually defined maximum speed, since the other apps need some bandwidth too.
I just installed CentOS 5.2 and then applied the updates to 5.3. Now I get repeated popups saying:
The NetworkManager applet could not find some required resources. It cannot continue.
I killed it and ran it from a terminal and saw the following output:
** (nm-applet:4648): WARNING **: Icon nm-device-wwan missing: Icon 'nm-device-wwan' not present in theme
** (nm-applet:4648): WARNING **: No connections defined
I am not an expert and have not been managing my server for very long. My server is running kind of slow, so someone suggested running the 'top' command via shell, and I found a few things using major resources, but I don't know what they are or how to fix them. Can someone suggest some things?
View 6 Replies View Related

Since I've moved the /home directory to a new mountpoint
(UUID=160b687f-3e30-472e-97c2-7fd6149ec2a0 /home ext4 nodev,nosuid 0 2 in fstab)
I have had trouble with the links in the Places menu: all the links there that point to /home or to one of its subdirectories no longer work.
The links to USB drives and other places work just fine.
I've tried reinstalling xdg and gnome-menu, but no success.
if I do cat .xsession-errors right after trying to click on one of the links in the menu this is what I see:
giulio@giulio:~$ cat .xsession-errors
...
/home/giulio: /home/giulio: is a directory
After I reboot, my java process consumes 100% of the CPU, then settles down to about 40% CPU and 12% memory, status sleeping (4-core AMD). I've removed OpenJDK and installed the Sun JRE, but no difference. By comparison, Firefox with a lot of tabs is at 3% CPU and 5% memory. Would it be better if I went to 10.04 LTS, 32-bit?
View 3 Replies View Related

The PC is an AMD Duron with 370 MB of memory and an 80 GB Maxtor HD, running Ubuntu 10.10 Maverick with Linux kernel 2.6.35-24-generic.
Gnome 2.32.0
How can I analyze this screen:
Code:
In particular: what are all the errors, what is the maximum amount of RAM supported and of what kind, and where are the physical slots for the three DIMMs that are missing from the list?
Is there a way to run .exe programs (with a bunch of resources) in Ubuntu 11.04 (Natty)?
View 3 Replies View Related