Fedora Servers :: Getting A Cheap UPS, The Powercuts Will Last Usually A Maximum Of 30 Seconds?
Jan 17, 2009
I'm out in a village and we get more power cuts than I'd like, and recovering the journal on my server is getting rather irritating. I'm looking into getting a cheap UPS; the power cuts usually last a maximum of 30 seconds, so I only need a few minutes of runtime.
I've been looking at:
Plexus V 500VA UPS
Plexus V 1200VA UPS
APC SUA750I Smart-UPS 750VA
I know next to nothing about these things, so my question is: will those work with a machine that has a 700W PSU? How do you tell? The 500VA rating doesn't really mean much to me. Ideally I'd like to get my desktop on there too, but that's more for convenience than anything.
Will any of these do the job? Are there any Linux compatibility issues I should plan for? Any recommendations from personal experience are greatly welcome.
Edit: I would be happy with a UPS that can inform Linux that the power is down and get the server to shut down cleanly straight away. I'm more interested in a clean shutdown than in keeping the machines usable during the outage.
Edit 2: Can UPS devices be piggybacked onto one another to provide extra uptime? i.e. could I run two of the 30s so that when the first runs down, the second carries on?
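For the clean-shutdown requirement, the usual approach is a monitoring daemon that watches the UPS over USB or serial and triggers the shutdown for you. A minimal sketch with apcupsd, assuming one of the APC units (the Plexus models would more likely need NUT instead); the relevant lines in /etc/apcupsd/apcupsd.conf:

UPSCABLE usb
UPSTYPE usb
DEVICE
TIMEOUT 30          # start a clean shutdown after 30 seconds on battery
BATTERYLEVEL 5      # ...or when the charge drops below 5%
MINUTES 3           # ...or when the estimated runtime falls below 3 minutes

Whichever condition trips first, apcupsd calls its apccontrol script, which runs a normal shutdown -h, so the journal recovery problem goes away even on outages longer than the battery.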
I want to try to set up an old, cheap computer with Ubuntu and run it as a home server to toy around with. How do I get started? (Where do I get an old computer? Craigslist? What specs should I be looking out for? Wireless vs. Ethernet? What software should I use? SSH?)
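On the SSH question, that part at least is just a package away on Ubuntu; a minimal sketch (the IP address is a placeholder for whatever the server gets on your LAN):

sudo apt-get install openssh-server
# then, from any other machine on the network:
ssh yourusername@192.168.1.50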
My system clock loses about 10 seconds every minute. The hwclock is fine. I've tried different kernel args (clocksource=acpi_pm, nohz=off, highres=off). None of these have any effect. I am running Fedora 11 with kernel 2.6.30-105.2.23.fc11.x86_64 on an AMD Istanbul node (Processor 2439 SE).
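For what it's worth, the active timer source can be inspected and switched at runtime, which saves rebooting for every boot-argument experiment (hpet/tsc availability depends on the hardware):

cat /sys/devices/system/clocksource/clocksource0/available_clocksource
cat /sys/devices/system/clocksource/clocksource0/current_clocksource
# switch on the fly, e.g. to hpet if it is listed as available:
echo hpet > /sys/devices/system/clocksource/clocksource0/current_clocksource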
With three 1.5TB, 7200RPM drives in RAID0, they thrash. And yet the network out, four gigabit ports LAG'd together to create a single 4-gigabit connection, can't even push a single gigabit per second.
Here's what I've done so far:
I've enabled jumbo frames on the bond: ifconfig bond0 mtu 9000
Tweaked SAMBA performance:
Tweaked hdparm:
I haven't enabled jumbo frames on the switch but I'm almost sure that won't help me much after trying all this.
I'm running out of ideas here, guys. The connected clients are pulling down images in both Ghost and WIM (ImageX) format, and the files are large, upwards of 12 gigabytes.
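One way to narrow this down is to test the disk path and the network path separately, and to confirm jumbo frames actually survive end to end. A rough sketch (the device name and addresses are placeholders, and iperf assumes you can install it on both ends):

ping -M do -s 8972 192.168.1.20                 # 9000-byte frames minus 28 bytes of headers; fails if any hop drops them
dd if=/dev/md0 of=/dev/null bs=1M count=4096    # raw array read speed, outside Samba
iperf -s                                        # on the server
iperf -c 192.168.1.10 -P 4                      # on a client: raw network throughput over 4 streams

If the dd and iperf numbers are both fine, the bottleneck is Samba itself; also note that a single client's traffic typically hashes onto one link of a LAG, so one client will never see more than a gigabit.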
I'm writing a client-server program. There are more than 500 clients. I start a thread to process and respond to each client, and the processing involves some MySQL queries. I'm looking for any possible hazards on my server!
1- Is there any limit on the maximum number of simultaneous socket connections?
2- Is there any limitation on using MySQL?
3- Since sockets on Linux are files, is there any limit on the number of sockets or threads?
I'm using a Linux server (CentOS, Fedora, or Ubuntu), and the clients are both Linux and Windows.
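A few of the limits worth checking up front, as a sketch (the mysql line assumes you can log in with enough privileges to read server variables):

ulimit -n                          # per-process file descriptor limit; each socket uses one
ulimit -n 4096                     # raise it in the shell that launches the server
sysctl fs.file-max                 # system-wide open-file ceiling
sysctl net.core.somaxconn          # cap on the listen() backlog
mysql -e "SHOW VARIABLES LIKE 'max_connections';"    # MySQL's own cap, commonly only 100-151 by default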
I'm running nginx for static files and as a proxy server for a Comet IM server on Ubuntu Jaunty. Under high load I'm hitting a limit of 1024 file descriptors. I've tried increasing this limit but still can't get past 1024. Does "more /proc/sys/fs/file-nr" give me the global count of used file descriptors? If this is the global count for the machine, and each user should be allowed at least 1024 file descriptors by default, why do I see a maximum of 1024 open file descriptors in /proc/sys/fs/file-nr? Is there a way to increase the limit while the server is running?
Some relevant info on my server:
sudo more /proc/sys/fs/file-nr
1024    0    38001
sudo sysctl fs.file-max
fs.file-max = 38001
sudo nano /etc/security/limits.conf
...
*    hard    nofile    30000
*    soft    nofile    30000
I also added this to /usr/local/nginx/conf/nginx.conf:
worker_rlimit_nofile 10240;
and uncommented the following line in /etc/pam.d/su:
session required pam_limits.so
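One detail that often explains the stuck-at-1024 symptom: limits.conf is applied by PAM at login time, so an nginx launched from an init script never inherits it. A sketch of checking what the running workers actually got and of raising it without a reboot (paths assume the /usr/local/nginx layout above):

cat /proc/$(pidof -s nginx)/limits | grep "open files"    # the limit the process really has
ulimit -n 30000                                           # raise it in the shell that starts nginx
/usr/local/nginx/sbin/nginx                               # then restart; worker_rlimit_nofile applies to the new workers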
I have a script which copies (scp) a .war file to Tomcat's webapps directory, and the .war file extracts, creating a folder under webapps, on an Ubuntu system. The next line of the script restarts Tomcat, and it is executed immediately, before the .war has been extracted, which is causing problems.
Is it possible to make it wait for a few seconds or minutes before it executes the next line of the script? The script uses normal Ubuntu commands.
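If a fixed pause is good enough, sleep does it; polling for the exploded directory is a little more robust. A sketch (myapp and the tomcat6 paths are placeholders for your deployment):

sleep 60                                          # simplest: wait a fixed minute
while [ ! -d /var/lib/tomcat6/webapps/myapp ]; do # better: wait until the .war is actually unpacked
    sleep 5
done
sudo /etc/init.d/tomcat6 restart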
I run KVM on a 9.10 host with a 9.10 guest; both are x64. When I halt the guest OS, my host loses its internet connection for 10-15 seconds and then it comes back. Probably something goes wrong with the default gateway, as it loses only the external network (I can still reach the host on my internal network). I've googled a lot but can't find anything about this. I use bridged networking.
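One guess worth checking: when the guest's tap interface leaves the bridge, spanning tree can put the bridge back into the listening/learning states for the length of the forwarding delay, which defaults to 15 seconds and matches this symptom closely. A sketch (br0 is a placeholder for your bridge name):

brctl show
brctl showstp br0        # look at the forward delay and port states
brctl stp br0 off        # STP buys nothing on a bridge with a single uplink
brctl setfd br0 0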
I'm looking for a VPS provider in the US. I'm just trying to get around geolocation restrictions, and I figured that the easiest way would be to get a cheap VPS and set up Squid. (I'm planning on sharing it with a few friends.) Does anybody have any experience with any VPS providers?
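For what it's worth, restricting Squid to yourself and your friends is only a few lines in /etc/squid/squid.conf; a sketch with placeholder addresses:

acl friends src 203.0.113.10 198.51.100.22 192.0.2.0/24
http_access allow friends
http_access deny all
http_port 3128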
I have an ASUS P5GC-MX/1333 and I want to install the driver for my Realtek audio, because in Windows 7 the volume is very loud, but in Fedora even at the maximum it is very low.
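Before hunting for a driver it may be worth confirming that the ALSA mixer channels are actually up; on many boards a channel other than Master (PCM or Front, if the codec has them) sits low even when Master shows 100%. A sketch:

amixer sset Master 100% unmute
amixer sset PCM 100% unmute      # skip if this codec has no PCM control
alsamixer                        # interactive view of every channel on the card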
My current screen resolution is 1200x800, but from my monitor's manual I can see that it also supports 1366x768, 1280x960, and 1280x1024. There is no option to change to these in my Fedora 12, though. How can I change that?
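If the driver simply isn't offering the mode, xrandr can usually add it at runtime. A sketch: run cvt yourself and paste the Modeline it prints (it rounds 1366 up to 1368), and replace VGA1 with whichever output name xrandr reports:

cvt 1366 768
xrandr --newmode "1368x768_60.00" 85.25 1368 1440 1576 1784 768 771 781 798 -hsync +vsync
xrandr --addmode VGA1 "1368x768_60.00"
xrandr --output VGA1 --mode "1368x768_60.00"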
I'm looking for a cheap tablet that could possibly run Ubuntu, with a price range of 50-140 euros. I've searched everywhere on the internet and didn't find much beyond some used eBay ones. Could Ubuntu 10.10 run on an ARM11/Cortex chipset?
I have a laptop running Ubuntu (not the problem) and I want to connect to a FTP server. Filezilla works really well and so does good old ftp. But, the FTP server I need to connect to will only accept connections from a white-list of addresses. My office IP is static and is on the list. Thus, connecting to this FTP server from the office is a piece of cake. But, I want to connect to this FTP server from home. Connecting to it directly from home is impossible because I have a dynamic IP address here at the house (surprise) so we can't just add my home IP address to the list.
Fortunately, I have full admin rights on the network at my local office, and I installed Cygwin on the Windows 2008 server. I can successfully connect to it using ssh. That gets me half way there: I can securely connect to a machine that is allowed to talk to the FTP server. Now I want to figure out how to forward my FTP port over ssh, through the server at the office, to the FTP server that I really want to connect to. I've tried various incantations of ssh and I can't seem to come up with the right combination. Does anyone have any experience doing something like this? In effect, I want to use ssh as a simplistic proxy, and it need only really handle a single port.
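A sketch of two approaches (hostnames are placeholders). Be aware that a single -L forward only carries FTP's control channel, so passive-mode data connections will not follow it; the SOCKS option sidesteps that:

ssh -L 2121:ftp.example.com:21 you@office-server    # control-channel-only tunnel
ftp localhost 2121
ssh -D 1080 you@office-server                       # or: a SOCKS proxy on localhost:1080
# then point FileZilla (or any SOCKS-aware client) at SOCKS proxy 127.0.0.1:1080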
Is there a limit to the number of files ext3 can support?
The reason I'm asking is that on one of my internal drives I have around 750,000 files. The drive is 500GB and currently uses 150GB... I noticed recently that when I try to copy a new directory or file, the transfer rate is at times extremely slow. It is SATA II, and sometimes it gets as low as 500KB/s (yes, KB!).
Would somebody please shed some light?
I noticed it might be related to the gvfsd-metadata process.
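Two quick checks, as a sketch (/dev/sdb1 stands in for the drive): ext3's real ceiling is the inode count fixed when the filesystem was created, not any per-directory file limit, and gvfsd-metadata keeps a database that can grow large enough to thrash the disk:

df -i /dev/sdb1                              # inodes used vs. available
sudo tune2fs -l /dev/sdb1 | grep -i inode    # the filesystem's inode figures
du -sh ~/.local/share/gvfs-metadata          # size of gvfsd-metadata's store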
I recently switched from Ubuntu to Fedora and have a screen brightness issue with my HP Pavilion dv3 notebook. The brightness is blinding and I cannot adjust it. The Fn keys work and respond, but the screen stays the same.
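It may be worth checking whether writing to the kernel's backlight interface moves the panel at all, which narrows it down to a hotkey problem rather than a driver one. A sketch (acpi_video0 is only one possible entry; use whatever the first command lists):

ls /sys/class/backlight/
cat /sys/class/backlight/acpi_video0/max_brightness
echo 3 | sudo tee /sys/class/backlight/acpi_video0/brightness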
Guys, when I double-click the title bar (where the close, maximize, and minimize buttons are), the window maximizes and then will not restore. I don't know what else to try and I'm not managing to solve this. Does anyone have any idea what this could be?
I upgraded to FC15 in early June; come June 15, Thunderbird suddenly ceased to send or receive any email. It pops up with the message "You may have exceeded your maximum connections". I've tried changing the cached connection count to 1 instead of the default 5, and everything every other post about that error suggests, to no avail. Funny thing: my primary desktop is still on FC13 with the same Thunderbird version, arch, and account setup (with SSL enabled), and it works flawlessly. Even with -safe-mode enabled it doesn't work. My Lightning calendar also hasn't synced since then.
I have found that if I turn SSL off, IMAP works again; SSL SMTP still fails. If I turn SMTP SSL off, it sends mail. The calendar is an https (SSL) link. Obviously the issue is SSL and Thunderbird on FC15. There are no errors or messages in the console, the maillog, or the built-in Error Console. I will NOT use mail without SSL, as I may occasionally use unencrypted WiFi.
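A way to exercise the SSL handshakes outside Thunderbird and see any certificate or protocol errors directly, as a sketch (mail.example.com stands in for your IMAP/SMTP host):

openssl s_client -connect mail.example.com:993                    # IMAPS
openssl s_client -connect mail.example.com:465                    # SMTPS
openssl s_client -connect mail.example.com:587 -starttls smtp     # submission with STARTTLS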
How would I build a cheap and fanless server? Its main uses would be web and file serving, but there could come a day when I'd like to add some streaming and mail capabilities as well.
I'm currently searching for good FTP storage (with SFTP support) to back up my stuff. I had a look at Amazon's S3. That looks good, but maybe a bit pricey. Do you guys have any ideas?
I have about 60 old PCs that have an 8X AGP slot. What used/old AGP video cards are there that are supported, that are cheap, and that will provide 1280x768 (or thereabouts) for a generic widescreen LCD display? I've seen several Dell nVidia 5200 64MB AGP cards for under $10.
So I got myself a domain. I would like to have SSH access to the hosted account, free email, storage, and bandwidth, and I don't want ANY ads. Can anyone save me lots of headache and point me in the right direction?
Does anybody have any experience with the Virtual 7.1 USB sound interface dongle with Linux? They are selling on several sites pretty cheap, and the ads say they have Linux drivers.
Whenever I play streaming videos (mpg, wmv, etc.), the volume moves from where it's set to the maximum level. Then I have to adjust the sound. If I play another streaming video it does it again. I'm using F11 with GNOME. How do I stop this from happening? No matter how I adjust the player's volume, it starts off at the maximum level. This is the sound hardware I'm using: Audio device: nVidia Corporation MCP73 High Definition Audio (rev a1)
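This sounds like PulseAudio's flat volumes, which allow an application's volume to drag the master level up to 100%. A sketch of disabling it, assuming F11's PulseAudio honours the option:

# add to /etc/pulse/daemon.conf (or ~/.pulse/daemon.conf):
#   flat-volumes = no
pulseaudio --kill
pulseaudio --start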
I am looking to build a new desktop. What is the lowest-end video card that fits the following:
Supports two monitors at 1920x1200 or 1600x1200. Works with Linux.
3D performance isn't much of an issue, since I don't play computer games. I use the computer mostly for programming, which is why I like having the large resolution, so I don't have to scroll around so much.
I'm looking for a PCI RAID card that will support 4 SATA disks in RAID 5.
Essentially, it must be possible to monitor the RAID from the Linux operating system (FC10+). I will not have physical access to the machine most of the time, so I need to be able to talk to the RAID card from Linux (like the 3ware ones allow). It is possible to get reasonably priced RAID cards that can be monitored from Windows; I need the same from Linux.
Ever since upgrading to Fedora 14, my system has been almost unusable. I can only have a few windows of anything open before I start getting messages about:
Maximum number of clients reached
Maximum number of clients reached
Maximum number of clients reached
Maximum number of clients reached
xwininfo: unable to open display ':0.0'
I am then unable to open any new windows, and lots of programs crash.
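The X server has a hard cap on simultaneous client connections, so the usual cause is one program leaking connections or resources. A couple of ways to look for the culprit, assuming the xrestop package is installed:

xrestop                  # per-client X resource usage; a leaker floats to the top
xlsclients | wc -l       # rough count of clients attached to the current display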