We have quite a few SEO clients who require multiple IPs (10-50 per project), all drawn from different class C allocations, used for projects lasting weeks to months at a time. Since the IPs aren't in contiguous ranges we can't use the range scripts, and I don't know of a way to easily add a long list of individual IPs. These are CentOS servers, by the way.
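On CentOS a small shell loop can attach each address directly; a sketch, assuming a file ips.txt with one IP per line and eth0 as the interface (both are examples, and this isn't persistent across reboots - for persistence you'd generate ifcfg-eth0:N alias files the same way):
Code:
# add each IP as a /32 alias on eth0
while read -r ip; do
    ip addr add "$ip/32" dev eth0
done < ips.txt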
I'm trying to sync the clock of an Ubuntu desktop with the network using ntpdate -u <ntp-server>, and in a matter of minutes it grows an offset of a few hundred milliseconds, sometimes more than a second. Is there any way to guarantee that the time keeps synchronizing with the network more frequently? Does it make sense that the offset grows so fast? Am I doing something wrong?
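Running ntpdate from cron will at least repeat the correction, though an offset that grows within minutes usually means the local clock is drifting and ntpd (which disciplines the clock continuously) is the better fix. A sketch of the cron approach; the file name and server are examples:
Code:
# /etc/cron.d/ntpdate (hypothetical file): resync every 10 minutes
*/10 * * * * root /usr/sbin/ntpdate -u pool.ntp.org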
I'm looking for a way to quickly remove all data/partitions/boot records from my hard drives while running Linux (distribution is irrelevant). There are lots of ways to do this that I know of, but they all have some problems. Here's a list of what I've tried/thought of already. The most obvious is fdisk: simply delete all the partitions. This usually works just fine and is very quick, but there are times it just doesn't... I'm really not sure what gets left behind... I remove the MBR as well... but whatever it is, it's in the way. A couple of other options are:
Both of these approaches are great if you're selling the components and want to make it very difficult for anyone to recover data. The drawback is they take so very long to run. I've got four 1.5 TB drives that I've been writing zeros to for 2 days now. If you thought watching grass grow or paint dry was boring, try this. A hundred years ago or so, when I was doing tech support for Windows 95 users, we used this nifty DOS-based debug script to wipe the hard drive. It was sort of a last-resort thing, but it worked beautifully, most of the time. If the customer had already formatted, fdisked, run fdisk /mbr, and reinstalled Windows, but still couldn't get the thing to work, this would clean the drive so you could do a fresh install.
Just in case someone wants this, I'll post it. To use: first boot to some type of DOS environment in which you have the program "debug".
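For a quick modern-Linux equivalent of that trick (when a full zero-fill is overkill), zeroing the start of the disk kills the MBR and partition table in seconds; a sketch, obviously destructive, so double-check the device name:
Code:
# wipe the MBR, partition table, and first MiB
sudo dd if=/dev/zero of=/dev/sdX bs=1M count=1
# newer util-linux can also erase filesystem/partition signatures directly
sudo wipefs -a /dev/sdX
# note: GPT disks keep a backup table at the *end* of the disk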
Usually, to view individual CPU percentages, we press '1' in top's interactive mode.
However, I am not able to figure out how to get the same output in batch mode, i.e.,
top -n1b
I am redirecting this output to a file to view it later and stuff like that, so I need the batch mode. Is it possible? Installing a separate tool for this is not gonna be possible.
I'm looking for an option that I am missing or some way to capture the output.
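One trick that may work with procps top (hedged - behaviour varies by version): toggle the per-CPU view interactively and save the configuration with 'W', which writes ~/.toprc; batch mode then reuses it.
Code:
top                          # press '1' for per-CPU rows, then 'W' to save ~/.toprc
top -b -n 1 > snapshot.txt   # batch run now includes the per-CPU lines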
I have captured a file on my Linux box containing logs from many modules at once. Please find attached a sample of the file. As you can see, there are logs from individual modules that have been captured concurrently; for example, there are logs from the IPTR, SNMP, HLR, TCAP, XAPP, and SCCP modules, all interleaved. Each log line starts with the header name of its module. I need to have each module's logs separately. Can you please show me the power of Linux on how to separate the individual modules' logs from the whole?
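Assuming each line really does begin with its module header, a minimal grep loop splits them out; combined.log is a placeholder name:
Code:
for mod in IPTR SNMP HLR TCAP XAPP SCCP; do
    grep "^$mod" combined.log > "$mod.log"
done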
I have an NTFS mount on which I wish to change permissions of individual directories. I have mounted many NTFS volumes successfully; mounting is not the issue. The issue is that when mounting, I need to specify 'blanket' permissions (owner, group, etc.), and I have no idea how to change permissions on individual folders.
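With ntfs-3g, per-file POSIX permissions can be enabled via the permissions mount option, after which chown/chmod work on individual directories; a sketch (device and mountpoint are examples, and behaviour depends on your ntfs-3g version):
Code:
sudo mount -t ntfs-3g -o permissions /dev/sdb1 /mnt/win
sudo chown user:group /mnt/win/somedir
sudo chmod 750 /mnt/win/somedir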
I am new to Perl and I have a question: how do I read individual logs from a Linux server into another log file using a Perl script? I need to capture the individual logs from different paths and write the contents of those log files to a file in another location. These logs are generated on a Linux server.
Nagios does not send mail if I only add contact_groups in the host definition.
Code:
define host{
        use           generic-host    ; Inherit default values from a template
        host_name     NAME1           ; The name we're giving to this switch
        address       XX.X.X.X        ; IP address of the switch
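For host notifications Nagios also needs notification directives on the host (or its template) and matching host-notification options on the contacts; a sketch of the kind of thing to check, with illustrative values:
Code:
define host{
        use                     generic-host
        host_name               NAME1
        address                 XX.X.X.X
        contact_groups          admins
        notifications_enabled   1
        notification_period     24x7
        notification_options    d,u,r
        }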
Currently there's an xxx dir already in /home/yyy, and I'm trying to overwrite it.
Code: cp -fr ../xxx /home/yyy/
doesn't work; it still prompts me to overwrite the individual files. How do I fix it?
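The per-file prompt usually comes from a shell alias (many distros alias cp to cp -i); bypassing the alias avoids it. A sketch:
Code:
\cp -rf ../xxx /home/yyy/       # the backslash bypasses any cp alias
# or: yes | cp -rf ../xxx /home/yyy/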
I have ~200 C files in my makefile [$(SRCS)], and it compiles all of the files using a single gcc command. So each time I make a change in one C file, it ends up recompiling all the files, then linking to make the binary. How can I break the compilation into individual gcc commands for each C file, so that make checks the timestamps and compiles only the modified files?
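A common restructuring compiles each .c into its own .o with a pattern rule and links the objects, so make rebuilds only what changed; a sketch (the prog target name is a placeholder, and recipe lines must start with a tab):
Code:
OBJS := $(SRCS:.c=.o)

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@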
I have a MySQL database with records of our club members. It is quite small, only about 300 records. Unfortunately there are gaps in the information, and I would like to be able to print the records out in form format (two records to an A4 sheet) so that I can pass these to the members, who will fill in any blanks in their record.
If it is not possible to do what I want from the database I can convert the records to a spreadsheet and do it from there.
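One quick way to get a one-field-per-line "form" layout straight from the database is the mysql client's \G terminator; the database, table, and user names here are placeholders:
Code:
mysql -u user -p clubdb -e "SELECT * FROM members\G" > member_forms.txt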
I am using a Squid proxy server for sharing Internet access in my internal network. How can I check individual users' web surfing history by their IP addresses?
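Squid logs every request to access.log with the client IP as the third field of the native log format; a sketch (the path and IP are examples):
Code:
grep '192.168.1.42' /var/log/squid/access.log
# or request counts per client IP:
awk '{print $3}' /var/log/squid/access.log | sort | uniq -c | sort -rn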
I have a large directory of .bsp files that I would like to convert to .bz2 archives. I've been searching for some time, and all I can find is the obvious: compressing multiple files into one large archive. Does anyone know how to compress each file individually, while retaining the original file name (testmap.bsp would be archived as testmap.bsp.bz2)?
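bzip2 does exactly this by default - one .bz2 per input file, original name preserved; add -k to keep the originals. A sketch:
Code:
bzip2 -k *.bsp
# or, recursively:
find . -name '*.bsp' -exec bzip2 -k {} +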
I want to write a shell script that will collect OS user information and write it to individual text files. Can anyone tell me the syntax for such a script? N.B. The user names will be listed in an array within the shell script.
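A minimal bash sketch; the user list and output directory are examples:
Code:
#!/bin/bash
users=(alice bob carol)
for u in "${users[@]}"; do
    # one info file per user
    { id "$u"; getent passwd "$u"; } > "/tmp/${u}_info.txt"
done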
Using a GUI file browser, I would like to be able to mark files with an emblem or something similar as quickly as possible, with a single click. I'm currently using Gnome Nautilus in my Ubuntu 10.04, which doesn't seem to offer the functionality. I'm not keen on trying the extension Nautilus-Actions as it doesn't seem to be open source. I've set up some scripts though, but accessing them through the pop-up submenu is just clunky enough to still have me searching for a faster solution. Does a file browser exist that would let me set up a toolbar button for marking files? Or a button for launching scripts, which would amount to the same thing.
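One long-shot idea, heavily hedged since I haven't verified it on every Nautilus version: gvfs exposes a metadata::emblems attribute, so a Nautilus script (dropped in ~/.gnome2/nautilus-scripts) could stamp an emblem on the selected files in one click:
Code:
#!/bin/bash
# tag each selected file with the "important" emblem (assumes gvfs metadata support)
for f in "$@"; do
    gvfs-set-attribute -t stringv "$f" metadata::emblems important
done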
When I play a song in Banshee, or a video, I get no sound at all. However, in Banshee a 3-minute song will appear to play in about 20 seconds (the progress bar speeds along), and videos play back in ultra-fast motion with no sound. Weirdly, when I go to System Settings -> Multimedia and click Test on the default sound device, the KDE login sound plays back fine. This worked when I first installed; does anyone know what's going on here? I think it may be a problem with GStreamer.
I'm running badblocks on some new disk drives just to be safe before my return period expires. I'm running the command
Code:
sudo badblocks -b 4096 -p 4 -c 65536 -w -s /dev/sdb1
on an empty partition taking up the entire device. I am getting the following output:
[Code]...
1) When it gets to the "reading and comparing" phase, it seems to complete in <1s. I see other posts where people say this takes as long as writing to the drive did. I don't see any mention of it being quick, so I'm afraid that badblocks is being denied read access or something. Anyone familiar with this behaviour?
2) Obviously since it is on a sixth pass, badblocks thought that it found new bad blocks on the second pass (or later). When I check SMART, though, the drive has not re-allocated anything (despite having plenty of spares)... Doesn't this seem odd? Is there some reason why the disk would not step in upon a write fail? In fact, I thought a write fail should be transparent... Which makes me wonder if question 1 (above) is causing this...
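For what it's worth, the raw reallocation and pending-sector counters can be read directly with smartmontools; a sketch:
Code:
sudo smartctl -A /dev/sdb | grep -Ei 'realloc|pending'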
I need to view 10,858 text files by this Friday to see which ones I need (searching file contents is not a solution). I was wondering if anyone knew a way/program to quickly view them, as if they were, for example, images? And, optionally, to be able to save them to a different location from there. It doesn't necessarily have to be for Ubuntu, as I also have WinXP and Vista, but it would save me having to transfer those couple of gigabytes to another computer.
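Not a thumbnail view, but less can flip through a big batch very quickly; a sketch (with this many files, xargs may split the list into several less sessions):
Code:
find . -name '*.txt' -print0 | xargs -0 less
# inside less:  :n = next file, :p = previous file, q = quit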
Ubuntu 10.04 - when I log out, I get a screenful (at least) of messages, all with the same format. They disappear too quickly to see what they're about. I've looked at the logs but can't see anything like what I see on the screen. Is there any way I can find them?
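Two guesses worth checking (hedged - I'm not sure where your particular messages land): X session output goes to ~/.xsession-errors, and console messages often end up in the system log:
Code:
tail -n 100 ~/.xsession-errors
sudo tail -n 100 /var/log/syslog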
I bought a new HP Pavilion DV6T. For the most part everything worked right out of the box with Ubuntu, but the computer seems to heat up really quickly for no apparent reason, even when I'm not doing anything intensive. Why is this happening? The fan always seems to be running at full blast. I logged into Windows for a minute to see if there was any difference in heating between the two OSs, and Windows seems to run much cooler and much more quietly. On a side note, I'm thinking of returning the HP and buying a Dell XPS 15. I hear they are more compatible with Ubuntu and are better built.
HP Pavilion DV6t, ATI Radeon 6570 1 GB graphics card, Intel i7 2.0 GHz quad-core
So I am prompted on the desktop about updates, but no information is given about the size of the packages involved. Sometimes I don't like to wait too long for the download, or to update packages which are too large.
Are there any settings dealing with that, so the individual sizes and the totals per group (security updates, etc.) would be shown without going into yum?
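I don't know of a desktop setting for this, but from a terminal yum itself prints a size column per package and a "Total download size" line before asking for confirmation, so answering N shows the numbers without installing anything:
Code:
yum update      # review the sizes in the summary, then answer N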
After dual-booting into Windows to play Mass Effect 2, I hit the Windows key by mistake, and the screen quickly turned black! Since then, I can't boot into Fedora or Windows, not even Fedora's linux rescue. I see the GRUB menu, hit Enter on either of the two options, and all I see is a black screen!
I keep losing my Internet connection (DSL) very quickly, usually after a couple of minutes; it's absolutely unpredictable. Then it gets difficult to reconnect, and if I do succeed in reconnecting, I lose the connection again quite quickly.
Here are some logs from ifcfg-dsl0.log:
Code: ...
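While waiting for the next drop, pppd's messages in the system log often say why the link went down; a sketch (the log path varies by distro):
Code:
grep -i ppp /var/log/messages | tail -n 50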
Recently I've been getting an annoying problem: when I scroll quickly, the text smears or duplicates itself -- it doesn't happen with all applications. I didn't have this problem until I started using OpenGL compositing with my NVidia card.
I'm working with a scientific code that uses more RAM+swap than I generally have (the system has 12GB RAM + 24GB swap, but this thing is crazy). It's kind of a one-use problem, so I'm not looking to get more RAM. Is there a quick way to add more swap space (not on the swap partition, because I have that set at 24GB) so that my system can use it immediately? I don't want to drive up to the office tonight to get this fixed, so a command-line setup would work best.
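A swap file can be added on the fly without touching any partition; a sketch, with the size and path as examples:
Code:
sudo dd if=/dev/zero of=/extra.swap bs=1M count=16384   # 16 GB file
sudo chmod 600 /extra.swap
sudo mkswap /extra.swap
sudo swapon /extra.swap
swapon -s    # confirm the new space is active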