Programming :: Weeding Out Duplicates From A List Of Servers?

Dec 26, 2010

I have a list of about 425 servers that are mostly redundant. I need to weed out the duplicate names so that I have a count of only the unique server hostnames. What is a good command to do this?
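A common approach (a sketch; servers.txt is a stand-in for the actual file, assuming one hostname per line):

Code:
sort servers.txt | uniq | wc -l            # count of unique hostnames
sort -u servers.txt > unique_servers.txt   # or keep the deduplicated list itself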

View 3 Replies



General :: Find A List Of Files That Are Named Duplicates ?

Jul 2, 2010

How can I find a list of files that are named duplicates, i.e. have the same name but in different case, and exist in the same directory?
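With GNU coreutils, one sketch is to fold case while sorting and then print the case-insensitive duplicates (run inside the directory in question):

Code:
ls | sort -f | uniq -D -i    # -D prints every member of each duplicate group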

View 7 Replies View Related

Fedora Servers :: Gateway Autoshutdown During Weekend

Jan 5, 2009

I have one Linux server acting as a gateway, running on the Fedora platform. The problem started recently: it keeps shutting itself down during the weekend. I have no idea what caused this auto-shutdown. For your info, the server is NOT scheduled for any auto-shutdown.
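A few hedged places to look for the cause (log paths are the usual Fedora defaults):

Code:
last -x shutdown reboot | head                                  # when the shutdowns happened
sudo grep -ri shutdown /etc/cron* /var/spool/cron 2>/dev/null   # any scheduled job doing it
grep -iE 'shutdown|thermal|power' /var/log/messages             # ACPI/overheating hints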

View 2 Replies View Related

Programming :: Removing Duplicates With Offset - Datetime

Oct 20, 2010

I have a column of datetime entries which I sorted to remove duplicate entries.

I still have lots of entries that are within 1 second of each other.

How would I go about removing any entries that are offset from the previous or following entry by only 1 second?

Code:
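One hedged awk sketch, assuming the column has already been sorted and converted to Unix epoch seconds (text timestamps would first need converting with date -d or awk's mktime); it keeps the first entry of each run of entries that are 1 second or less apart:

Code:
awk 'NR == 1 || $1 - prev > 1 { print; prev = $1 }' times.txt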

View 5 Replies View Related

Programming :: Bash: Combine Arrays & Delete Duplicates?

May 23, 2011

I would like to combine two arrays and delete duplicates that might occur. I tried to solve it like this:

Code:
combined=( "${results1[@]}" "${results2[@]}" )
printf "${combined[@]}" | sort -n | uniq
Example:
result1 (4 elements):

[Code]....

In my code printf seems to have a problem with elements that have the same letters but a space in between; for instance, "new foo" and "newfoo" end up the same after printf.
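The root cause is that printf is using the first array element as its format string and prints no separators, so distinct elements run together before sort ever sees them. A sketch that keeps each element on its own line (mapfile needs bash 4):

Code:
combined=( "${results1[@]}" "${results2[@]}" )
# one element per line, so "new foo" and "newfoo" stay distinct
mapfile -t deduped < <(printf '%s\n' "${combined[@]}" | sort -u)
printf '%s\n' "${deduped[@]}"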

View 5 Replies View Related

Programming :: Bash Output With Timestamp Removing Duplicates

Nov 20, 2010

I'm writing a bash script to auto run on boot in Tinycore.

This is a watered down version.

Code:

I need it to either not add the timestamp if awk finds a duplicate, or overwrite the old time with the new time when awk finds a duplicate.

BTW this is all pretty much cut-and-paste scripting so please feel free to comment if you know a more elegant way.
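Without the full script it is hard to be specific, but one possible shape (a sketch with a hypothetical key and log path, assuming each log line looks like "key timestamp"):

Code:
key="eth0-up"                        # hypothetical key the script records
ts=$(date '+%Y-%m-%d %H:%M:%S')
log=/var/log/boot-times.log          # hypothetical path

if grep -q "^$key " "$log" 2>/dev/null; then
    sed -i "s|^$key .*|$key $ts|" "$log"     # duplicate: overwrite the old timestamp
else
    echo "$key $ts" >> "$log"                # new entry: append
fi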

View 3 Replies View Related

Programming :: List 4 Names From Users List And Output Them To Fbusers In Numbered Ascending Order?

Feb 2, 2010

How would I list the 4 users with IDs 10, 11, 12 and 13 from my users list and output them to a file fbusers, with their names numbered in ascending order? How would I accomplish that with a one-line command?
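A hedged one-liner sketch, assuming the "users list" is /etc/passwd and the IDs are the numeric UIDs in its third field:

Code:
awk -F: '$3 >= 10 && $3 <= 13 { print $1 }' /etc/passwd | sort | nl > fbusers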

View 4 Replies View Related

Programming :: Downloading A List Of Files From A Remote Server Using A List?

Feb 10, 2009

I am trying to get this script to work. The purpose is to download a list of modules from slax.org; the list consists of module numbers. What I am trying to do is download the file corresponding to each number in the list. The list is comma-delimited. This is what I have done so far, and I am at a standstill.

#!/bin/sh
# Wget script to retrieve modules from slax.org modules
#
# ----Begin of user defined values -----
# Path to wget

[code].....
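A minimal sketch of the download loop; the URL pattern is an assumption and would need adjusting to the real module URLs:

Code:
#!/bin/sh
LIST=modules.csv                          # comma-delimited list of module numbers
BASE="http://www.slax.org/modules"        # hypothetical URL pattern

tr ',' '\n' < "$LIST" | while read -r num; do
    [ -n "$num" ] && wget "$BASE/$num"
done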

View 7 Replies View Related

Fedora :: Cron Shutdown But Not On Weekend

Jul 5, 2011

Just read through some cron manuals (man -k cron). I didn't understand that much, though. I'm a gamer and losing control over time at night.
So what I want is for my computer to shut itself down at 1:30 am at the latest. Do I need to set up the command for each day (Monday, Tuesday, and so on) or is there an option similar to my example below?

Edit: Both examples should do the trick.

Code:

# m h dom mon dow command
30 1 * * 1,2,3,4,5 shutdown -h +5 Get some sleep dude!
30 1 * * 1-5 shutdown -h +5 Get some sleep dude!

Whereas 1 to 5 would be all weekdays, but not Saturday or Sunday morning at 1:30 am. If I can shorten it to 1 line instead of 7 (one per day of the week), that would be great.

View 14 Replies View Related

Networking :: How To Stop Internet On Weekend

Feb 5, 2011

I am using Squid on my Linux box. I want to allow my office staff to use the internet only from Monday to Friday. The IP range they are using is 192.168.1.20 to 192.168.1.50. In the /etc/squid/squid.conf file, under the acl section, I have:

Code:

acl weekend time A S 00:00-23:59
acl myoffice src 192.168.1.20 192.168.1.50/24
http_access deny myoffice

Is this correct, or do I have to add something more so that internet access is stopped only on Saturday and Sunday?
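As written, the deny line applies at all times (the weekend acl is never referenced) and the src line mixes a single address with a /24, so it likely needs adjusting. A possible correction (a sketch; Squid uses A for Saturday and S for Sunday, and stops at the first matching http_access rule):

Code:
acl weekend time AS 00:00-23:59
acl myoffice src 192.168.1.20-192.168.1.50
http_access deny myoffice weekend
http_access allow myoffice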

View 1 Replies View Related

Ubuntu :: Message And Kern.log File Exploded Over Weekend

May 10, 2010

I walked into work this morning and had a message from the system that my root drive had very little space left. I traced it down to my messages.1 and kern.log.1 files; together they were over 2 GB. I am still able to browse the file shares.
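A couple of hedged first steps to find what is flooding the logs before deleting anything:

Code:
du -sk /var/log/* | sort -n | tail      # which logs are largest, in KB
tail -n 50 /var/log/kern.log            # see what message is repeating
# once the cause is known, the rotated .1 copies can simply be removed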

View 1 Replies View Related

Ubuntu :: Xserver Crashed After A Weekend Without Touching Computer

Dec 13, 2010

On Friday I installed a new distro on my computer: Ubuntu 10.04 LTS on a machine with 1 GB of RAM and a 3 GHz CPU, set up as a dual-boot system. Everything worked fine; I had even been using X, and although I must say it felt a bit slow reacting to my typing, it was still good enough. I left the computer running idle with no programs, and now the screen is dead. I cannot see anything.

I restarted the computer, and in normal mode the screen goes black. I started in recovery mode and managed to get to a console, where I typed startx and saw the following error. So I had a fresh install, and the X server worked fine (so I do not think it is a driver problem) until I left the computer idle without running anything, and then it crashed.

View 4 Replies View Related

Programming :: Active Forum / Mailing List For C Programming?

Apr 19, 2010

I just want to ask some questions about C programming. I don't know whether this is the right place to ask C questions. If it is not the correct place, may I know where I can get help with C programming? Is there an active forum or mailing list for C programming?

View 6 Replies View Related

Software :: Keep The Duplicates Not Remove Them

Jan 17, 2011

In OpenOffice Calc, all the Google results I can find are about removing the duplicates from a spreadsheet. I want to do the exact opposite: keep the duplicates and remove the unique entries.
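If the data can be exported as plain text with one entry per line, GNU uniq can invert the usual operation (a sketch); inside Calc itself, a helper column with a COUNTIF on the key column, filtered to counts greater than 1, does the same job:

Code:
sort contacts.txt | uniq -D > duplicates_only.txt   # -D keeps every duplicated line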

View 1 Replies View Related

Ubuntu Servers :: ProFTPd Will Not List

May 25, 2010

I have set up proftpd many times on Ubuntu 9.10 and never ran into any problems. I decided to go ahead and do a clean install of the new 10.04 and set everything back up (i.e. SSH, FTP, Apache, etc.). I got SSH working with no problems and started setting up proftpd just like I always have. But now, every time I try to log in, it gets to the point where it should list all the files in my directory and it just times out. If I connect from my local network (192.168.1.101) everything works fine, so I don't think it's my .conf file. All the needed ports are open, and I even tried opening up the passive ports to see if that would help, but it does not.
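Listings that hang only from outside the LAN are usually a passive-mode/NAT issue rather than a proftpd.conf problem. A hedged sketch of the usual fix (the port range and address are examples), with the same port range forwarded on the router or firewall:

Code:
# /etc/proftpd/proftpd.conf
PassivePorts 49152 50000
MasqueradeAddress your.external.hostname.or.ip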

View 1 Replies View Related

OpenSUSE :: Which Repositories Duplicates Or Unnecessary

Apr 16, 2010

I am running KDE 4.4.2 release '241'. I have no clue if that's the latest dev version, but I do know that each day I have to install a 150 MB KDE update, so it probably is. Either way it's pretty annoying. The way openSUSE handles repositories is different from Ubuntu's, so somehow I ended up creating duplicates too. So basically, I want to remove any duplicates and also stop receiving the daily KDE updates. You can ignore the Google repositories. My repositories:

[code]...
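A sketch for cleaning this up with zypper (aliases and numbers are placeholders):

Code:
zypper lr -d                        # list repositories with URLs; duplicates share a URL
sudo zypper rr <alias-or-number>    # remove a duplicate repository
sudo zypper mr -d <alias>           # or just disable the repo feeding the daily KDE updates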

View 8 Replies View Related

Ubuntu :: Script For Removing Duplicates?

Oct 24, 2010

Although having used Ubuntu for a good couple of years, I'm still a total beginner when it comes to scripting. However, what I need to do should be fairly straightforward:

Importing images from my digital camera, both RAW "originals" and JPG "copies" end up in the same folder. I typically flip through the JPGs in Image Viewer and remove those that I'm not interested in. Now, this leaves me with the tedious job of going through all the RAW files in the folder manually to get rid of those too! It sure would be wonderful to get Ubuntu to do the work for me...

The script would simply need to go though all the RAW files in a folder one by one, check for a corresponding JPG file - and if there isn't one, remove the RAW file. How could I accomplish that?
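A minimal sketch, assuming the RAW files end in .cr2 (adjust for .nef, .raw, etc.) and that each JPG shares the RAW file's basename:

Code:
#!/bin/bash
cd /path/to/photos || exit 1                # placeholder folder
for raw in *.cr2; do
    [ -e "$raw" ] || continue               # no .cr2 files at all
    base="${raw%.cr2}"
    if [ ! -e "$base.jpg" ] && [ ! -e "$base.JPG" ]; then
        rm -- "$raw"                        # or mv to a holding folder to be safe
    fi
done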

View 3 Replies View Related

General :: Duplicates In Text File ?

Jul 1, 2011

I have a text file which is a list of all my contacts. So far I have only found software and commands that remove duplicates, but I would like to remove all duplicates AND their original entries too, so that only contacts which have no duplicates are left.
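If the contacts are one per line and duplicates are exact matches, sort plus uniq -u does exactly this (a sketch; note it changes the line order):

Code:
sort contacts.txt | uniq -u > contacts_without_dupes.txt   # -u keeps only lines that never repeat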

View 10 Replies View Related

General :: Keep Duplicates Based On First Word Only

Mar 11, 2011

I have a large file and want to keep lines which are duplicates, but the test for duplicates is performed only on the first blank-delimited word.
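A two-pass awk sketch: the first pass counts occurrences of the first word, the second prints only lines whose first word appears more than once:

Code:
awk 'NR == FNR { count[$1]++; next } count[$1] > 1' bigfile bigfile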

View 6 Replies View Related

CentOS 5 :: Which Is The Correct Dag Rpm To Install When There Are Duplicates?

Sep 24, 2009

I am getting duplicate entries in the DAG rpm repository, with names that differ only in case!

# yum search fileinfo
php-pecl-Fileinfo.x86_64 : Fileinfo is a PHP extension that wraps the libmagic library
php-pecl-fileinfo.x86_64 : PECL package to get file information through libmagic
[code]....

Which is the correct rpm to install?
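One hedged way to decide is to compare the two candidates' versions and source repositories before installing either:

Code:
yum info php-pecl-Fileinfo.x86_64 php-pecl-fileinfo.x86_64
repoquery -i php-pecl-Fileinfo php-pecl-fileinfo    # repoquery comes from yum-utils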

View 8 Replies View Related

Ubuntu Servers :: Cannot List The TAPE Files?

Mar 2, 2010

I'm having problems listing files from a tape. I did not create this tape, and I'm assuming the files were successfully backed up to it. Some commands that I've tried:

Code:
atenreiro@intranet:~$ sudo mt -f /dev/nst0 status
SCSI 2 tape drive:

[code]...
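Assuming the backup was written with tar (not certain, since the tape came from elsewhere), a sketch for listing it:

Code:
sudo mt -f /dev/nst0 rewind
sudo tar -tvf /dev/nst0          # list the archive without extracting
# if tar says it is not a tar archive, try dump/restore or cpio instead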

View 1 Replies View Related

Ubuntu Servers :: See List Of Blacklisted IPs In Apache?

Aug 19, 2010

I'm using Apache's dos_evasive (mod_evasive) module to block DoS attacks. How do I see the list of blacklisted IPs? I want to check it from time to time to monitor which IPs are blacklisted, but I do not know how to.
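As far as I know mod_evasive keeps no persistent blacklist file; it drops a marker file per blocked IP in its DOSLogDir and reports each block via syslog. A sketch, with the paths as assumptions that depend on the distro and the module's configuration:

Code:
ls /var/log/mod_evasive/                                    # one dos-<ip> file per blocked address
grep -i evasive /var/log/syslog /var/log/apache2/error.log  # the "Blacklisting address" messages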

View 6 Replies View Related

Debian :: Rsyslog Remote Logging Duplicates

Jul 30, 2015

I'm having issues setting up rsyslog to receive syslog from another server and log it to only one file. I'm receiving the syslog from the remote side; however, it's putting the entries into more than one log file.

I configured /etc/rsyslog.conf to enable udp, and I have implemented a filter to log only from that IP address, and then stop processing more rules, but it seems to continue on.

I have found that the remote syslog events are using local0 and local1. There are two custom rsyslog config files in /etc/rsyslog.d that handle those two facilities. If I use that same if statement at the beginning of those custom config files, I can get it to work. Seems like a hack though.

Not working:

I put my if statement before the include statement, thinking I could stop it from hitting the custom rules.

Code:
#  /etc/rsyslog.conf    Configuration file for rsyslog v3.
#
#                       For more information see
#                       /usr/share/doc/rsyslog-doc/html/rsyslog_conf.html

#################
#### MODULES ####
#################

$ModLoad imuxsock # provides support for local system logging
$ModLoad imklog   # provides kernel logging support (previously done by rklogd)
#$ModLoad immark  # provides --MARK-- message capability

[Code] ....

This works:
A custom config file in /etc/rsyslog.d
Code:
if $fromhost-ip == '<my ip>' then /var/log/<my directory>/syslog.log
& ~
local0.*       /var/log/<a log file for local0>.log

This is on a WD Mycloud device:

Code:
Linux WDMyCloud 3.2.26 #1 SMP Tue Jun 17 15:53:22 PDT 2014 wd-2.2-rel armv7l

The programs included with the Debian GNU/Linux system are free software; the exact distribution terms for each program are described in the individual files in /usr/share/doc/*/copyright.
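A less hack-ish alternative to repeating the filter in every file under /etc/rsyslog.d is to bind the UDP input to its own ruleset, so remote messages never enter the default rule chain at all. A sketch in legacy directive syntax; it needs rsyslog 4.5+/5.x, so the version on the MyCloud would have to be checked (the config header above still says v3):

Code:
$ModLoad imudp

$RuleSet remote
if $fromhost-ip == '<my ip>' then /var/log/<my directory>/syslog.log

$InputUDPServerBindRuleset remote
$UDPServerRun 514

# back to the default ruleset for the local rules and includes
$RuleSet RSYSLOG_DefaultRuleset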

View 1 Replies View Related

General :: AWK: Rm Duplicates Based On Multi Fields?

Feb 7, 2011

I'm trying to use awk to remove rows that are duplicates based on 3 fields, and I want to keep the one that has the highest value in another field. I'm working in C shell. For example, the data below is grepped out of a larger data set to use here as an example:

Input (field separator is a comma):

Code:

4180,-6999,MA,BARNSTABLE,BOURNE,1,1.7,1700,PM,1/26
4180,-6999,MA,BARNSTABLE,BOURNE,1,3.5,2025,PM,1/26
4180,-6999,MA,BARNSTABLE,BOURNE,1,1.0,1511,PM,1/26
4180,-6999,MA,BARNSTABLE,BOURNE,1,5.7,0540,AM,1/27

[Code]....
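A hedged awk sketch: it assumes the duplicate key is fields 3-5 (state, county, town) and that field 7 is the value to maximize; adjust both to the real columns:

Code:
awk -F, '{
    key = $3 FS $4 FS $5
    if (!(key in max) || $7 + 0 > max[key] + 0) { max[key] = $7; best[key] = $0 }
}
END { for (k in best) print best[k] }' input.csv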

View 1 Replies View Related

Ubuntu :: How To Recursively Move And Rename Duplicates

Feb 8, 2010

I am hoping someone already has a script or knows of an app that will let me do this fairly easily - I have a fairly large folder structure that goes several levels deep, etc. In many cases there are duplicate file names that are not really different, e.g.,
/home/chris/folder/folder1/doc1.doc
/home/chris/folder/folder2/folder3/doc1.doc

I want to recursively go through /home/chris/folder and move everything to /home/chris/another_location/ without subfolders and renaming duplicates as appropriate, e.g.,
/home/chris/another_location/doc1.doc
/home/chris/another_location/doc1_1.doc
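A sketch of one way to do it (assumes the files have extensions; names containing newlines would break it):

Code:
#!/bin/bash
src=/home/chris/folder
dest=/home/chris/another_location
mkdir -p "$dest"

find "$src" -type f | while IFS= read -r f; do
    base=$(basename "$f")
    name="${base%.*}"
    ext="${base##*.}"
    target="$dest/$base"
    n=1
    while [ -e "$target" ]; do          # name taken: append _1, _2, ...
        target="$dest/${name}_${n}.${ext}"
        n=$((n+1))
    done
    mv -- "$f" "$target"
done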

View 1 Replies View Related

Ubuntu :: Control Duplicates In Bash History?

Oct 19, 2010

I have tried a combination of the following lines in .bashrc to try and control duplicates in the bash history file:

export HISTCONTROL=ignoreboth
export HISTCONTROL=erasedups
export HISTCONTROL=ignoredups

[code]....
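One detail worth noting: repeated export lines overwrite each other, so only the last assignment takes effect. The options can be combined in a single colon-separated value (a sketch):

Code:
export HISTCONTROL=ignoreboth:erasedups    # ignoreboth = ignorespace + ignoredups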

View 5 Replies View Related

Ubuntu :: Tar Update Creates Duplicates Instead Of Updating?

Nov 17, 2010

I've tried using a script to run incremental backups. The idea was to use the update switch to just update the files in the .tar instead of creating new ones. However it seems to have created duplicate files in the tar instead of just updating them (refer to screenshot). Is this normal?

Here is the command in the script;

Code:
tar -uvpf /home/jonny/.BackUps/Updating/Documents.tar /home/jonny/Documents

Is there a way to stop this but still have the files updated to the latest version?
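This is expected behaviour from tar: an archive cannot be rewritten in place, so --update appends a newer copy of each changed file rather than replacing the old one; extracting still yields the newest copy. To keep only one copy per file, recreating the archive is one option (a sketch):

Code:
tar -cvpf /home/jonny/.BackUps/Updating/Documents.tar /home/jonny/Documents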

View 1 Replies View Related

Networking :: Get Duplicates Packates From Other Remote Machine?

Jun 18, 2010

When I ping a remote machine at 172.16.1.55 I get this result. How can I get normal packets?

64 bytes from 172.16.1.55: icmp_seq=1 ttl=128 time=0.468 ms (DUP!)
64 bytes from 172.16.1.55: icmp_seq=2 ttl=128 time=0.448 ms
64 bytes from 172.16.1.55: icmp_seq=2 ttl=128 time=0.469 ms (DUP!)

[code]....
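DUP! usually means two hosts or interfaces are answering for the same IP, or a bridge/switch is looping the frames back. A couple of hedged checks (eth0 is an example interface; arping may need installing):

Code:
arping -I eth0 172.16.1.55     # shows the MAC of every host that replies
ip addr show                   # look for two local interfaces on the same subnet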

View 1 Replies View Related

Software :: Duplicates Or Hard-linked Files ?

Jul 18, 2011

I'm trying to trim down Linux so that it fits on an appliance, and noticed that some related files in /bin have the exact same size:

Code:

Are those files duplicates, or are they just hard-linked, i.e. there's really only one file in the flash memory but it looks like there is more than one? How can I make sure?
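A sketch for checking (busybox is just an example name; any of the suspiciously same-sized files will do):

Code:
ls -li /bin/*                        # first column is the inode; matching inodes = one file on disk
stat -c '%h %i %n' /bin/busybox      # %h = hard-link count, %i = inode
find /bin -samefile /bin/busybox     # every path that shares that inode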

View 7 Replies View Related






