Software :: Parsing A Text File To Display Certain Data On Website In Php?
Dec 26, 2009
I'm trying to figure out how to display parts of a .db file created by the scorch2000 server: a player name, games played, score and maybe more...
I don't want to display everything, of course ^^, but how do I get the player name, the number of games he played and his score to display in a webpage in this fashion:
Name Games Played Score
joe blow1 25 9876890
joe blow2 31 8989767
joe blow2 26 7989767
joe blow2 17 5989767
joe blow2 13 4989767
and sorted by highest score, because the log doesn't put them in score order....
Please help. I asked the maker, since he has one running already, but got no answer back; the game is pretty old, so I didn't really expect an answer anyway. I tried to figure it out myself, but I don't know the functions in PHP. This is to include in a PHP-Nuke block (that part I know how to do).
Here is an example of a working page at the developer's website: url
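For what it's worth, here is a minimal command-line sketch of the sorting part, assuming the .db is plain text with one player per line and the last two whitespace-separated fields being games played and score (the real scorch2000 format may well differ, and players.db is just a placeholder name):
Code:
# Print "name <tab> games <tab> score", highest score first.
awk '{
    score = $NF; games = $(NF - 1)               # assume the last two columns
    name = $1
    for (i = 2; i <= NF - 2; i++) name = name " " $i
    printf "%s\t%s\t%s\n", name, games, score
}' players.db | sort -t $'\t' -k3,3nr
The same logic carries over to PHP: explode each line into fields, build an array, usort() it by the score column, then echo the rows into an HTML table inside the PHP-Nuke block.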
Is there a way to process individual characters one-by-one from a text file in Bash, or is that hoping for a little too much from this lovable old clunker?
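It can be done, if a bit slowly. A minimal sketch (input.txt is just a placeholder):
Code:
# IFS= keeps whitespace characters, -r stops backslash mangling,
# and -n1 hands the loop one character per iteration
# (read returns an empty $ch for newlines; bash 4.1+ has -N1 to capture them too).
while IFS= read -r -n1 ch; do
    printf 'got: [%s]\n' "$ch"
done < input.txt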
EDIT1: I should note that I was trying to do this in Perl; I'm not sure whether other alternatives are simpler.
EDIT2: I should note that text file 3 (the reference) is a long list of MANY cnp_id values and their corresponding chr, start, and end values. So the code will have to take the cnp_id from text file 1 and/or 2, search through text file 3 (the reference) to match on the cnp_id, and then take the corresponding chr, start, and end values and add them to the relevant line in the output.
EDIT3: Sorry, I should mention that the text file entries are all tab-delimited.
I have 3 text files:
File 1: Columns represent sample IDs (sample_id) and rows represent CNP IDs (cnp_id). Cells represent the confidence level (confidence) for each sample and CNP.
Quote:
cnp_id    P5E6_SNP6.0_JHP5_010408.CEL    P5E11reh_SNP6.0_JHP5_011808.CEL    P7C7_SNP6.0_JHP7_021208.CEL    ... etc.
CNP10    0.004479834    0.002792951    0.00305613
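One hedged way to do the lookup without writing the whole thing in Perl is awk's two-file idiom; the file names and column positions here are assumptions (cnp_id in column 1 of both files, chr/start/end in columns 2-4 of the reference):
Code:
# Load the reference into an array keyed by cnp_id, then append
# chr, start and end to every matching line of the confidence file.
awk -F'\t' -v OFS='\t' '
    NR == FNR { ref[$1] = $2 OFS $3 OFS $4; next }   # first file: the reference
    $1 in ref { print $0, ref[$1] }                  # second file: annotate it
' reference.txt file1.txt > file1_annotated.txt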
I'm trying to write a shell script which finds bits of data in a text file. At the moment I'm using grep; basically I need a function which will look through the text file and pull the data out of it. The file has days, months, years etc., and I want to be able to type "feb 06" and have it find all of the data for feb 06.
The problem I have is that I can type "feb" and all the information for feb comes back, but I can't get it more precise: e.g. "feb 2009" should find just feb 2009, yet it seems to ignore the latter half. I've tried experimenting with egrep and having two inputs, but I can't seem to fuse them together; it only takes the first input.
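One way out is to chain two greps, so both the month and the year have to match the same line. A small sketch, with log.txt standing in for the real data file:
Code:
#!/bin/bash
# Usage: ./finddate.sh feb 2009   (the year is optional)
month="$1"
year="$2"
if [ -z "$year" ]; then
    grep -i "$month" log.txt                 # month only
else
    grep -i "$month" log.txt | grep "$year"  # both terms must appear on the line
fi
If the month and year always sit next to each other in the file, a single grep -i "$month $year" log.txt works too.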
I have the following text in a file located in /home/anoopm101/.task: [description:"this is the text" entry:"1306682734" status:"pending" uuid:"25c54e1b-824f-52bc-4933-dfe7cda34bc7"]
I have to display "this is the text" on my desktop using conky.
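A sketch of the extraction, assuming there is exactly one description:"..." entry in the file:
Code:
# Print whatever sits between the quotes after description:
sed -n 's/.*description:"\([^"]*\)".*/\1/p' /home/anoopm101/.task
In conky that can go inside ${exec ...} (or ${execi ...} to limit how often it runs).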
I'm having a slight dilemma on reading data from a text file, outputting it into a table and then displaying it. Basically I'm writing a shell script that takes information from text files and outputs the data into a table with 4 headings. Extracting the data is fine, but creating the table is what I'm having problems with. My code extracts the data and outputs the string to another file, which works fine. The text file looks like this:
mr smith 1 purchase oct 2007
mrs smith 2 purchase nov 2006
I want it to look like this:
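printf with fixed-width format strings is usually the least painful way to build the table. A sketch that assumes the field layout is title+surname, quantity, type, month+year (records.txt is a placeholder):
Code:
printf '%-12s %-10s %-10s %-12s\n' "Name" "Quantity" "Type" "Date"
while read -r title surname qty type month year; do
    printf '%-12s %-10s %-10s %-12s\n' "$title $surname" "$qty" "$type" "$month $year"
done < records.txt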
I am writing a program that reads a text file (music.txt) and stores it in a data structure. I am a novice learning over the internet, so this is something I have never done. How do I do this?
Quote:
Write a program that reads the data from the music.txt file and stores it in a data structure. To implement this data structure you must use an array of songs, and each song must be represented by a struct with appropriate fields. So far all I can do is open the file to read it (very simple, I know), but is it correct so far?
I want to open a txt file, read it line by line, and display the data with the first line labelled 1, incrementing the label by 1 for each additional line.
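A minimal sketch (file.txt is a placeholder); cat -n or nl do the same in one command:
Code:
n=1
while IFS= read -r line; do
    printf '%d %s\n' "$n" "$line"
    n=$((n + 1))
done < file.txt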
I'm having a slight dilemma on reading data from a text file, outputting it into a table and then displaying it. Basically I'm writing a shell script that takes information from text files and outputs the data into a table with 4 headings. Extracting the data is fine, but creating the table is what I'm having problems with. I think it is possible to do it with awk, but so far I'm having a lot of difficulties.
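For the awk route, the same fixed-width printf idea applies; a sketch under the same assumed field layout as above:
Code:
awk 'BEGIN { printf "%-12s %-10s %-10s %-12s\n", "Name", "Quantity", "Type", "Date" }
           { printf "%-12s %-10s %-10s %-12s\n", $1 " " $2, $3, $4, $5 " " $6 }' records.txt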
Running SunGard Banner software on an RHEL 4.2 x86 32-bit Linux server, Oracle Application 10.1.2.3, Samba enabled. Users run processes/reports that are logged in a daily log file. In our daily job submission log files the user password shows up as clear text. The password shows up as $PSWD (sample from the logfile):
We have a system called Skynet, which is basically a bunch of monitoring tools, including Nagios. What I want to do is output the status of 'critical' processes in conky. The conky part I'll worry about later (how hard can that be?), but I'm looking for some feedback on how I'm parsing the initial data. I figure that the simplest way to get the information is to query the cgi, then take what I need from the results...
All I basically want is the server name and the process name, the above example giving server0/server1 and 'update status' as the service. How would you go about extracting merely these two pieces of information, bearing in mind that the server name and process are variable?
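Since the CGI markup can change between Nagios versions, one hedged alternative is to read Nagios's status.dat directly (the path below is an assumption; adjust it to the install). This prints the host and service for every service currently critical (current_state=2):
Code:
awk -F= '
    /^servicestatus/                      { inblock = 1 }
    inblock && $1 ~ /host_name/           { host  = $2 }
    inblock && $1 ~ /service_description/ { svc   = $2 }
    inblock && $1 ~ /current_state/       { state = $2 }
    inblock && /^[[:space:]]*}/           { if (state == 2) print host, svc; inblock = 0 }
' /var/lib/nagios3/status.dat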
I am trying to send data through a USB port to a printer which can receive text files. To send the data to the printer I am doing echo "Hello World" > /dev/usb/lp0
but the data doesn't seem to be sent to the printer; I have an analyzer monitoring the data sent to the printer. Is there anything else I need to do?
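A few things worth checking from the shell first (device name and driver are assumptions based on the path above):
Code:
ls -l /dev/usb/lp0                        # does the node exist, and is it writable by you?
dmesg | grep -i usblp                     # did the kernel bind the usblp driver to the printer?
printf 'Hello World\f' > /dev/usb/lp0     # many printers only print once they see a form feed (\f)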
I am looking for ways in which I could parse a web page, say URL..., and extract data from it. URL.... Now, as can be seen, the page has a lot of information. I just want to take only the names of the packages, not the version numbers.
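Without seeing the page it's all guesswork, but if it is a plain package index (links to .deb/.rpm/.tar.gz files), a rough sketch would be to pull the link targets and strip everything from the version onwards; $URL, the link pattern and the version convention are all assumptions:
Code:
curl -s "$URL" |
  grep -oE 'href="[^"]+"' |
  sed 's/^href="//; s/"$//' |
  sed -E 's/[_-][0-9].*$//' |    # drop "-1.2.3.tar.gz" style version suffixes
  sort -u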
As a curious side project I'm playing with mzXML data(an xml format for holding mass spec data). A typical scan can be quite large, even up into GB size. I'm wondering how would one go about parsing an xml file in sections, one section at a time. The idea being if the computer doesn't have enough memory to load up the entire data file, work on chunks of it at a time.
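A streaming parser (SAX/StAX, or xmllint --stream) is the proper tool, but here is a very rough shell sketch of the chunk-at-a-time idea: collect one <scan>...</scan> block at a time so the whole file never sits in memory. It assumes the scan tags sit on their own lines and that scans are not nested, which real mzXML does not guarantee:
Code:
awk '
    /<scan[ >]/ { collecting = 1; block = "" }
    collecting  { block = block $0 "\n" }
    /<\/scan>/  { if (collecting) {
                      printf "scan block: %d bytes\n", length(block)   # process the chunk here
                      collecting = 0
                  } }
' run.mzXML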
I'm trying to read a serial COM port and write the received data to a file. Right now it writes only one sentence, but I want to write the full file coming in on the serial port; I'm sending a file from HyperTerminal and reading it on a Linux PC. If I put in a while loop it doesn't write anything; without the while loop it writes only one line, and if I send a big file the application terminates and then writes to the file. But I need to write data of any size coming in on the serial port. Finally, I want to write the full file coming from HyperTerminal, and after writing the file it has to wait for the next data. This is my code:
I need a loop that pulls the user name into a variable and then pulls the LastUpdate field into another variable, so I can then perform a comparison against the last update field. Requirements are AIX tools, including awk, sed and Perl. I am writing a script to check AIX users' password expiration dates; if they are within the alerting period (i.e. 7 days etc.) it will email the user. I will release the full script into the public domain once completed. The text file I want to parse is formatted like:
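Assuming the file follows AIX's usual stanza layout (a "username:" line followed by an indented "lastupdate = <epoch seconds>" line, as /etc/security/passwd uses), a sketch of the extraction loop would be:
Code:
# Print "user lastupdate" pairs; the comparison/email logic then reads them.
awk '
    /^[^[:space:]].*:[[:space:]]*$/ { user = $1; sub(/:$/, "", user) }
    tolower($1) == "lastupdate"     { print user, $3 }
' /etc/security/passwd |
while read -r user lastupdate; do
    : # compare $lastupdate (epoch seconds) against today and mail $user if needed
done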
I need a PHP script to format and display a select amount of text from a file, or a cron job that'll do it (with grep etc.) and output to a file, whatever is easier. I want to display all the banned IPs I have in my .htaccess file. I have a lot of lines in my .htaccess file, but I have a line that reads like this:
The 3 test IPs are 123.123.123.123, and I want the "" stripped and the "|" removed, with each IP on its own line, like this:
123.123.123.123
123.123.123.123
123.123.123.123
I'm not bothered whether this is a PHP script that'll read the .htaccess file and display the output, or a cron job that'll run every x minutes and output it to a .txt file, but I'd prefer a PHP script.
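For the cron route, one hedged sketch is to let grep pull every IPv4 address out of the file and print one per line (the .htaccess path is a placeholder, and if other lines also contain IPs you would grep for the ban line first):
Code:
grep -oE '[0-9]{1,3}(\.[0-9]{1,3}){3}' /path/to/.htaccess > banned_ips.txt
A PHP version would do the same with file_get_contents() and preg_match_all(), then implode the matches with newlines.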
I am trying to get a script that I'm calling to populate rows with information from a SQL query... but I'm not getting the data to output correctly into the rows. Can someone please help?
I'm trying to write a script to list all open ports in the MINIUPNPD chain in iptables and use the protocol, port and destination IP to open ports on another router using upnpc. Here is the output of iptables -L MINIUPNPD
No matter what I do, I can't seem to remove the first 4 characters from the MYPROT array to leave only the digits. Also, I can't seem to read the array back.
I thought it would simply be a loop reading each line, putting the fields into variables, executing the upnpc commands I need, then moving to the next line of the file until it reached EOF.
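That loop idea is sound. A sketch of it, under two assumptions: iptables -L MINIUPNPD -n prints rule lines like "ACCEPT tcp -- 0.0.0.0/0 192.168.1.50 tcp dpt:8080", and upnpc's add syntax is "upnpc -a <internal ip> <internal port> <external port> <protocol>":
Code:
iptables -L MINIUPNPD -n | awk '/^ACCEPT/ { print $2, $5, $7 }' |
while read -r proto dest dport; do
    port=${dport#dpt:}          # strip the 4-character "dpt:" prefix, leaving the digits
    upnpc -a "$dest" "$port" "$port" "${proto^^}"   # protocol upper-cased to TCP/UDP
done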
I need to get the modified date of a file in Linux to use in a script. I tried using 'ls -l' on the file, but this caused problems when the date turned from a single digit into a double, because I was parsing the result string on spaces. How can I get the date a file was last modified so I can use it in a script? For example, if a file was modified on 1/11/2010, I need the 11.
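stat avoids parsing ls output altogether; a small sketch (myfile is a placeholder):
Code:
stat -c %y myfile                        # "2010-01-11 14:03:12.000000000 +0000"
day=$(stat -c %y myfile | cut -c9-10)    # characters 9-10 are the day of month: "11"
echo "$day"
# GNU date can do it in one step:  date -r myfile +%d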
I'm trying to output a list of running processes via a shell script. At the moment I've got this, which outputs the processes to a text file called out:
echo $(ps aux) >>out
The problem, though, is that the processes come out as one big block of text, which makes it hard to read. Does anyone know how to format the output so that it prints to the text file at 1 process per line? I know it's probably simple, but I'm very new to Linux.
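The newlines vanish because $(ps aux) is unquoted, so the shell squashes all of its whitespace into single spaces before echo ever sees it. Two fixes, either of which keeps one process per line:
Code:
echo "$(ps aux)" >> out   # quoted: the newlines survive
ps aux >> out             # simpler still: send ps output straight to the file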
I have log files from FreeRADIUS that look as follows:
$ grep "Login incorrect (rlm_ldap: User not found" /var/log/radius/radiusd-inner-tunnel-20090831.log
Mon Aug 31 09:25:27 2009 : Auth: Login incorrect (rlm_ldap: User not found): [John Doe] (from client oficina port 0 via TLS tunnel)
[code]....
I use the following line to get the number of users that don't exist in LDAP:
Code:
grep "Login incorrect (rlm_ldap: User not found" /var/log/radius/radiusd-inner-tunnel-20090831.log | awk '{print $14}' | sort -fu | wc -l
Now awk, on line one for example, parses [John Doe] and [Joon Williams] as "[John", which is not what I want. How can I make awk treat the username field as everything enclosed between the square brackets?
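One way is to stop counting whitespace-separated columns and let the square brackets themselves be the field separators, so multi-word usernames stay intact:
Code:
grep "Login incorrect (rlm_ldap: User not found" /var/log/radius/radiusd-inner-tunnel-20090831.log |
  awk -F'[][]' '{ print $2 }' |   # field 2 is whatever sits between [ and ]
  sort -fu | wc -l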