Ubuntu Security :: The Requests Are Listed In The Order In Which They Appear On The Stats Page?
Apr 13, 2011
Below is a printout of requests, with the website address "#.com". The requests are listed in the order in which they appear on the stats page. What do they mean?
In my Apache logs I have lots and lots of failed attempts for incorrect incarnations of [URL]. None of them are anywhere near my alias for the index.php, and yet phpMyAdmin is broken. Is there a way I can mess with robots like this? Send IPs that create multiple wrong page requests on my server back to their own IP address, maybe? I would then just set thresholds to decide how strict to be. I did try fail2ban before, but it is cryptic. I don't have it on this particular server.
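Reflecting traffic back at the sender's own IP isn't something iptables can really do, but banning IPs after repeated bad requests is exactly fail2ban's job, and a single jail is less cryptic than the full config. A minimal sketch, assuming the stock apache-noscript filter that ships with fail2ban; the thresholds are illustrative placeholders, not recommendations:

# /etc/fail2ban/jail.local
[apache-noscript]
enabled  = true
port     = http,https
filter   = apache-noscript
logpath  = /var/log/apache2/error.log
maxretry = 6      # ban after 6 bad requests (illustrative threshold)
bantime  = 3600   # ban for one hour (illustrative threshold)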
I use Ubuntu 10.04 and I want to be able to move around the system without having to frequently enter my password. For example, when waking up the system from a power save state or when accessing Synaptic Package Manager I do not want to be asked to enter my password. There is nothing on my system that matters if its security is breached. Is there a way to turn off these requests for a password?
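A hedged sketch for 10.04's GNOME 2 desktop: the screensaver lock and the admin-tool prompt are separate mechanisms, so each has to be relaxed on its own. The sudoers line assumes gksu is in its default sudo mode, and "myuser" is a placeholder:

# Stop the wake-from-suspend/screensaver password prompt:
gconftool-2 --type bool --set /apps/gnome-screensaver/lock_enabled false

# Allow Synaptic without a password prompt (edit with "sudo visudo"):
#   myuser ALL=(ALL) NOPASSWD: /usr/sbin/synaptic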
OK, I think Tor has some way of making DNS queries anonymous by default. I did the DNS nameserver spoofability test at [URL] and the results I got showed about 30 different DNS servers. Normally, when I carry out this test on my standard ISP connection or the VPN I use, I consistently get just one DNS server's settings.
I've lately been getting some strange NFS mount requests from an F14 machine for nonexistent users' home directories on my file server (CentOS). The message log on the file server shows the following:
May 23 03:10:53 data mountd[4835]: can't stat exported dir /export/home/httpd: No such file or directory
May 24 03:21:13 data mountd[4835]: can't stat exported dir /export/home/httpd: No such file or directory
May 25 03:26:53 data mountd[4835]: can't stat exported dir /export/home/httpd: No such file or directory
I've tried blocking ping requests with iptables, and it didn't work. Quote:
iptables -A INPUT -p icmp --icmp-type echo-request -j DROP
I also tried editing sysctl.conf, which worked perfectly, but after I restarted the system I was able to ping my Ubuntu machine from my laptop. Here is what I added to sysctl.conf and then applied with sysctl -p. Quote:
net.ipv4.icmp_echo_ignore_all = 1
Here is another attempt to block; this one worked too, but again, after the restart I was able to ping my machine. Quote:
echo "1" > /proc/sys/net/ipv4/icmp_echo_ignore_all
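A hedged set of checks for why the setting doesn't survive a reboot (the usual suspects are a typo in the file or something re-enabling echo replies later in the boot sequence):

# Confirm the line is really in /etc/sysctl.conf and not commented out:
grep icmp_echo_ignore_all /etc/sysctl.conf

# After the next boot, see what the kernel actually has:
sysctl net.ipv4.icmp_echo_ignore_all

# Blunt fallback: re-apply it late in the boot by adding this line to
# /etc/rc.local, above the final "exit 0":
#   sysctl -w net.ipv4.icmp_echo_ignore_all=1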
I've been trying to configure ufw to drop ping requests for a couple of days now, and I can't figure it out. I've tried a couple of different methods from different guides, still nothing. Does anyone know how to do this?
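One likely reason ordinary ufw rules never help: ufw's built-in rules accept ICMP echo requests before user rules are evaluated. A hedged fix is to edit /etc/ufw/before.rules and flip the echo-request line, then reload ufw (sudo ufw disable && sudo ufw enable):

# In /etc/ufw/before.rules, change:
#   -A ufw-before-input -p icmp --icmp-type echo-request -j ACCEPT
# to:
#   -A ufw-before-input -p icmp --icmp-type echo-request -j DROP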
I have suspicious requests in my haproxy logs from multiple sources to the same target. I could deny them in /etc/hosts.deny, but there are too many to keep track of. Is there a way to deny all requests to a specific target either in haproxy or through iptables?
Here's an example of the request:
Apr 12 15:11:37 127.0.0.1 haproxy[28672]: 41.105.42.150:27072 [12/Apr/2011:15:11:37.315] web_servers frontend_farm/######## 3/0/1/1/169 404 1073 - - --NI 3/3/2/1/0 0/0 "GET /images/comment_icon.gif HTTP/1.1"
I've commented out my Amazon instance ID for security purposes. The request is for comment_icon.gif, which does not exist; all of the suspicious requests target that file. The source IPs are from different countries as well, so blocking a certain country won't work either. Basically, if there were a way to send all requests for comment_icon.gif to /dev/null or something, that would work.
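haproxy can refuse that one path itself, which is cheaper than tracking source IPs. A minimal sketch, assuming a haproxy 1.4-era config and reusing the frontend name from the log above:

frontend web_servers
    # match the one path all of the bogus clients ask for
    acl bogus_gif path /images/comment_icon.gif
    # refuse it with a 403 before it reaches any backend
    block if bogus_gif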
I had some help via email from someone drafting my CV into the correct table format with OpenOffice. It's a .pdf file, but now it unfortunately lists the author in the Document tab of Properties as that person.
Is there any way to change it to my own name? And also, how do I 'secure' the document so that it's not easy for people viewing it to copy and paste? I've heard this is why many people now use .pdf for their CVs/résumés.
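A hedged sketch using two tools from the repositories (exiftool comes in the libimage-exiftool-perl package). Note that PDF copy restrictions are advisory and easy to strip, so treat them as a deterrent rather than real protection:

# Rewrite the Author metadata field in place:
exiftool -Author="Your Name" cv.pdf

# Re-emit the PDF with an owner password so copying is restricted but
# reading and printing stay open ("secretpw" is a placeholder):
pdftk cv.pdf output cv-protected.pdf owner_pw secretpw allow Printing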
I just upgraded to Ubuntu 10.04, the netbook distro. In the desktop view there is a list of about 10 buttons/menus on the left-hand side. Is there any way to control which buttons/menus are listed and which icons are listed under each of them? Having a netbook, I would like to unclutter the desktop view as much as possible, but I don't want to remove the apps themselves; I still want to be able to open them if I want to, even if removing those icons and menus/buttons makes that a pain.
How can one ensure that one is looking at the real web page? This is important if you are going to download an ISO image or validate MD5 sums. For example, the Ubuntu web page here: https://help.ubuntu.com/community/In...tion/MinimalCD has information about certificates available in the URL bar. (If you look at the left of the URL, one can click on the certificate information.) Surprisingly, there is no such certificate information available at this other Ubuntu web page:
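One browser-independent way to see what certificate (if any) a site presents is openssl; a page served over plain http has no certificate at all, which would explain a missing indicator. A sketch:

# Show the certificate chain the server actually presents:
openssl s_client -connect help.ubuntu.com:443 -showcerts < /dev/null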
I connect to the internet through a password-protected network which stores my login for one day (basically every morning I get a redirect to a website that prompts for a username and password). The problem is that since my homepage is also 'https://' I get chromium's 'This is probably not the webpage you are looking for!' message. Quite simply, I want chromium to trust the login site enough that it will simply proceed to it without confirmation without applying this reduced security to other sites.
What is the easiest way to block one specific web page? Can I use the /etc/hosts.deny file, or should I use another program to do this? I have already searched the web and found iptables, but that is too difficult for me, and I found Squid.
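/etc/hosts.deny (TCP wrappers) controls access to services running on your own machine, not which sites you can browse. A hedged sketch: the simplest trick for a whole site is pointing its hostname at the local machine in /etc/hosts; blocking a single page rather than the whole site does need a proxy such as Squid. The domain below is a placeholder:

# In /etc/hosts — send the unwanted site to the local machine:
127.0.0.1   www.unwanted-site.example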
I'm on the Slackware Security mailing list and I check the Slackware Security Advisory page daily because the mailing list, for me at least, is unreliable. After Firefox nagged me about a security update, I went to one of the FTP mirrors to check for an update and the Firefox update was there, but it still hasn't shown up on the official Slackware Security update page.
This is a very trivial thing, but when I SSH into my newly updated 10.04 server, the stats of the server are displayed on the screen twice. How can I fix it to display just once? This is what it is doing. Quote:
Linux xxxxx 2.6.32-21-generic-pae #32-Ubuntu SMP Fri Apr 16 09:39:35 UTC 2010 i686 GNU/Linux
Ubuntu 10.04 LTS
Welcome to Ubuntu!
 * Documentation: https://help.ubuntu.com/
System information as of Tue May 4 08:41:22 EDT 2010
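A common cause is the MOTD being printed twice: once by sshd itself and once by PAM's pam_motd module. A hedged check, and one way to disable one of the two:

# See whether both mechanisms are enabled:
grep -i printmotd /etc/ssh/sshd_config
grep pam_motd /etc/pam.d/sshd

# If they are, set "PrintMotd no" in /etc/ssh/sshd_config (or comment
# out a pam_motd line in /etc/pam.d/sshd), then:
sudo service ssh restart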
I didn't set up our old system, and that server has since been retired due to hardware failure, but we have some new web servers and a new Postgres database server in place which I need to monitor. Some items include database size, transactions per second, I/O stats, and memory monitoring, or as the DBA put it, "anything I can get". We used Cacti in the past for these items, but I didn't do the setup, and I'm wondering whether that is what is recommended, or something else. I am looking at GroundWork open source, which includes Nagios, etc., but figured I would consult the forums, since all these servers are running Ubuntu Server 9.x and 10.x.
Just wondering if there is a way to show the stats of the hard drive(s) in the computer, mainly the current read and write transfer rates. Similar to the network graphs that show the download and upload speeds, but for the hard drive(s) in the computer. I know you can monitor the hard drive temperature, but I haven't seen anybody monitor these stats for the hard drive.
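A hedged suggestion: the iostat tool from the sysstat package reports per-device read/write throughput, refreshed at whatever interval you choose:

sudo apt-get install sysstat
# Per-device stats in kB/s, refreshed every second:
iostat -d -k 1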
A small "mom and pop" WISP would like to provide account usage information to customers. Basically, when a person connecting to the WISP's web site is a customer with an IP address from within the WISP's subnets, a link would appear on the web page where customers could read total bandwidth usage (daily, weekly, monthly, and yearly totals and averages) and their public IP address. Information could include the top five bandwidth URLs visited; graphs or charts of usage; and usage during specific periods, such as business hours (8AM-5PM), evening hours (5PM-10PM), night (10PM-8AM), and weekends (10PM Friday-8AM Monday).
The WISP has installed cricket (http://cricket.sourceforge.net) and rrdtool (http://oss.oetiker.ch/rrdtool). The next trick is to grab and format the data for customers. I'm not looking for answers like "look at xyz package." Helpful responses will include a rudimentary outline to solve the problem. That is, "xyz package" might indeed be what the WISP needs, but some guidance on how to use xyz is needed to move down the road. I have no experience with this type of thing, and I appreciate responses from people who are experienced.
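As a rudimentary outline: cricket is already writing per-target counters into RRD files, so a small cron or CGI script can pull graphs straight out of them with rrdtool and publish them under the web root. A hedged sketch; the RRD path and the data-source name "traffic_in" are assumptions about how cricket was configured, not known values:

#!/bin/bash
# Generate a weekly usage graph for one customer (IP passed as $1).
CUSTOMER_IP="$1"
RRD="/var/lib/cricket/${CUSTOMER_IP}.rrd"    # assumed location

rrdtool graph "/var/www/usage/${CUSTOMER_IP}-week.png" \
    --start -1w \
    --title "Weekly usage for ${CUSTOMER_IP}" \
    DEF:in="${RRD}":traffic_in:AVERAGE \
    LINE1:in#0000FF:"bytes in"

The per-period totals (business hours, weekends) would come from "rrdtool fetch" over the matching time windows, summed up in the same script.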
I am on Fedora 13 and have a Radeon HD 3870 graphics card, using the standard open source graphics driver. I am doing some video editing and watching some large movie files (5GB+), so I would like to see if my graphics card is actually doing anything to help. I'm up against a brick wall, though: I can't get the GPU usage, memory, or temperature. It should be doing something through GPU hardware video decoding with motion compensation, etc., but I can't tell, because I can't find any way to test it. I've enabled direct rendering in MPlayer, but I don't know if it is doing anything, as I have no way to test.
Is there a way to know how much data I have received and sent to date, since I installed Mint 9? I want a method that counts the total upload and download stats since I installed my Linux. System Monitor is good for only one session; once you restart the PC, it's all gone.
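A hedged suggestion: vnstat keeps persistent per-interface traffic counters in its own database, so the totals survive reboots. It can only count from the moment its database is created, though, not retroactively since the install:

sudo apt-get install vnstat
# Create/update the database for your interface (eth0 is a placeholder):
vnstat -u -i eth0
# Later, show monthly totals:
vnstat -m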
I'm planning to write a bash script that will produce some web stats reports, and I'm stuck at the beginning because I don't know how to read a directory's contents, put everything in a variable, compare the variable's value with the current date, and go further. More specifically:
I have /var/apache/log/. Here I have access logs by date (like access.log.24.06.2010.gz and so on).
How can I automatically zgrep (in a bash script) the previous day's .gz?
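A minimal sketch, assuming GNU date and the dd.mm.yyyy naming shown above; the 'GET' pattern is a placeholder for whatever the report actually needs:

#!/bin/bash
LOGDIR=/var/apache/log
# GNU date can compute yesterday's date directly:
YESTERDAY=$(date -d yesterday +%d.%m.%Y)
# zgrep searches inside the compressed log without unpacking it:
zgrep 'GET' "${LOGDIR}/access.log.${YESTERDAY}.gz"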
I'm working on an application used for backup/archiving. It can archive contents of block devices and tapes, as well as regular files. The application stores data in hard-packed, low-redundancy heaps, with multiple indexes pointing out uniquely stored (shared) fractions in the heap.
The application supports taking and reverting to snapshots of total storage on several computers running different OSes, as well as simply archiving single files. It uses Hamming code diversity to defeat disk rot, instead of using RAID arrays, which have proven to become pretty much useless once the arrays climb over some terabytes in size. It is intended to be a distributed CMS (content management system) for a diversity of platforms, with a focus on secure storage/archiving. I have a Unix shell tool that acts like gzip, cat, dd, etc. in being able to pipe data between applications.
Example:
dd if=/dev/sda bs=1b | gzip -cq > my.sda.raw.gz
The tool can handle different files in a struct array, like:
Is there a better way of getting the file name of the redirected file (respecting the fact that there may not always exist such a thing as a file name for a redirection pipe)? Should I work with inodes instead, and then take a completely different approach when porting to non-Unix platforms? Why isn't there a system call like get_filename(stdin)?
If you have any input on this, or some questions, then please don't hesitate to post in this thread. To add some off-topic to the thread, here is a performance tip: when doing data shuffling on streams, avoid just using some arbitrary record length (like 512 bytes). Use stat() to get the recommended block size in stat.st_blksize, and use copy buffers of that size to get optimal throughput in your programs.
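A Linux-only sketch of both ideas from the shell; /proc/self/fd doesn't exist on non-Linux platforms, so a port would indeed need a different approach (e.g. falling back to fstat on the descriptor and accepting that anonymous pipes have no name):

#!/bin/bash
# Print the name of whatever is attached to stdin: a path for a regular
# file redirection, or something like "pipe:[12345]" for a pipe.
readlink /proc/self/fd/0

# Print the recommended I/O block size (st_blksize) for a file;
# /etc/hostname is just a placeholder target:
stat --format=%o /etc/hostname

Run as "./whatis-stdin.sh < some.file" it prints the path of some.file; run at the end of a pipeline, it prints the pipe's inode tag instead.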
I am able to try Ubuntu, and everything works fine until I try to install. The menu comes up to the first page, where it tells you to plug in your machine and make sure there is enough disk space and network connectivity. When I hit Next on this page, the mouse icon changes, but the next page never loads. The longest I let it hang there was 2 hours. I've tried multiple times with the same result.
I'm running from a flash drive on an ASUS Eee PC 1001P-PU17.
10.04, 64-bit. In Firefox, the size of the fonts varies greatly from site to site. Some are too small to read, others huge. Some headings and menus overlap. Screen size: 1152 x 864.
My settings are:
Proportional: Serif, 14
Serif: Times New Roman
Sans serif: Arial
Monospace: Courier New, 14
On FC11 64-bit with the Adobe Flash plugin for Linux installed, I see segfault errors from "npviewer" in /var/log/messages. The only browser I have tried yet, Firefox, has glitches every now and then. Sometimes it shows the title of a page in a tab, but the page is blank. This can even happen when I try the Google main page. Is it true that npviewer has something to do with Adobe Flash? Is there a way to fix the problem? If it is caused by Adobe Flash, is there a different plugin that will replace the Adobe Flash player?