Server :: Generate Very Large Files And Simulate A Consistent Throughput To The LUN

Jul 6, 2010

I have a test RHEL5 box sitting in a brand new Dell blade rack on a PowerEdge M610, with a lovely Emulex OCm10102-F-M FCoE card connected to a Cisco Nexus 5000 switch. The whole setup is extremely new (the cards only recently became available for purchase). We've finally worked with Emulex to get the cards functional, and we are ready to do some stress testing against the SAN. My question now is: is there a good tool I can use to generate a large amount of traffic to a LUN? The Wintel team used a Windows-only tool that showed an average of 6 gigs/second throughput, so I need something that can generate very large files and simulate a consistent throughput to the LUN. I found iozone, but I'm having a devil of a time with it.
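One possible approach, sketched here with plenty of assumptions: dd can push sustained sequential traffic straight at the LUN while bypassing the page cache. The device path below is hypothetical (use the real multipath device), and writing to it destroys any data on the LUN.

Code:
# Assumed device name -- replace with the actual LUN device. Destructive!
LUN=/dev/mapper/mpath0
# Sustained sequential write, bypassing the page cache:
dd if=/dev/zero of=$LUN bs=1M count=102400 oflag=direct
# Sustained sequential read back:
dd if=$LUN of=/dev/null bs=1M count=102400 iflag=direct

A tool like fio can drive more realistic mixed workloads if dd proves too simplistic, but the dd pair above is often enough for a raw throughput check.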

View 2 Replies



Server :: Website Which Generate Alot Of Core Files?

Jan 26, 2011

I have a server hosting a website that generates a lot of core files, and over time these core files take up a lot of space, so I decided to stop them. In WHM the core dump option is disabled, and I used the following command:

Code:
ulimit -c 0

but core files are still being generated. I also tried the method at http://www.cyberciti.biz/faq/linux-disable-core-dumps/ and that didn't stop the core dumps either. My question is: how can I permanently stop core files from being generated, either for all users or for a specific user?
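Worth noting that ulimit -c 0 only affects the shell it is run in. A hedged sketch of the usual system-wide settings (run as root; paths are the standard ones):

Code:
# Hard limit of 0 for core files for all new login sessions:
echo '*  hard  core  0' >> /etc/security/limits.conf
# Stop setuid binaries from dumping, and discard anything that still dumps:
echo 'fs.suid_dumpable = 0' >> /etc/sysctl.conf
echo 'kernel.core_pattern = /dev/null' >> /etc/sysctl.conf
sysctl -p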

View 1 Replies View Related

Ubuntu :: Server Crashes When Transferring Large Files

Feb 6, 2011

Every time I attempt to transfer a large file (4 GB) via any protocol, my server restarts. On the rare occasions when it doesn't restart, it spits out a few error messages saying "local_softirq_pending 08" and then promptly freezes. Small files transfer fine.

Relevant information:

Ubuntu server 10.10
Four hard drives in RAID 5 configuration
CPU/HD temperatures are within normal range

View 7 Replies View Related

Red Hat / Fedora :: Calculating The Throughput Of A Server?

Feb 21, 2011

I have read many articles on hdparm for calculating disk read and write speeds, and some on interface and CPU limits. But is there a structured way of calculating the maximum throughput of a server including all of its subsystems (storage, CPU, network, memory and so on), so that I can create a script to run on a newly installed Linux machine and calculate its maximum throughput?
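There does not seem to be a single tool for this, but a rough per-subsystem script is easy to sketch. The disk device, test file path and iperf peer address below are assumptions, and the network check needs an iperf server running on the far end.

Code:
DISK=/dev/sda
echo "== Disk =="
hdparm -tT $DISK                                  # cached and buffered read speed
dd if=/dev/zero of=/tmp/ddtest bs=1M count=1024 oflag=direct 2>&1 | tail -1
rm -f /tmp/ddtest
echo "== CPU =="
grep -c ^processor /proc/cpuinfo
echo "== Memory =="
free -m
echo "== Network (assumes iperf -s on 192.168.1.10) =="
iperf -c 192.168.1.10 -t 10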

View 4 Replies View Related

Server :: Enable Buffered I/O For Increased Disk Throughput?

Sep 1, 2010

How do I enable buffered I/O for increased disk throughput on Linux?
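For what it's worth, Linux buffers ordinary file I/O through the page cache by default; it is direct I/O that has to be requested explicitly. A quick way to compare the two, as a sketch with throwaway test files:

Code:
# Buffered write (goes through the page cache -- the default behaviour):
dd if=/dev/zero of=/tmp/buffered.img bs=1M count=1024
# Direct write for comparison (bypasses the cache):
dd if=/dev/zero of=/tmp/direct.img bs=1M count=1024 oflag=direct
rm -f /tmp/buffered.img /tmp/direct.img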

View 3 Replies View Related

Networking :: Tuning For High Throughput Reverse-proxy Server?

Jan 18, 2011

I have an enormous quad-core machine with 16 GB RAM and dual gigabit NICs. It used to run MySQL, but we have upgraded the whole database infrastructure, so now this server is left floating. I had the great idea of turning it into a reverse proxy (using Apache mod_proxy), and it really handles a ton of requests. But I have a feeling we are not getting the most out of what it can offer.

Our traffic consists of a few thousand very small (less than 10 byte) AJAX calls per second, and frequently I find we are running out of kernel-allocated network stack to handle all the requests. We often get the kern.log warning "possible SYN flooding on port 80. Sending cookies." and other messages like it. Obviously we are not actually being SYN flooded; we just have very high demand.

So far I have found a few kernel tuning guides that tell the kernel to allocate more of the base system memory for networking, but every guide I have found is aimed at increasing performance between WAN links (direct backbones between offices, etc.), usually with very large file sizes as the priority. One such (great) write-up is here:

cyberciti.biz/faq/linux-tcp-tuning/

I was hoping some people could provide further input, for example along the lines of disabling nf_conntrack (to speed up socket set-up/tear-down time) or anything else that will speed up a high-throughput proxy like mine. Any links to studies or benchmarks comparing different configurations or hardware get extra points!
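A hedged starting point for this kind of workload, as additions to /etc/sysctl.conf; the values are illustrative only and need testing against real traffic, and unloading conntrack only works if no iptables state rules depend on it.

Code:
# Many short-lived connections: bigger accept/SYN backlogs, wider port range,
# faster recycling of TIME_WAIT sockets.
net.core.somaxconn = 4096
net.ipv4.tcp_max_syn_backlog = 8192
net.ipv4.ip_local_port_range = 1024 65535
net.ipv4.tcp_tw_reuse = 1
net.ipv4.tcp_fin_timeout = 15
net.core.netdev_max_backlog = 16384
# Apply with: sysctl -p
# Optionally, if no stateful firewall rules are in use: modprobe -r nf_conntrack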

View 7 Replies View Related

Ubuntu :: Command With The -r Option To Compare A Large Number Of Files And Files In Subdirectories

Jun 16, 2011

I am using the diff command with the -r option to compare a large number of files, including files in subdirectories. My main interest is to find out which files have been changed, not what the actual changes are, and since a lot of files have changed it would be much easier to view only the file names. Is there an option for diff that does this, or does a similar tool/command exist that could do the job?
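diff itself has a brief mode that prints only which files differ; an rsync dry run can do much the same. Directory names below are placeholders, and --out-format needs rsync 3 (older releases call it --log-format).

Code:
# Report only the names of files that differ (or exist on one side only):
diff -rq dir1 dir2
# Roughly equivalent with rsync (dry run, checksum comparison):
rsync -rcn --out-format='%n' dir1/ dir2/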

View 1 Replies View Related

OpenSUSE :: Dolphin Losing Files When Copying Many Files Or Large Folders?

Feb 14, 2010

I've discovered that Dolphin seems to lose random files when copying many large folders.

I first noticed this a few months ago when I tried to copy my music library from one folder to another on the same HDD. It consisted of around 600 folders and 6500 files. During the copy there were no errors but after the copy I found that some of the newly copied folders were missing files. I put it down to human error or a glitch.

Yesterday I tried to copy 13 folders containing rips of some of my DVDs. Each folder basically had one film of either 700MB or 1.4GB. Again no errors showed up during the copy but I found 3 of the newly copied folders were empty.

It's not so critical with music or films but I can't afford to lose work data like this.

Has anyone experienced or seen a similar problem with Dolphin? I'm going to have to do some more extensive testing but this is not good.

The first time I noticed the problem I was running KDE4.3.4 (I think) and now the latest was with KDE4.4.0.

View 9 Replies View Related

Software :: Differentiate Two Large Text Files Using Shell Script / Files Are Like Below?

Jan 20, 2009

I want to automate this using a script. How can I do it? The files look like the samples below.

File1:
s.no# 1 name:aaaaaa
city:abcd

[code]...
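Since the full record layout is only partly shown in the post, the following is a very rough, line-oriented sketch: sort both files and let comm print only the lines that differ between them. If the records span multiple lines, they would first need to be normalised to one record per line.

Code:
# Assumes one record per line after normalisation -- adjust to the real layout.
sort File1 > /tmp/f1.sorted
sort File2 > /tmp/f2.sorted
comm -3 /tmp/f1.sorted /tmp/f2.sorted > differences.txt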

View 1 Replies View Related

General :: How To Generate Report - Sa* And Sar* Files ?

May 29, 2011

I have collected the sa* and sar* files for the past two weeks, and I need to generate a report from these files. How can I do that? I am using CentOS 5.5; please suggest a tool or a command for this.
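One possible approach, assuming the binary data files are in the default /var/log/sa location on CentOS 5: loop over them with sar -f and dump the full output to a single text report. A graphing front end such as kSAR can also read these files, if something visual is preferred.

Code:
# Full report for each collected binary sa file:
for f in /var/log/sa/sa[0-3][0-9]; do
    echo "===== $f ====="
    sar -A -f "$f"
done > sar-report.txt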

View 6 Replies View Related

Ubuntu :: Generate Compressed Files In Zip Format ?

Oct 30, 2010

I was trying to figure out how to generate compressed files in zip format and searched on here. The search produced a list of forum entries on the topic, but all of the instructions were about how to do it in the terminal: how to download obscure programs, install them from the terminal, then run them from the terminal with all these arcane sets of switches and parameters. Eesh.

The capability comes with Ubuntu, after all. In the case of zipping files, all you have to do is go to the File Manager, find the file(s), select them, right-click, and choose Compress with the file type set to zip. It's so simple.
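For anyone who does still want the terminal equivalent, it is a one-liner; the archive name and file list below are just placeholders.

Code:
# Command-line equivalent of the right-click Compress option:
zip -r archive.zip file1.txt somefolder/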

There have been a number of other tasks where I wind up spending hours figuring out how to implement the advice offered in these forums through the terminal. The folks who offer the advice are often so good at it that they leave out steps that are obvious to them but take a lot of work for somebody less skilled to figure out. After crawling through broken glass to get the job done, usually messing something up so it's not quite right once I get it going, I figure out how to do it through the GUI and find it takes a fraction of the effort.

View 9 Replies View Related

Ubuntu :: Generate A List With Files And Folders?

Feb 2, 2011

Is there an application available to generate a list of the files and folders at a given location, such as a whole hard drive or a single folder? The list could be in any format; even a text file would be just fine.
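No extra application is strictly needed; either of these produces a plain text listing (the starting path is a placeholder, and tree is in the repositories if a nicer layout is wanted).

Code:
# Recursive listing of everything under a starting point, saved as text:
find /path/to/folder > filelist.txt
# Or, an indented tree-style listing if the tree package is installed:
tree /path/to/folder > filelist.txt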

View 3 Replies View Related

Software :: Automatically Generate Text Files ?

Sep 17, 2010

I have a personal server that is password protected. I convert my movies and make it so that I can watch them through a browser whenever I want to. I have quite a few movies, sorted into different directories, and in each directory there is a WebM folder that contains the video file.

Here is what I want to do: scan all of the directories and subdirectories in /var/www/ for files with the extension webm.

When a file is found, go to the parent directory and create an html file with the same name as the video, only with an html extension instead of webm.

Automatically enter the following html code into the file with the file name matching the file that was found.

Code:

Is it possible to do this with a script? Is there a GUI program that can do this? I don't mind running the script every time after I convert a movie, but it would be nice if it could monitor the folders.
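It should be scriptable. Below is a hedged sketch following the layout described above; the HTML body is a placeholder because the actual snippet was not included in the post, and the skip-existing check is optional. For automatic monitoring, something like inotifywait or a cron job could wrap the same loop.

Code:
#!/bin/bash
# For each .webm under /var/www/, create <parent>/<name>.html next to the WebM folder.
find /var/www/ -type f -name '*.webm' | while read -r video; do
    dir=$(dirname "$(dirname "$video")")        # parent of the WebM folder
    name=$(basename "$video" .webm)
    html="$dir/$name.html"
    [ -e "$html" ] && continue                  # skip pages already generated
    cat > "$html" <<EOF
<html><body>
<video src="$(basename "$(dirname "$video")")/$name.webm" controls></video>
</body></html>
EOF
done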

View 11 Replies View Related

CentOS 5 :: Can't Get Kdump To Generate Dump Files?

Dec 7, 2010

I'm posting because I've read everything I can find on Google, in the forum and in the man page, and I still can't get it to work. I did read the FAQ, and I hope I have adhered to it. I've tried several things and don't remember exactly everything I tried or in what order.

I've got several (12) HP ProLiant DL140 G3 servers running CentOS 5 that lock up about once a week. These are in a remote colo cage, so all I have access to is the built-in HP Lights-Out management interface, which includes a console, and ssh. I've been trying to get kdump set up to figure out what's going on. As an aside, if I run top on the console (via the management interface) the servers stay up for about a week; if I don't run top they crash within about 48 hours.

I've used yum update to get the latest available kernel (2.6.18-194.26.1.el5.x86_64) and installed the debuginfo and debuginfo-common RPMs from http://debuginfo.centos.org/5/x86_64/. I have a single directive in the /etc/kdump.conf file:

Code:
ext3 /dev/sda5
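Two things usually worth checking with kdump on RHEL/CentOS 5: that the kernel command line reserves crash memory (something like crashkernel=128M@16M) and that the kdump service reports itself as operational. A slightly fuller /etc/kdump.conf sketch, with directives as described in the kdump.conf man page and the dump partition as an assumption:

Code:
# Dump to the local ext3 partition, under /var/crash on that filesystem,
# with a filtered, compressed vmcore:
ext3 /dev/sda5
path /var/crash
core_collector makedumpfile -d 31 -c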

View 4 Replies View Related

Ubuntu Security :: Encrypt Files Using The Keys - Generate ?

Sep 8, 2010

I recently upgraded to Ubuntu 10.04. I love the Passwords and Keys application, but I was somewhat surprised at the lack of a context-menu option in GNOME to encrypt a file.

In general, I cannot find out how to encrypt files using the keys I generate. Maybe I'm missing something? I just thought that since Ubuntu comes with out-of-the-box key generation, it would have out-of-the-box encryption capabilities as well.
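The keys managed by Passwords and Keys are ordinary GnuPG keys, so the command line can use them directly; the recipient and file name here are placeholders. (The seahorse-plugins package is what usually adds the right-click Encrypt entry in Nautilus.)

Code:
# Encrypt to one of your own public keys (key ID or email, as listed by gpg --list-keys):
gpg --encrypt --recipient you@example.com somefile
# Decrypt later with the matching private key:
gpg --decrypt somefile.gpg > somefile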

I've read about Seahorse and other ways to add encryption; I'm just wondering whether Ubuntu does it natively. Right-click-and-encrypt would be a good idea to add to Brainstorm.

View 6 Replies View Related

Ubuntu :: Should Mediatomb Generate 3 Separate Of Huge Log Files With 5gb Of Data

Jul 4, 2011

I streamed video through my computer with MediaTomb yesterday. The problem is that now I have these huge log files, and I am running out of space (less than 1 GB left) as we speak. They're filled with ufw entries, but my question is:

I read somewhere about a program called logrotate that is supposed to keep logs from getting too big. Am I wrong about that, and should MediaTomb really generate 3 separate log files with 5 GB of data each for just 2 hours of streaming?
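logrotate is the right tool for capping these. A hedged sketch of a drop-in such as /etc/logrotate.d/mediatomb; the log path is an assumption and should be adjusted to wherever the big files actually live.

Code:
/var/log/mediatomb.log {
    size 100M
    rotate 7
    compress
    missingok
    notifempty
    copytruncate   # keep rotating even though the daemon holds the file open
}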

View 2 Replies View Related

Ubuntu Networking :: How To Generate Default Samba Configuration Files

Sep 1, 2011

I had some trouble with Samba, so I re-installed it. After I uninstalled Samba I noticed the old /etc/samba folder and files were left behind, so I deleted all of them. Then I installed Samba again; however, no /etc/samba files were created. How can I generate the default Samba configuration files?
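Two possible routes, sketched under the assumption that the packaged template lives in the usual Debian/Ubuntu location (worth verifying with dpkg -L samba-common):

Code:
# Option 1: copy the packaged template back into place:
sudo mkdir -p /etc/samba
sudo cp /usr/share/samba/smb.conf /etc/samba/smb.conf

# Option 2: purge and reinstall so the package scripts recreate the files:
sudo apt-get purge samba samba-common
sudo apt-get install samba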

View 4 Replies View Related

Hardware :: Print To File Printer - To Generate Postscript And PDF Files

Jan 12, 2010

My Ubuntu Karmic install has a very useful Print-to-File printer configured which allows me to generate PostScript and PDF files.

I have Windows XP running in VirtualBox and I want to be able to generate PDF files using this printer, but in the guest there appear to be no printers configured.
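One hedged way to approach this: install the cups-pdf virtual printer on the Ubuntu host, share it through CUPS, and add it in the XP guest as an IPP network printer (the queue name "PDF" and the host-only IP below are assumptions; the guest also needs a generic PostScript driver).

Code:
# On the Ubuntu host:
sudo apt-get install cups-pdf
sudo cupsctl --share-printers
sudo lpadmin -p PDF -o printer-is-shared=true
# In the XP guest, add a network printer pointing at something like:
#   http://192.168.56.1:631/printers/PDF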

View 1 Replies View Related

Fedora :: Generate Report Other Than Going Through All The Nagios Configuration Files And Putting It Together Manually?

Feb 23, 2010

I have an existing Nagios installation from before I started working here. It seems to be working well, but my boss has just asked for a report of which servers, and which associated processes/functions, are being monitored. Is there any way to generate this report other than going through all the Nagios configuration files and putting it together manually?
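One possible shortcut: Nagios writes a consolidated object cache at start-up, which is easier to mine than the hand-written configs. The path below is an assumption (check the object_cache_file setting in nagios.cfg), and the awk is only a rough sketch producing host/service pairs.

Code:
awk '$1 == "host_name" {h=$2}
     $1 == "service_description" {$1=""; print h "\t" $0}' \
    /usr/local/nagios/var/objects.cache | sort > monitored-services.txt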

View 3 Replies View Related

General :: Generate Random List And Determine Size Of Arbitrary Block Of Files In Dir?

Mar 4, 2010

I want to generate a temporary random list from a directory of files and then determine the size of an arbitrary block of files from this list (say 1-25 or 26-50) and add their names to a file along with some other info for each name. I can generate a random list with file sizes like this: ls -l | sort -R | cut -d " " -f 6, but I'm not sure how to add up the sizes of just a certain block of these files and save the file names at the same time.
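A hedged sketch of one way to keep size and name together, shuffle, then sum an arbitrary slice (here 1-25); it assumes file names without spaces and plain GNU tools.

Code:
# size and name per line, shuffled into a temporary random list:
ls -l | awk 'NR>1 {print $5, $NF}' | sort -R > /tmp/randomlist
# take lines 1-25, append them to the record file, and sum their sizes:
sed -n '1,25p' /tmp/randomlist | tee -a blockinfo.txt \
    | awk '{sum += $1} END {print "block size:", sum, "bytes"}'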

View 2 Replies View Related

OpenSUSE :: Burn Large Files With K3b Udf

Nov 3, 2010

The problem is with burning files greater than 4 GB onto dual-layer or BD disks: mkisofs crashes. I believe the problem occurs when this requires a UDF filesystem, and the disk does not get written to. As of yesterday the version is openSUSE 11.3, patched to date. k3b writes standard DVD disks OK. I am seeing a lot of search results saying this is because of cdrkit rather than cdrtools; can I replace it easily, and is that really the cause? I tried setting the k3b option to burn as UDF and got the same error. Other searches show k3b has fixed problems with these issues, so it appears to point to the underlying mkisofs. The full error log is in yesterday's post if it helps. I will also try on Ubuntu to see if it works there.
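As a possible workaround outside k3b, building the image by hand and then burning it sometimes sidesteps the front end; the directory and image names here are placeholders.

Code:
# Build a UDF image that allows files over 4 GB:
genisoimage -r -J -udf -allow-limited-size -o movies.iso dvdrips/
# Burn the finished image:
growisofs -dvd-compat -Z /dev/dvd=movies.iso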

View 1 Replies View Related

Software :: Cp To Have A Progress Bar For Large Files?

Mar 10, 2010

I always wanted cp to have a progress bar for large files. I came across this: [URL]... I just wonder, how could you install it as an Arch package? Is it possible?
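Packaging a patched cp would mean writing a PKGBUILD for it, but there are stock alternatives that already show progress; both of these are in the Arch repositories, and the paths are placeholders.

Code:
# rsync shows per-file progress and works as a local copy tool:
rsync -ah --progress source/ destination/
# pv pipes a file through a progress meter:
pv bigfile > /path/to/destination/bigfile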

View 9 Replies View Related

Software :: RPM Not Supporting Large Files?

Jan 5, 2010

I am currently using RPM version 4.4.2.3. It fails to build packages containing files larger than 2 GB or file sets larger than 3 GB, and I have files larger than 2 GB and sets larger than 3 GB. Is there a workaround, possibly a switch or option for RPM, that will get around this limitation?

View 8 Replies View Related

Slackware :: Get Ftp To Download Large Files?

May 29, 2010

I am using xfce desktop and terminal.

1) How do I check which FTP client Slackware 12.2 uses by default when I type:
$ftp

2) How do I get ftp to download large files (> 2 GB) from an FTP server such as elektroni.phys.tut.fi, or any other FTP server?
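A hedged sketch for both questions; the remote file path is a placeholder, and a client with large-file and resume support (lftp, or wget -c) tends to cope better with transfers over 2 GB than the stock ftp client.

Code:
# 1) See what the ftp command actually points to:
which ftp
readlink -f "$(which ftp)"
# 2) Fetch a large file with lftp, or resume an interrupted download with wget:
lftp -e 'get pub/bigfile.iso; bye' ftp://elektroni.phys.tut.fi
wget -c ftp://elektroni.phys.tut.fi/pub/bigfile.iso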

View 14 Replies View Related

General :: Downloading Very Large Files Via SFTP

Apr 1, 2011

I need to download some very large files (circa 75 GB) from a remote server via SFTP. I've been using SFTP from the command line on my Linux netbook. Around halfway through, the transfer stops and says "stalled". Can anybody recommend a reliable way to download these files?
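Since SFTP runs over SSH, rsync over the same SSH access (if the server allows it) is a resume-friendly alternative; the remote path is a placeholder.

Code:
# --partial (part of -P) keeps the partial file if the transfer stalls:
rsync -avP user@remote:/path/to/bigfile .
# Re-running the same command picks up from the partial file instead of starting over.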

View 1 Replies View Related

Ubuntu :: Rsync Really Slow On Large Files?

Mar 1, 2010

I have Ubuntu on both my laptop and desktop machines, both are connected to the same network. I back up the laptop to the desktop by running the following on the laptop:

Code:
rsync -avv --stats /home/alisdt alisdt@xxx.xxx.xxx.xxx:/home/alisdt/laptop_backup

(with the IP address of the desktop instead of the many x's, obviously). Whenever rsync hits a large file (greater than a few MB), the network use rapidly drops to ~60 KB/s (that's kilobytes, not bits). When I copy the same file to the same place using scp, I get > 500 KB/s throughout the transfer. Things I've tried:

* mounting the desktop home dir on the laptop using SSHFS -- a simple file copy is fast, rsync is still slow
* ditto with NFS
* rsync --whole-file option, in case the delta-transfer algorithm was choking on large files
* rsync --inplace option
* HPN-SSH (http://www.psc.edu/networking/projects/hpn-ssh/) to enable dynamic windows and unencrypted bulk transfer, just in case it was some ssh bottleneck.

I think it's either an rsync application problem, or a network problem that is only affecting rsync. Any ideas, or other things I could try to debug? In case it's relevant, I'm using 9.04 on both machines. (A standing bug prevents me from upgrading the laptop, and I haven't bothered to upgrade the desktop.)
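One more hypothetical test that might help isolate the transport: a single-file rsync with whole-file copying and ssh compression explicitly off (arcfour is a cheaper cipher), while watching CPU use on both ends with top during the transfer. The file name is a placeholder.

Code:
rsync -avW --progress -e 'ssh -o Compression=no -o Ciphers=arcfour' \
    /home/alisdt/somebigfile alisdt@xxx.xxx.xxx.xxx:/tmp/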

View 3 Replies View Related

Ubuntu :: Moving Large Amounts Of Files

Mar 6, 2010

I am trying to move a large number of files (over 30k, 86 GB in total) to another HDD, but I get an "Argument list too long" error. I tried rsync, cp and mv and still get the same error.
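That error comes from the shell expanding a wildcard into more arguments than the kernel allows, not from the tools themselves. Two hedged alternatives that avoid the expansion; source and destination paths are placeholders.

Code:
# Let find feed the names instead of the shell glob:
find /source/dir -mindepth 1 -maxdepth 1 -print0 | xargs -0 mv -t /mnt/newdrive/
# Or let rsync walk the tree itself and delete sources after copying:
rsync -a --remove-source-files /source/dir/ /mnt/newdrive/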

View 1 Replies View Related

Ubuntu :: Cups: Large Files Do Not Print?

Nov 29, 2010

I have a problem with CUPS on a Lucid/64 machine:

"Unable to write print data: Broken pipe"

The PDF file to print has a size of 4.7 MB. After sending the file to the printer, the size of the file is more than 18 MB.

We use a Xerox WorkCentre 7232, which is connected to CUPS via socket://ip_adress:9100. The same configuration had been working fine for several years with Hardy.

CUPS refuses to print large files; when the print file is split up, everything works fine.
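One thing worth ruling out, as a hedged suggestion only: cupsd.conf has a directive that caps the size of submitted jobs, and a non-zero value there would explain large jobs failing while split-up jobs succeed.

Code:
# In /etc/cups/cupsd.conf -- 0 means no limit on job size:
MaxRequestSize 0
# Then restart CUPS:
sudo service cups restart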

View 4 Replies View Related

General :: Creating Random Large Files?

Aug 27, 2010

How can I randomly write / create a 1 GB file in bash to test disk / network I/O? I was told I could use the dd command, but I don't know whether there are better ways, or what the dd command should look like.
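A minimal sketch of the dd approach; the output file name is a placeholder. Random data is slower to generate but avoids any compression effects, while zeros are quicker for a plain write-speed test.

Code:
# 1 GB of pseudo-random data:
dd if=/dev/urandom of=testfile.bin bs=1M count=1024
# 1 GB of zeros (much faster to generate):
dd if=/dev/zero of=testfile.bin bs=1M count=1024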

View 7 Replies View Related

General :: Tar Not Working With Large Number Of Files?

Dec 6, 2010

In the middle of my script I am using the tar command to tar some 1000 images; the size of each image is 5 MB.

All the images are passed as arguments, as in tar -cvf file.tar <all images as argument>,

but my tar file file.tar does not contain all the images.
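This usually happens because the shell's argument list overflows, silently truncating what tar receives. A hedged alternative that feeds the names through a pipe instead; the directory and *.jpg pattern are assumptions.

Code:
# Let GNU tar read the file list from find rather than the command line:
find /path/to/images -name '*.jpg' -print0 | tar -cvf file.tar --null -T -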

View 4 Replies View Related






