General :: Creating Random Large Files?

Aug 27, 2010

How can I create a 1 GB file filled with random data in bash, to test disk/network I/O? I was told I could use the 'dd' command, but I don't know if there are better ways, or what the 'dd' command would look like.
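
A minimal sketch with dd; the output file name is a placeholder. Reading /dev/zero is fast and exercises raw write speed, while /dev/urandom produces incompressible random data at the cost of CPU time:

Code:
# 1 GB of zeros: quick sequential-write test
dd if=/dev/zero of=testfile.bin bs=1M count=1024
# 1 GB of pseudo-random data: slower, but incompressible
dd if=/dev/urandom of=testfile.bin bs=1M count=1024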

View 7 Replies



Fedora :: Creating A DOS Based USB Flash Using A Large (2GB) USB Key?

Oct 5, 2009

I'm looking for an up-to-date guide for creating a DOS-based bootable USB flash drive using a large (2 GB) USB key. All the guides I can find are old and no longer work.

View 14 Replies View Related

OpenSUSE :: Creating Large Partition In Hard Drive?

Apr 14, 2010

I am so sorry, but this is not for me. I can't make this work. I want to install Windows XP back on my PC; I just give up with Linux, as I lack the expertise to do anything here.

View 2 Replies View Related

General :: Creating A Script To Move Or Copy Files Into Multiple Directories Below The Files?

Aug 25, 2009

How can you create a script to move or copy files from a main directory into multiple directories below that main directory?
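
A hedged sketch, assuming the intent is to copy every file at the top of the main directory into each of its immediate subdirectories (the path is a placeholder):

Code:
#!/bin/bash
main=/path/to/main
for dir in "$main"/*/; do
    # copy each regular top-level file into this subdirectory
    find "$main" -maxdepth 1 -type f -exec cp {} "$dir" \;
done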

View 7 Replies View Related

General :: Downloading Very Large Files Via SFTP

Apr 1, 2011

I need to download some very large files (circa 75 GB) from a remote server via SFTP. I've been using SFTP via the command line on my Linux netbook. Around halfway through, the transfer stops and says "stalled." Can anybody recommend a reliable way to download these files?
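
One hedged option is rsync over SSH, which still runs on port 22 but can resume a stalled transfer where it left off (host and path are placeholders):

Code:
# --partial keeps a partly-downloaded file so a re-run picks up from it
rsync --partial --progress -e ssh user@remote:/path/to/bigfile.iso .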

View 1 Replies View Related

General :: Tar Not Working With Large Number Of Files?

Dec 6, 2010

In the middle of my script I am using the tar command to archive some 1000 images, each about 5 MB in size.

All the images are passed as arguments, as in: tar -cvf file.tar <all images as arguments>

but the resulting tar file, file.tar, does not contain all the images.
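
The shell's command line has a fixed length limit, so a huge expanded argument list can fail outright or, depending on how the script builds it, silently drop names. A hedged workaround passes the file names through a list file instead (the path and *.jpg pattern are assumptions):

Code:
find /path/to/images -name '*.jpg' > imagelist.txt
# -T reads member names from the list, bypassing the argument limit
tar -cvf file.tar -T imagelist.txt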

View 4 Replies View Related

General :: Copying Large Number Of Files From One Directory To Another

Feb 10, 2010

I have a directory containing around 2.8 lakh (280,000) files. I want to move them to another directory. If I use cp or mv then I get an 'argument list too long' error. If I write a script like

for file in $(ls); do
cp "$file" {destination}
done

then, because of the ls command, its performance degrades. How can I do this?
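
Two hedged alternatives avoid both the argument-length limit and the per-file loop (source and destination paths are placeholders):

Code:
# find hands the names to mv in batches, so no argument list grows too long
find /source/dir -maxdepth 1 -type f -print0 | xargs -0 mv -t /dest/dir/
# or let rsync walk the directory itself and delete each source file it copies
rsync -a --remove-source-files /source/dir/ /dest/dir/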

View 7 Replies View Related

General :: Transferring Large Files Using Scp With CPU And Memory Considerations?

Oct 5, 2010

I want to transfer an arbitrarily large file (say >20GB) between 2 servers. I have several considerations:

Must use port 22 (ssh) because of firewall restrictions
Cannot tax the CPU (production server)
Memory efficiency
Would prefer a checksum check but that could be done manually
Time is not of the essence

There are two scenarios: (1) Server A and Server B are on the same private network (sharing a switch), so data security is not a concern; (2) Server A and Server B are not on the same network and the transfer will be via the public internet, so data security is a concern. My first thought was using nice on an scp command with a non-CPU-intensive cipher (blowfish?). But I thought I'd refer to the SU community for recommendations.
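
A hedged sketch of that first thought; the cipher must be one the installed OpenSSH supports (blowfish-cbc existed on older releases, aes128-ctr is a common low-cost choice), and the host and paths are placeholders:

Code:
# lowest CPU scheduling priority, cheap cipher, standard port 22
nice -n 19 scp -c aes128-ctr /data/bigfile.img user@serverb:/data/
# manual integrity check, run on both ends and compared by eye:
sha256sum /data/bigfile.img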

View 1 Replies View Related

General :: Copying Large Number Of Files In Windows?

Mar 15, 2011

I am facing a problem copying a large number of files, 18 lakh (1,800,000), from my personal hard disk to another hard disk. Each file is very small and the whole folder is around 3.95 GB. Copying the files using the copy function in Windows is frustrating, and I am not even able to compress them; it gives me an error that the data is not readable. The other problem is that I am not able to open this drive in Linux; it shows an error telling me to run a disk check in Windows, and Windows disk check is also not able to repair the drive and goes into some unsolvable mode. Is there any way to open a disk with errors in Windows, and if not, any way I can copy the data faster? The error reads roughly: "Disk labeled EDU is corrupt, go to Windows and run chkdsk /f there and reboot into Windows 2 times."

View 3 Replies View Related

General :: Share A Large Number Of Files Into Chroot Env?

Aug 17, 2010

I understand that chroot is usually used to provide security, however, for my issue, security is a big don't care. I am very new to using chroot and don't fully understand how the chroot'd env works.

Problem: I am trying to use a vendor-supplied cross-compile environment. The environment runs as a chroot'd env and works just fine. I have a large number of additional modules that I wish to compile in the chroot'd environment. FYI, these modules are also (successfully) compiled for other targets not using chroot'd envs. Copying the source files into the chroot environment is not an option (I don't have hours to wait for copies to finish, and it would break the make system). Having them live in the environment is also not an option (the chroot build is a tiny part of the build process and we cannot revamp our entire source tree to accommodate it).

I am looking for a way to have the compiler in the chroot'd env access a path that is outside of the env, typically higher up in the same path that holds the chroot'd env. I have tried soft links (they don't work as expected). Hard links only work for single files, and there are tens of thousands of files that would need to be linked. I am not sure how I would go about exporting the additional files and then mounting the exported files in the chroot'd env (or if that would even work).
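
A bind mount is one hedged way to do exactly this: it makes a directory living outside the chroot visible at a path inside it without copying or linking anything, and unlike a symlink it is resolved by the kernel, so it keeps working after the root changes (both paths are placeholders):

Code:
mount --bind /work/src /opt/vendor-chroot/mnt/src
# equivalent permanent entry for /etc/fstab:
# /work/src  /opt/vendor-chroot/mnt/src  none  bind  0  0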

View 2 Replies View Related

General :: Searching Text Files For Large Numbers?

Dec 31, 2010

I am looking for a way to search for large numbers in text files and print the nearby lines.

For example if I had a text file like:

Event: 11
blah: 3
blah: 41 bleh: 19
Event: 2
blah: 31

[Code].....
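
A hedged sketch, assuming "large" means above some numeric threshold (30 here, picked to fit the sample) and that a line of context is wanted around each hit; the file name is a placeholder:

Code:
# GNU grep: any number of two or more digits, one line of context either side
grep -C1 -E '[0-9]{2,}' events.txt
# awk: print each line that has a numeric field above the threshold
awk '{ for (i = 1; i <= NF; i++) if ($i + 0 > 30) { print; next } }' events.txt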

View 1 Replies View Related

General :: Uploading And Downloading Large Files Between Clients?

Jun 5, 2011

I am looking for a file-sharing program to install on my dedicated server that will allow me to upload large MP3 files and allow my clients to download them. These files are recordings of counseling sessions for families who are seeking help for their children.

What I am looking for is similar to the system this company uses [URL].

View 4 Replies View Related

General :: Manually Sort Random Text Files?

Jun 6, 2010

I have about 1000 text files and I need to view each one and move it to a folder if it's the correct one. I can only do basic sorting by length/size, and I can't grep because the text is random. How can I do this besides manually opening and saving each one in gedit? I'm on Ubuntu Linux. I've already done all the sorting I can based on size, word count, greps, date, etc. This is what's left over. I'm trying to find an easy way to view each file and then save or ignore it.
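
A hedged sketch of an interactive loop: page through each file, then decide whether to move it (the sorted/ folder and the *.txt pattern are assumptions):

Code:
#!/bin/bash
mkdir -p sorted
for f in *.txt; do
    less "$f"
    read -p "Move $f to sorted/? [y/N] " answer
    [ "$answer" = "y" ] && mv "$f" sorted/
done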

View 6 Replies View Related

General :: Windows - How To Batch Rename Files With A Random Name

Jun 30, 2011

I have a bunch of photos with varying names. I want to give each photo a random name(*). How do I do that? (*) I'm going to put them on a digital photo frame that can't shuffle.
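
A hedged bash sketch, assuming JPEG files; each gets an 8-character random name, and mv -n ensures an unlucky name collision doesn't overwrite anything:

Code:
for f in *.jpg; do
    mv -n "$f" "$(tr -dc 'a-z0-9' < /dev/urandom | head -c 8).jpg"
done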

View 4 Replies View Related

General :: Nas - Most Effective Backup Software -> When Dealing With Large Numbers Of Files?

Jul 18, 2010

I have two NASes. I work off of one, and the other is used as a backup. As I have it set up now, it's slow: running a backup takes a week. Even for 7 TB, with 1,979,407 files, this seems a bit outlandish, particularly as both systems are RAID-5 and the network is all gigabit. I've been digging about in the rsync man pages, and I really don't understand what differentiates the various topologies. Right now, all the processing is being done on the backup NAS, which has the main volume from the main NAS mounted locally over SMB. I suspect that the SMB overhead is killing me, particularly when dealing with lots of files.

I think what I need is to set up rsync on the main NAS as a daemon and then run a local rsync client to connect to it, which would hopefully allow me to completely avoid the whole SMB-in-the-middle affair. But aside from mentioning that it's there, I can find very little information on why one would want to use daemon mode for rsync.
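
A hedged sketch of daemon mode, with the module name an assumption: the serving side reads a small rsyncd.conf, and the client addresses the module with rsync:// (or host::module) syntax, so no SMB mount is involved:

Code:
# minimal /etc/rsyncd.conf on the main NAS:
#   [storage]
#       path = /raid/data/Storage
#       read only = yes
# start the daemon with:  rsync --daemon
# then pull from the backup NAS:
rsync -a --progress rsync://10.1.1.10/storage/ /mnt/Storage/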

Here's my current rsync command line: rsync -r --progress --delete /cifs/Thecus/ /mnt/Storage/. Is there a better way/tool to do this? Edit: OK, to address the additional questions: The "Main" NAS is a Thecus N7700. I have additional modules installed that give me SSH, and it has rsync, but it's not in the $PATH, and I haven't figured out how to edit the local $PATH in a way that persists between reboots. The "Backup" NAS is a DIY affair, built around a 1.6 GHz VIA mobo with an Adaptec hardware RAID card. It's running CentOS 5 with a full desktop environment. It's the hardware I'm running rsync from. (Gigabit is through an additional PCI card.)

Further edit: OK, got rsync over SSH working (thanks, lajuette!). I had to do a bit of tweaking on my command line; I'm running rsync with the args: rsync -rum --inplace --progress --delete --rsync-path=/opt/bin/rsync sys@10.1.1.10:/raid/data/Storage /mnt/Storage (Note: I'm specifically not using -a, because I want to change the ownership to the local account, so as not to freak out SELinux.)

View 5 Replies View Related

General :: Cp Adds Exclamation Points When Copying Very Large Text Files?

Jul 13, 2009

For my research I have some very large files that are basically millions of lines of ten columns of numbers. These files can be up to 5 GB in size. Recently I noticed that when I made a copy of one of my files, some exclamation points appeared in it where there should not be any: in front of random numbers throughout the file. Making another copy of the file would result in exclamation points in front of different numbers in different parts of the file. Doing this many times has given me up to four exclamation points in different parts of the file. Sometimes the file copies just fine without producing any extraneous exclamation points. Additionally, I have occasionally seen a "^K" where there should be a newline (the data that should have been on the next line was instead on the previous line with a ^K in front of it) in copies that I have made of my files. I don't know if this is related or not.
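
A hedged way to confirm what is happening is to compare each copy against the original (file names are placeholders):

Code:
# byte-for-byte comparison; reports the offset of the first difference
cmp original.dat copy.dat
# or hash both and compare the digests
md5sum original.dat copy.dat

Worth noting: ^K is byte 0x0B and newline is 0x0A, one flipped bit apart, and "!" (0x21) is likewise one bit away from a space (0x20); intermittent single-bit differences that move around between copies can be a classic sign of failing RAM rather than a cp bug.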

View 7 Replies View Related

General :: Generate Random List And Determine Size Of Arbitrary Block Of Files In Dir?

Mar 4, 2010

I want to generate a temporary random list from a directory of files and then determine the size of an arbitrary block of files from this list (say 1-25 or 26-50) and add their names to a file along with some other info for each name. I can generate a random list with file sizes like this: ls -l | sort -R | cut -d " " -f 6, but I'm not sure how to add up the sizes of just a certain block of these files and at the same time save the file names.
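
A hedged sketch (GNU find and shuf assumed; the 1-25 block and output names are placeholders): emit size-and-name pairs in random order, keep one block of them, and total that block's sizes:

Code:
find . -maxdepth 1 -type f -printf '%s\t%f\n' | shuf > randomlist.txt
# save lines 1-25 with their sizes, and sum the size column
sed -n '1,25p' randomlist.txt | tee block01.txt | awk -F'\t' '{ sum += $1 } END { print sum, "bytes" }'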

View 2 Replies View Related

General :: Creating Files In Red Hat ?

Feb 14, 2011

How do I create files in Red Hat Linux?
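
The question is broad, but a few common ways from the shell (file names are placeholders):

Code:
touch empty.txt               # create an empty file
echo "one line" > notes.txt   # create a file containing some text
cat > input.txt               # type content, finish with Ctrl-D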

View 1 Replies View Related

Ubuntu :: Command With The -r Option To Compare A Large Number Of Files And Files In Subdirectories

Jun 16, 2011

I am using the diff command with the -r option to compare a large number of files and files in subdirectories. My main interest is to find out which files have been changed, not what the actual changes are, and since a lot of files have been changed, it would be a lot easier to view the file names only. Is there an option for diff that might do this, or does there exist a similar tool/command that could do the job?
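
diff has exactly this: the -q (--brief) flag reports only which files differ, without showing the changes (directory names are placeholders):

Code:
# recursively list changed files without showing the changes themselves
diff -rq old-tree/ new-tree/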

View 1 Replies View Related

OpenSUSE :: Dolphin Losing Files When Copying Many Files Or Large Folders?

Feb 14, 2010

I've discovered that Dolphin seems to lose random files when copying many large folders.

I first noticed this a few months ago when I tried to copy my music library from one folder to another on the same HDD. It consisted of around 600 folders and 6500 files. During the copy there were no errors but after the copy I found that some of the newly copied folders were missing files. I put it down to human error or a glitch.

Yesterday I tried to copy 13 folders containing rips of some of my DVDs. Each folder basically had one film of either 700MB or 1.4GB. Again no errors showed up during the copy but I found 3 of the newly copied folders were empty.

It's not so critical with music or films but I can't afford to lose work data like this.

Has anyone experienced or seen a similar problem with Dolphin? I'm going to have to do some more extensive testing but this is not good.

The first time I noticed the problem I was running KDE 4.3.4 (I think), and the latest occurrence was with KDE 4.4.0.

View 9 Replies View Related

Software :: Differentiate Two Large Text Files Using Shell Script / Files Are Like Below?

Jan 20, 2009

I want to automate this using a script. How do I automate it?

File1:
s.no# 1 name:aaaaaa
city:abcd

[code]...
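
A hedged sketch, assuming the goal is to list the records present in one file but not the other; comm needs sorted input, and the file names follow the sample above:

Code:
sort File1 > File1.sorted
sort File2 > File2.sorted
# -3 suppresses lines common to both, leaving only the differences
comm -3 File1.sorted File2.sorted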

View 1 Replies View Related

General :: Creating Multi Volume Tar Files?

May 29, 2010

I have a full snapshot of the Arch Linux repo for x86_64. I want to use this as my restore backup should I need to reinstall Arch without network support. How do I build several tar volumes of my /mount/my_repo to fit onto 4.5 GB DVDs? The thing is 18 GB in size. And how do I extract all the tar volumes created to a folder later on? Is it the same as extracting a single tar? Will tar find all volumes in the same directory level so as to continue extracting, or do I need to merge them in some way?
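
One hedged approach sidesteps tar's native multi-volume mode (-M) entirely: write a single tar stream, cut it into DVD-sized pieces with split, and at restore time rejoin the pieces with cat, as long as all the parts sit in the same directory:

Code:
# ~4.4 GiB pieces named repo.tar.part_aa, repo.tar.part_ab, ...
tar -cf - /mount/my_repo | split -b 4480m - repo.tar.part_
# restore: concatenate the parts in order and untar the stream
cat repo.tar.part_* | tar -xf - -C /restore/folder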

View 1 Replies View Related

General :: Concatenating 2 Files Without Creating Newline Between Them?

Dec 23, 2010

How is it possible to concatenate two files without creating a newline between them? I've tried the following:
Code:
echo 123 > file1
echo 456 > file2
cat file1 file2 > file3

[Code]....
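
The newline is not added by cat: echo itself appends a trailing newline to file1, and that is what lands between the two. A hedged workaround (assuming file1 is a single line, since tr strips every newline it sees):

Code:
{ tr -d '\n' < file1; cat file2; } > file3
# or write file1 without a trailing newline in the first place:
printf '%s' 123 > file1
cat file1 file2 > file3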

View 8 Replies View Related

General :: Download Helper Creating Zero Size Files

Mar 30, 2011

I've got a small problem with dwhelper. Now when I'm trying to download anything from ....., DH doesn't download anything and creates only empty files.

View 1 Replies View Related

General :: Sendmail - Creating Queue Files Directly?

Jul 27, 2010

I am directly creating "qf" and "df" files in the sendmail queue folder and then processing the queue from the command line. This is the only way to export data and email from this old application I am stuck with. It works quite well in my test environment, but I am really new to Linux/sendmail and am just looking for feedback on this process. Is this direct creation of queue files safe, and are there any pitfalls I should be aware of?
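
For reference, a hedged sketch of the queue-processing step once a qf/df pair is in place, using standard sendmail flags:

Code:
# run the queue once, verbosely
sendmail -q -v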

View 2 Replies View Related

General :: Creating A Script That Removes The Oldest Files On /tmp Directory?

Nov 17, 2010

I'm trying to configure a script that deletes files and directories older than 10 weeks in my /tmp directory. This is what I'm trying to do:

Quote:

#!/bin/bash
# Script for deleting files and PDFs from /tmp
### Directories/PDFs to delete

[code]....
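
A hedged one-liner for the same job: 10 weeks is 70 days, and find can select and delete anything not modified since then (-mindepth 1 protects /tmp itself; directories are removed only once everything inside them has gone too):

Code:
find /tmp -mindepth 1 -mtime +70 -delete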

View 6 Replies View Related

General :: Creating Watch On One Directory In Which Files Are Continuously Coming?

May 4, 2010

Actually, I am creating a watch on one directory into which files are continuously arriving. Is there any command which can give a listing of all files that have come in during the last 24 hours?
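
Two hedged options, with the directory path a placeholder:

Code:
# files modified less than one day ago
find /path/to/dir -type f -mtime -1
# or report new files live as they arrive (inotifywait is in inotify-tools)
inotifywait -m -e create /path/to/dir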

View 8 Replies View Related

General :: Why Scanning 10 Pages Is Always Creating Huge >45Mb To 110Mb PDF Files

Jan 9, 2011

Why is my scanning always creating huge 50 MB to 100 MB PDF files? Each A4 PNM file is about 6.5 MB at a resolution of 150. If I decrease the resolution below 100, my text starts to become unreadable.
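
PNM pages are uncompressed bitmaps, which is why a PDF assembled straight from them balloons. A hedged sketch with ImageMagick recompresses the pages before building the PDF (JPEG quality 75 is an assumption to tune against readability):

Code:
convert page*.pnm -compress jpeg -quality 75 scan.pdf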

View 11 Replies View Related

General :: Transfer Large Number Of Files Host To Host

Oct 20, 2010

I have two servers: one has an empty /, and the other has a subdirectory holding a large amount of data (4 GB) in many, many files. I need a way to transfer the files en masse from the server with the large number of files to the one that is essentially blank. I don't have space on the source host to simply gzip all the files. I've googled this and see that there may be some combination of tar and/or gzip that will let me do this with some sort of redirection.

I really need an example line of how this can be accomplished. If my explanation seems rather sparse, I can supply more details.
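
A hedged example line: tar writes the archive to stdout, it is compressed in flight, and the receiving side unpacks the stream, so nothing is ever staged on either disk (host and paths are placeholders):

Code:
tar -czf - -C /path/to/subdirectory . | ssh user@emptyhost 'tar -xzf - -C /destination'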

View 3 Replies View Related

OpenSUSE :: Burn Large Files With K3b Udf

Nov 3, 2010

The problem is with files greater than 4 GB onto dual-layer or BD disks: mkisofs crashes. I believe the problem is that such files require a UDF filesystem, and the disk does not get written to. As of yesterday, the version is openSUSE 11.3 patched to date. k3b writes standard DVD disks OK. I am seeing a lot of searches saying this is because of cdrkit rather than cdrtools; can I replace this easily? Is this the case? I tried setting the k3b options to burn UDF, with the same error. Other searches show k3b has fixed problems with these issues, so it appears to point to underlying mkisofs stuff. The full error log is in yesterday's post if it helps. I will also try on Ubuntu to see if it works there.
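
As a hedged cross-check from the command line, growisofs (from dvd+rw-tools) can burn directly, passing mkisofs-style options that permit files over 4 GB (the device and path are placeholders):

Code:
growisofs -Z /dev/dvd -iso-level 3 -udf -R -J /path/to/bigfiles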

View 1 Replies View Related






