General :: Uploading And Downloading Large Files Between Clients?

Jun 5, 2011

I am looking for a file sharing program to install on my dedicated server that will allow me to upload large MP3 files and allow my clients to download them. These files are recordings of counseling sessions for families who are seeking help for their children.

What I am looking for is similar to the system this company uses [URL].
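If a full file-sharing application turns out to be more than is needed, one low-tech possibility, sketched here purely as an illustration and assuming Apache is already running on the dedicated server (paths follow a Debian/Ubuntu-style layout; directory and account names are made up), is a password-protected download directory using HTTP basic authentication:

# create a directory for the recordings under the web root
sudo mkdir -p /var/www/recordings

# create a password file with one client account (add more later without -c)
sudo htpasswd -c /etc/apache2/recordings.htpasswd clientname

# /var/www/recordings/.htaccess - require a login before anything can be downloaded
AuthType Basic
AuthName "Session Recordings"
AuthUserFile /etc/apache2/recordings.htpasswd
Require valid-user

This assumes AllowOverride AuthConfig is enabled for that directory; a dedicated web file-sharing application would add per-client folders and upload forms on top of something like this.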

View 4 Replies



General :: Downloading Very Large Files Via SFTP

Apr 1, 2011

I need to download some very large files (circa 75 GB) from a remote server via SFTP. I've been using SFTP via the command line on my Linux netbook. Around halfway through, the transfer stops and says "stalled." Can anybody recommend a reliable way to download these files?
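One approach worth trying, sketched here on the assumption that plain SSH access to the server is available (host and paths are placeholders), is rsync over SSH with resume support, so a stalled transfer can be restarted without starting over:

# --partial keeps partially transferred files so a rerun resumes them,
# --progress shows transfer status, -avz enables archive mode and compression
rsync -avz --partial --progress user@remote.example.com:/path/to/bigfile.iso /local/dir/

# after a stall, simply rerun the same command; only the missing data is fetched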

View 1 Replies View Related

General :: Uploading Files To S3 Account From Command Prompt?

May 6, 2011

I've got several large files sitting in my Linux hosted account that I need to upload to my S3 account. I don't want to download them first and then upload them into S3. Is there any way I can "upload" them via the Linux command-line environment? Or access it via a website that works with lynx?
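Yes; one commonly used option is the s3cmd command-line tool. A rough sketch, with the bucket name and paths as placeholders:

# install s3cmd (Debian/Ubuntu-style package name; it can also be installed
# from source into the home directory if root access is not available)
sudo apt-get install s3cmd

# store the AWS access key and secret key in ~/.s3cfg
s3cmd --configure

# upload a single large file straight from the hosting account to S3
s3cmd put /home/user/bigfile.tar.gz s3://my-bucket-name/bigfile.tar.gz

# or upload a whole directory recursively
s3cmd put --recursive /home/user/uploads/ s3://my-bucket-name/uploads/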

View 1 Replies View Related

General :: Uploading Files To A CentOS-server With Vsftpd

Sep 16, 2010

I'm having difficulties uploading files to a CentOS server with vsftpd. I have the exact same configuration on a Fedora 10 machine, where I have no problems...

[Code]...
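For comparison, a minimal sketch of the vsftpd.conf directives that usually have to be present before uploads by local users will work (the real configuration is elided above, so treat this as a checklist rather than the actual fix):

# /etc/vsftpd/vsftpd.conf - settings relevant to uploads
# allow local Unix users to log in
local_enable=YES
# allow FTP write commands (STOR, DELE, ...)
write_enable=YES
# permissions mask applied to uploaded files
local_umask=022
# optional: keep users inside their home directory
chroot_local_user=YES

On CentOS it is also worth checking SELinux booleans such as ftp_home_dir (getsebool -a | grep ftp), since an otherwise identical configuration can behave differently from Fedora because of the SELinux policy.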

View 2 Replies View Related

Ubuntu One :: Uploading Files/sync Files Doesn't Work

Aug 23, 2010

OS: Ubuntu 10.04 LTS running on the latest Oracle VirtualBox. This works: I have opened an Ubuntu One account and I can log into it, but only by clicking 'Ubuntu One' in the top bar and then 'Account' in the prompt that appears. Shouldn't login happen automatically? Once logged in, I am able to make new folders and apparently able to enter them (they are empty, I guess). If I try to upload a file by clicking 'Upload file' in Ubuntu One, a prompt appears, I choose the file to upload and click 'Continue'. The prompt says "uploading" but nothing happens. If I right-click a document folder and then click 'Sync with Ubuntu One', it reports that it is syncing the folder, but nothing happens.
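For debugging, the sync daemon that ships with the Ubuntu One client on 10.04 can be inspected from a terminal with u1sdtool; a rough sketch:

# show whether the sync daemon is connected and what state it is in
u1sdtool --status

# force a connection attempt
u1sdtool --connect

# list transfers that are currently in progress
u1sdtool --current-transfers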

View 2 Replies View Related

General :: Creating Random Large Files?

Aug 27, 2010

How can I create a random 1 GB file in bash to test disk / network I/O? I was told I could use the 'dd' command, but I don't know whether there are better ways, or what the 'dd' command would look like.
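Two common variants, with arbitrary file names; /dev/urandom gives incompressible random data, while /dev/zero is much faster to generate but compresses to almost nothing:

# 1 GB of random data: 1024 blocks of 1 MB each
dd if=/dev/urandom of=testfile.bin bs=1M count=1024

# 1 GB of zeros (fast, but not useful for testing anything that compresses)
dd if=/dev/zero of=testfile.bin bs=1M count=1024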

View 7 Replies View Related

General :: Tar Not Working With Large Number Of Files?

Dec 6, 2010

In the middle of my script I am using the tar command to archive some 1000 images, each around 5 MB in size.

All the images are provided as arguments: tar -cvf file.tar <all images as arguments>

But the resulting file.tar does not contain all the images.
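If the problem is the shell's argument-list limit (or the way the argument list is built), one workaround is to let tar read the file names from a list instead of the command line; a sketch, assuming the images live under one directory:

# build the list of images without putting them on the command line
find /path/to/images -name '*.jpg' -print > imagelist.txt

# -T (--files-from) makes tar read the names from the list file
tar -cvf file.tar -T imagelist.txt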

View 4 Replies View Related

General :: Downloading Files With Same Directory Structure

Apr 8, 2010

How do I download all the files from here: [URL]. I am on FreeBSD 7.0 and I tried wget with the -r switch, and it gives me URLs only. Maybe this is simply not an FTP site, I don't know. How can I download all those files while keeping the same directory structure?
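For an FTP listing, wget can usually mirror the whole tree while preserving the directory layout; a sketch with a placeholder URL (if the server only speaks HTTP, options like -np matter more):

# mirror the remote directory, keeping the directory structure,
# without climbing up to the parent directory or creating a host-name directory
wget -r -np -nH ftp://ftp.example.com/pub/somedir/

# for FTP, -m (mirror) is a handy shorthand for -r -N -l inf
wget -m ftp://ftp.example.com/pub/somedir/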

View 5 Replies View Related

Ubuntu :: Uploading Music Files From MTP?

Nov 1, 2010

How does one transfer music from an MTP device to the Music folder? Or upload music from the MTP device into programs such as RhythmBox, Banshee, etc.? I have tried using Gnomad2 to no avail, and I have exhausted myself searching the online Ubuntu community forums and search engines on this subject, reading through countless articles from users trying to get their Linux systems to recognize or mount their MTP devices.

My system does recognize and mount my MTP device (Creative Zen Micro), and the music players (RhythmBox, Banshee, Amarok, VLC, XBMC) all access and play the music without any problems. I just want to transfer or copy the music from my MTP device to my system.
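If the graphical players do not offer a copy-to-disk option, the command-line tools from the mtp-tools package (part of libmtp) may work; a rough sketch, where the numeric ID is whatever mtp-files reports for the track:

sudo apt-get install mtp-tools

# confirm the Zen Micro is detected over MTP
mtp-detect

# list the tracks/files on the device along with their numeric IDs
mtp-files

# copy one file from the device to the local Music folder
mtp-getfile 1234 ~/Music/song.mp3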

View 5 Replies View Related

General :: Copying Large Number Of Files From One Directory To Another

Feb 10, 2010

I have a directory containing around 2.8 lakh (280,000) files. I want to move them to another directory. If I use cp or mv I get the error 'argument list too long'. If I write a script like

# note: the $(ls ...) here is what the question is about; it expands every name at once
for file in $(ls "$source"); do
    cp "$source/$file" "$destination"/
done

then, because of the ls command, its performance degrades. How can I do this efficiently?
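Two alternatives that avoid both the argument-list limit and the ls call, sketched with placeholder paths:

# feed the names to mv through xargs; -t (GNU mv) appends many files per invocation,
# and -print0/-0 copes with unusual file names
find /path/to/source -maxdepth 1 -type f -print0 | xargs -0 mv -t /path/to/destination/

# cp -t works the same way if a copy rather than a move is wanted
find /path/to/source -maxdepth 1 -type f -print0 | xargs -0 cp -t /path/to/destination/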

View 7 Replies View Related

General :: Transferring Large Files Using Scp With CPU And Memory Considerations?

Oct 5, 2010

I want to transfer an arbitrarily large file (say >20GB) between 2 servers. I have several considerations:

Must use port 22 (ssh) because of firewall restrictions
Cannot tax the CPU (production server)
Memory efficiency
Would prefer a checksum check but that could be done manually
Time is not of the essence

Two scenarios: Server A and Server B are on the same private network (sharing a switch), in which case data security is not a concern; or Server A and Server B are not on the same network and the transfer goes over the public internet, in which case data security is a concern. My first thought was using nice on an scp command with a non-CPU-intensive cipher (blowfish?), but I thought I'd refer to the community for recommendations.
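A sketch of the kind of command hinted at above; the cipher name depends on the OpenSSH version installed, and the host names and paths are placeholders:

# low scheduler priority, a cheap cipher, and a bandwidth cap (-l is in Kbit/s)
nice -n 19 scp -c blowfish-cbc -l 20000 /data/bigfile.img user@serverB:/data/

# rsync over ssh is an alternative: --partial lets an interrupted copy resume
nice -n 19 rsync -avz --partial -e "ssh -c blowfish-cbc" /data/bigfile.img user@serverB:/data/

# verify afterwards by comparing checksums on both ends
md5sum /data/bigfile.img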

View 1 Replies View Related

General :: Copying Large Number Of Files In Windows?

Mar 15, 2011

I am facing a problem copying a large number of files, 18 lakh (1,800,000), from my personal hard disk to another hard disk. Each file is very small and the size of the folder is around 3.95 GB. Copying the files using the copy provided by Windows is frustrating, and I am not even able to compress them; it gives me an error that the data is not readable. The other problem is that I am not able to open this drive in Linux: it shows an error telling me to run chkdsk in Windows, and Windows disk check is also not able to repair the drive and goes into some unsolvable state. Is there any way to open a disk with this error, in Windows or otherwise, and if not, any way I can copy the data faster? ERROR: Disk labeled EDU is corrupt, go to Windows and run chkdsk /f there and reboot into Windows 2 times.

View 3 Replies View Related

General :: Share A Large Number Of Files Into Chroot Env?

Aug 17, 2010

I understand that chroot is usually used to provide security, however, for my issue, security is a big don't care. I am very new to using chroot and don't fully understand how the chroot'd env works.

problem: Trying to use a vendor-supplied cross-compile environment. The environment runs as a chroot'd env and works just fine. I have a large number of additional modules that I wish to compile in the chroot'd environment. FYI, these modules are also (successfully) compiled for other targets not using chroot'd envs. Copying the source files into the chroot environment is not an option (I don't have hours to wait for copies to finish, and it would break the make system). Having them live in the environment is also not an option (the chroot build is a tiny part of the build process and we cannot revamp our entire source tree to accommodate it).

I am looking for a way to have the compiler in the chroot'd env have access to a path that is outside of the env and typically higher up in the same path that holds the chroot'd env. I have tried soft links (they don't work as expected). Hard links only work for single files and there are 10's of thousands of files that would need to be linked. I am not sure how I would go about exporting the additional files and then mounting the exported files in the chroot'd env (or if that would even work).
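The usual way to expose an existing directory tree inside a chroot without copying or linking anything is a bind mount; a sketch with placeholder paths:

# make the mount point inside the chroot
sudo mkdir -p /path/to/chroot/home/build/src

# bind-mount the real source tree into the chroot (no data is copied)
sudo mount --bind /path/to/real/src /path/to/chroot/home/build/src

# inside the chroot, /home/build/src now shows the same files

# undo it when the build is finished
sudo umount /path/to/chroot/home/build/src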

View 2 Replies View Related

General :: Searching Text Files For Large Numbers?

Dec 31, 2010

I am looking for a way to search for large numbers in text files and print the nearby lines.

For example if I had a text file like:

Event: 11
blah: 3
blah: 41 bleh: 19
Event: 2
blah: 31

[Code].....
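Without seeing the elided part of the question, one guess at what is wanted: print any line containing a large number, plus the lines around it. A sketch against a placeholder file name:

# print lines containing a number of 2 or more digits, with 1 line of context either side
grep -E -B1 -A1 '\b[0-9]{2,}\b' events.txt

# or, with awk, flag lines where the value after "blah:" exceeds a threshold
awk -F': ' '/blah:/ && $2+0 > 30 {print FNR": "$0}' events.txt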

View 1 Replies View Related

Ubuntu :: Can't Select Multiple Files For Uploading

Feb 13, 2010

I pushed the browse button and selected the first picture, but I was unable to select all of the pictures.
This happened again when I attempted to do the same thing with an email attachment.

In the gnome environment, I am able to select the first then hold down shift, and select the last pic and all the ones in between will be selected. I expected this behavior when selecting for upload, but it didn't happen. I had to select each one, one at a time and upload each one.

I tried making a folder to upload the whole folder and that would not select at all.

View 8 Replies View Related

Ubuntu Servers :: Uploading Files To Apache?

Dec 23, 2010

I have set up a very basic Apache server to host my own website (I have not set up SQL, a database, or PHP yet) and I am trying to find out how to FTP or copy my website over. I am creating the site in Windows and need to know how to transfer it to the server, preferably into the /var/www folder directly.
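Since the site is built on Windows, one straightforward route is SCP/SFTP rather than FTP: WinSCP gives a graphical drag-and-drop client, and pscp from the PuTTY suite works from the Windows command prompt. A sketch with placeholder names:

# run in a Windows command prompt, from the folder above the site directory
pscp -r mysite username@server.example.com:/var/www/

# on the server, make sure the login account may write to /var/www first, e.g.
sudo chown -R username /var/www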

View 7 Replies View Related

Ubuntu Servers :: Uploading Files / Folders Through SSH?

Sep 1, 2011

I am done setting up an ubuntu server on Amazon Cloud Service.

I connect to it via SSH.

I would like to upload my website into the /var/www folder through SSH. I would like to upload a complete folder's worth.
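A sketch of how this is usually done with scp or rsync, assuming the same .pem key used for the SSH login; the host name is a placeholder:

# -r copies the folder recursively; -i points at the EC2 key pair
scp -i mykey.pem -r ./mysite ubuntu@ec2-xx-xx-xx-xx.compute.amazonaws.com:/var/www/

# rsync does the same but only re-sends changed files on later uploads
rsync -avz -e "ssh -i mykey.pem" ./mysite/ ubuntu@ec2-xx-xx-xx-xx.compute.amazonaws.com:/var/www/

# if permission is denied, upload into the home directory first and then
# move the files into /var/www with sudo on the server, or chown /var/www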

View 5 Replies View Related

Software :: Multiple Files Uploading By Rsync?

Jun 16, 2010

I wrote a script which should help me upload multiple files, but I have a problem: I want the file uploads to run in parallel, not one after another. Here is my code:

relVersion=1.1a        # release version used in the upload path
path="path"            # destination path (placeholder)
pack=$(ls *.zip)       # the zip files to be uploaded

[code]...
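The rest of the script is elided, but one common pattern for running the uploads in parallel is to background each rsync and then wait for them all; a sketch reusing the variables above (the remote user@host is a placeholder):

# start one rsync per zip file in the background
for f in *.zip; do
    rsync -avz --progress "$f" "user@host:$path/$relVersion/" &
done

# wait until every background upload has finished before continuing
wait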

View 3 Replies View Related

Ubuntu One :: Connect Activity - Uploading Folders Instead Of Files?

Jan 30, 2010

Whenever I started up, a window used to open asking me for my password. No problem. One day I clicked cancel and it has never come up again. Now when I start up, the Ubuntu One logo switches to 'updating folders' for a bit and then switches back to the default, but with a little red x on it. I haven't been able to get it to connect. I can place files into the folder on my computer, but they don't show up in my Ubuntu One account folder either. Also, I heard that soon we'd be able to upload folders instead of file by file; is this true?

View 1 Replies View Related

Ubuntu :: Uploading Files Completely Freezes FIrefox

Jun 6, 2010

Firefox grays out and doesn't unfreeze until the entire upload is complete. I can't do anything else on the internet while this is happening. This didn't bother me until now, because I need to upload some pretty high quality pictures to Flickr, and it takes about fifteen or twenty minutes to do so. I'd like it if the pictures could simply upload in the background while I do other stuff, instead of completely freezing my internet usage in the process. I don't have a problem downloading files quietly in the background, not even large files, so why can't I upload files without freezing everything up? Is this normal, or is there a fix for this?

View 5 Replies View Related

General :: NAS - Most Effective Backup Software When Dealing With Large Numbers Of Files?

Jul 18, 2010

I have two NASes. I work off of one, and the other is used as a backup. As I have it set up now, it's slow: running a backup takes a week. Even for 7 TB, with 1,979,407 files, this seems a bit outlandish, particularly as both systems are RAID-5 and the network is all gigabit. I've been digging about in the rsync man pages, and I really don't understand what differentiates the various topologies. Right now, all the processing is being done on the backup NAS, which has the main volume from the main NAS mounted locally over SMB. I suspect that the SMB overhead is killing me, particularly when dealing with lots of files.

I think what I need is to set up rsync on the main NAS as a daemon, and then run a local rsync client to connect to it, which would hopefully allow me to completely avoid the whole SMB-in-the-middle affair, but aside from mentioning that it's there, I can find very little information on why one would want to use the daemon mode for rsync.

Here's my current rsync command line: rsync -r --progress --delete /cifs/Thecus/ /mnt/Storage/input? Is there a better way or tool to do this? Edit: OK, to address the additional questions: The "main" NAS is a Thecus N7700. I have additional modules installed that give me SSH, and it has rsync, but it's not in the $PATH, and I haven't figured out how to edit the local $PATH in a way that persists between reboots. The "backup" NAS is a DIY affair, built around a 1.6 GHz Via motherboard with an Adaptec hardware RAID card. It's running CentOS 5 with a full desktop environment. It's the hardware I'm running rsync from. (Gigabit is through an additional PCI card.)

Further edit: OK, got rsync over SSH working (thanks, lajuette!). I had to do a bit of tweaking on my command line; I'm running rsync with the args: rsync -rum --inplace --progress --delete --rsync-path=/opt/bin/rsync sys@10.1.1.10:/raid/data/Storage /mnt/Storage (Note: I'm specifically not using -a, because I want to change the ownership to the local account, so as not to freak out SELinux.)
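For reference, a rough sketch of what rsync daemon mode could look like on the main NAS (module name and settings are made up for illustration); the point is that the backup box then speaks the rsync protocol directly instead of walking an SMB mount:

# /etc/rsyncd.conf on the main (Thecus) NAS
uid = root
gid = root
use chroot = yes

[storage]
    path = /raid/data/Storage
    read only = yes

# start the daemon on the NAS
/opt/bin/rsync --daemon

# on the backup NAS, pull directly from the daemon (note the rsync:// syntax)
rsync -av --delete rsync://10.1.1.10/storage/ /mnt/Storage/

Since the thread ended up using rsync over SSH, the daemon route mainly matters when an SSH login is unavailable or the SSH encryption overhead is itself a problem.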

View 5 Replies View Related

General :: Cp Adds Exclamation Points When Copying Very Large Text Files?

Jul 13, 2009

For my research I have some very large files that are basically millions of lines of ten columns of numbers. These files can be up to 5 GB in size. Recently I noticed that when I made a copy of one of my files, some exclamation points appeared in it where there should not be any: in front of random numbers throughout the file. Making another copy of the file would result in exclamation points in front of different numbers in different parts of the file. Doing this many times has given me up to four exclamation points in different parts of the file. Sometimes the file copies just fine without producing any extraneous exclamation points. Additionally, I have occasionally seen a "^K" where there should be a newline (the data that should have been on the next line was instead on the previous line with a ^K in front of it) in copies that I have made of my files. I don't know if this is related or not.
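To confirm whether the copies really differ from the originals on disk (rather than it being a display artifact in the viewer), a quick check with placeholder file names:

# checksums differ if and only if the file contents differ
md5sum original.dat copy.dat

# cmp reports the offset of the first byte at which the two files differ
cmp original.dat copy.dat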

View 7 Replies View Related

General :: Internet Explorer Not Downloading Files From Linux Server - How To Work Around This?

Jan 27, 2010

I have a very simple PHP web application deployed on a Linux (CentOS 4) machine. It creates a file and stores it in the /tmp folder on my Linux machine. The path for this file is specified in the href attribute of the link. Ideally, when we click this link the download manager should pop up so that the file can be downloaded on the client machine.

When I access this website remotely from my Windows XP machine in Firefox it downloads the file properly, but when I use Internet Explorer (I have IE7 on my Windows XP) and click the link, the download manager doesn't pop up. Even when I right-click the link and select Save As, an error message pops up saying "file path not found". Possibly IE is not able to determine the Linux file path, so how do I work around this? Is there some specific way of specifying the Linux file paths to be downloaded by IE?

View 7 Replies View Related

Ubuntu One :: Uploading Files / Folder Sync Doesn't Work?

Aug 29, 2010

OS: Ubuntu 10.04 LTS, running on the latest Oracle VirtualBox version.

This works:

Established an Ubuntu One account. Able to log into Ubuntu One. Able to make folders.

This doesn't work:

When uploading files, it says "uploading" but never finishes, and the files do not get uploaded. Syncing a folder between the desktop and Ubuntu One never works either.

View 2 Replies View Related

OpenSUSE Network :: Uploading Files To Remote FTP Server Using Passive Mode With SuSEfirewall2 Enabled

Feb 28, 2010

I've got problems uploading files to a remote FTP server using passive mode with SuSEfirewall2 enabled (using openSUSE 11.2).

I disabled the firewall and everything went OK. Why does the firewall block or terminate OUTGOING connections to a remote FTP server?

View 7 Replies View Related

Ubuntu :: Diff Command With The -r Option To Compare A Large Number Of Files And Files In Subdirectories

Jun 16, 2011

I am using the diff command with the -r option to compare a large number of files, including files in subdirectories. My main interest is to find out which files have been changed, not what the actual changes are, and since a lot of files have been changed, it would be a lot easier to view the file names only. Is there an option for diff that might do this, or does there exist a similar tool/command that could do the job?
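diff can do this itself; a sketch with placeholder directory names:

# -q / --brief reports only which files differ (or exist on one side only)
diff -rq old_dir new_dir

# rsync can produce a similar list without changing anything (-n = dry run, -c = checksum)
rsync -rcn --out-format='%n' old_dir/ new_dir/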

View 1 Replies View Related

OpenSUSE :: Dolphin Losing Files When Copying Many Files Or Large Folders?

Feb 14, 2010

I've discovered that Dolphin seems to lose random files when copying many large folders.

I first noticed this a few months ago when I tried to copy my music library from one folder to another on the same HDD. It consisted of around 600 folders and 6500 files. During the copy there were no errors but after the copy I found that some of the newly copied folders were missing files. I put it down to human error or a glitch.

Yesterday I tried to copy 13 folders containing rips of some of my DVDs. Each folder basically had one film of either 700MB or 1.4GB. Again no errors showed up during the copy but I found 3 of the newly copied folders were empty.

It's not so critical with music or films but I can't afford to lose work data like this.

Has anyone experienced or seen a similar problem with Dolphin? I'm going to have to do some more extensive testing but this is not good.

The first time I noticed the problem I was running KDE 4.3.4 (I think), and the latest occurrence was with KDE 4.4.0.

View 9 Replies View Related

Software :: Differentiate Two Large Text Files Using Shell Script / Files Are Like Below?

Jan 20, 2009

I want to automate this using a script. How can I automate it?

File1:
s.no# 1 name:aaaaaa
city:abcd

[code]...
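The rest of the files is elided, but if the goal is simply to list records that differ between the two files, a starting point might be:

# show lines that are not common to both files (comm needs sorted input)
sort File1 > File1.sorted
sort File2 > File2.sorted
comm -3 File1.sorted File2.sorted

# or a plain unified diff if the record order is the same in both files
diff -u File1 File2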

View 1 Replies View Related

General :: Blocking Downloading Of Mp3 Files In Squid - What Is The Purpose Of The "$" At The End?

Apr 23, 2011

I have set this ACL rule in Squid to block downloading of .mp3 files:

acl FILE_MP3 urlpath_regex -i .mp3$
http_access deny FILE_MP3

But I don't understand the purpose of the "\" in "\.mp3$"; even without it I am able to block downloading of mp3 files. And what is the purpose of the "$" at the end?
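For what it's worth, here is a commented version of the same rule, based on standard Squid regex-ACL syntax (a sketch, not a tested configuration):

# urlpath_regex matches the pattern against the path part of the URL.
# "\." matches a literal dot; an unescaped "." matches any single character,
# which is why the rule still appears to work without the backslash.
# "$" anchors the pattern to the end of the path, so "song.mp3" is blocked
# but "song.mp3.txt" is not. "-i" makes the match case-insensitive.
acl FILE_MP3 urlpath_regex -i \.mp3$
http_access deny FILE_MP3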

View 4 Replies View Related

General :: Transfer Large Number Of Files Host To Host

Oct 20, 2010

I have two servers: one has an empty /, and the other has a subdirectory containing a large number of files (about 4 GB worth of many, many files). I need a way to transfer the files en masse from the server with the large number of files to the one that is essentially blank. I don't have space on the used host to simply gzip all the files first. I've googled this and see that there may be some combination of tar and/or gzip that will let me do this with some sort of redirection.

I really need an example line of how this can be accomplished. If my explanation seems rather sparse, I can supply more details.
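The usual trick is to stream a tar archive over SSH so nothing is ever staged on disk; a sketch with placeholder host and paths:

# run on the full server: tar the subdirectory to stdout, compress it in the pipe,
# and untar it on the empty server at the other end of the ssh connection
tar -czf - -C /path/to/parent subdir | ssh user@emptyhost 'tar -xzf - -C /'

# rsync is an alternative that can be restarted if the transfer is interrupted
rsync -avz /path/to/parent/subdir user@emptyhost:/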

View 3 Replies View Related






