Ubuntu Servers :: NCFTPD Error Large File On X64 6.06LTS

Feb 25, 2010

I'm unable to download large files from ftp.

The ftp server is: NCFTPD.

My server is:

Version:
Distributor ID: Ubuntu
Description: Ubuntu 6.06.2 LTS
Release: 6.06
Codename: dapper
Linux morpheus 2.6.15-55-amd64-server #1 SMP Tue Dec 1 18:31:51 UTC 2009 x86_64 GNU/Linux

When I download, it gives a 131 error (unknown).

I tried the same file on a 32-bit server running the same versions of everything, except 32-bit instead of x64.
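A hedged first step is to retry the same transfer with a different FTP client; if every client fails at the same byte count, the limit is probably on the server or filesystem side rather than in the client (the URL below is a placeholder):

Code:
# resume-capable downloads with two independent clients
wget -c ftp://ftp.example.com/pub/bigfile.iso
curl -C - -O ftp://ftp.example.com/pub/bigfile.iso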

View 1 Replies



Server :: NFS Large File Copies Fail - Error Writing To File: Input/output Error?

Jun 27, 2009

I recently upgraded my file/media server to Fedora 11. After doing so, I can no longer copy large files to the server. The files begin to transfer, but typically after about 1 GB of the file has transferred, the transfer stalls and ultimately fails with the message:

"Error writing to file: Input/output error"

I've run out of ideas as to what could cause this problem. I have tried the following:

1. Different NFS versions: NFS3 and NFS4
2. Tried copying the files to different physical drives on the server.
3. Tried copying the files from different physical drives on the client.
4. Tried different rsize and wsize block sizes when mounting the NFS share (a mount sketch follows this list)
5. Tried copying the files via a different protocol. SSH in this case. The file transfers are always successful when I use SSH.
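For item 4, a minimal sketch of the kind of mount that is worth trying (server name and paths are hypothetical):

Code:
# mount the export with explicit transfer sizes over TCP; a hard mount with a
# longer timeout can survive stalls that soft mounts turn into I/O errors
sudo mount -t nfs -o rsize=32768,wsize=32768,proto=tcp,hard,timeo=600 \
    server:/export/media /mnt/media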

Regardless of what I do, the result is the same. The file transfers always fail after approximately 1 GB.

One other note: both the client and the server are running Fedora 11, kernel 2.6.29.5-191.fc11.x86_64.

I am out of ideas. Has anyone else experienced something similar?

View 13 Replies View Related

Ubuntu Servers :: Large File Transfer On LAN?

Nov 11, 2010

I'm trying to create an Ubuntu Server file server that will handle large file transfers (up to 50 GB) from the LAN with Windows clients. We've been using a Windows server on our LAN, and the file transfers will occasionally fail, though that server is used for other services as well.

The files will be up to 50 GB. My thoughts are to create a VLAN (or separate physical switch) to ensure maximum bandwidth. The Ubuntu server will be 64-bit with 4 TB of storage in a RAID 5 config.

View 2 Replies View Related

Ubuntu Servers :: System Crash When Copying Large File

Jun 15, 2010

I am having a bit of a problem with my Ubuntu Server 10.04 install. I think it might be a kernel problem. Basically, what happens is when I copy a large file (a 160GB disk image) to my drive (>60GB) the system consistently crashes after about 60GB of the file is transferred. It doesn't matter if I am sending the file using cifs, or over SSH. Checking syslog (paste dump here), it seems these flush errors always appear shortly before the crash occurs. The destination filesystem is a hardware RAID 10 array with 2TB of space. It is formatted as EXT4.
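If the flush errors point at write-back pressure, one hedged experiment is to shrink the kernel's dirty-page cache so data is flushed in smaller batches during the copy (the values are guesses, not tuned numbers):

Code:
# flush dirty pages earlier and in smaller amounts
sudo sysctl -w vm.dirty_ratio=10
sudo sysctl -w vm.dirty_background_ratio=5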

View 7 Replies View Related

Ubuntu Servers :: Samba Drops Connection When Transferring Large File

Jul 6, 2011

I'm experiencing a connection problem when transferring a large file from Windows 7 (Home Premium) to my Ubuntu 11.04 machine. The transfer starts, but after a couple of seconds the connection drops and all the shares become unavailable. I'm also unable to connect to the server over ssh, and the only thing I can do to restore the connection is to reboot the server. The strange part is that this was never a problem a couple of weeks ago, and I've not done anything to the setup on either machine besides installing security updates.

View 9 Replies View Related

Server :: Large File Size Cause RPC Authentication Error?

Oct 6, 2009

I think I am having a problem due to an NFS server file size limit. Is it possible I am missing a parameter in the RHEL NFS setup to handle large files? I am running an NFS server on a RHEL 5.1 machine, and an HP-UX 11.0 machine NFS-mounts that file system. The HP-UX machine executes a program that resides locally on HP-UX to process a large 35 GB data file that resides on the NFS server. The program on HP-UX can only read/process the first portion of the file until "RPC: Authentication error" is returned multiple times, after which the program prematurely decides that it has reached the end of the file.

I tried recompiling the same program to run on the RHEL 5.1 NFS server to access the 35 GB file locally (on the NFS server instead of on HP-UX), and the program completed successfully, processing the whole file (about 7 hours of processing) with no "RPC: Authentication error." In addition, I have been running the NFS mount with the same machines for quite some time, but not with such large file sizes.
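One possibility worth ruling out, since only the first portion of the file is readable: NFS protocol version 2 cannot address files beyond 2 GB. Forcing version 3 on the HP-UX mount is a hedged sketch (hostname and paths are hypothetical, and HP-UX mount syntax may differ by release):

Code:
# on the HP-UX client: mount the RHEL export forcing NFSv3 over TCP
mount -F nfs -o vers=3,proto=tcp rhelserver:/export/data /mnt/data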

View 3 Replies View Related

Ubuntu :: "Error Splicing File: File Too Large" - Copy To A 500G USB Drive?

Sep 10, 2010

I have seen this three times now. It's an updated Lucid install with ext4, trying to copy to a 500 GB USB drive.

View 3 Replies View Related

Ubuntu :: Error "File Too Large" Copying 7.3gb File To USB Stick

Nov 24, 2010

I am trying to copy a 7.3 GB .iso file to an 8 GB USB stick, and I get the following error when it hits 4.0 GB:

Error while copying "xxxxxx.iso". There was an error copying the file into /media/6262-FDBB. Error splicing file: File too large

The file is to be used by a Windows user, and I'm just trying to do a simple copy, not a burn to USB or anything fancy. Using 10.04.1 LTS, AMD dual core, all latest patches.
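The 4.0 GB boundary is the classic FAT32 single-file limit. A hedged sketch for confirming and working around it (the mount point is taken from the error message; the chunk size is arbitrary):

Code:
# confirm the stick is FAT32 (vfat); FAT32 cannot hold a single file of 4 GB or more
df -T /media/6262-FDBB
# split the ISO into chunks below the limit; on Windows, rejoin with: copy /b
split -b 3900m xxxxxx.iso /media/6262-FDBB/xxxxxx.iso.part.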

View 2 Replies View Related

Ubuntu Installation :: Upgrading The 6.06LTS UltraSparc Server?

Sep 19, 2010

I have a home installation of 6.06 LTS Server on my Sun Ultra2 (UltraSPARC). I love it; everything about it is great. However, I'm starting to watch the clock and realize that I will soon need to upgrade if I want to continue receiving security updates. What is the best way to accomplish this? Can I upgrade directly to the next LTS release (8.04)? Also, there is no GUI installed; it's a headless system. What is the best mechanism for upgrading via the command line?
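A direct 6.06-to-8.04 upgrade was a supported LTS-to-LTS path, and there is a command-line route suited to a headless box. A sketch, assuming the standard upgrader is available for the SPARC port:

Code:
# install the release upgrader, then run the LTS-to-LTS upgrade from the console
sudo apt-get install update-manager-core
sudo do-release-upgrade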

View 1 Replies View Related

Ubuntu Servers :: File Creation Error On Apache2 .cgi File?

Jan 14, 2011

I am using Python as a CGI for a simple game that I'm planning to run on a website. It requires the user to enter his name and age, which is saved in a file newly created in his/her name. However, I'm getting this error:

The above is a description of an error in a Python program, formatted for a Web browser because the 'cgitb' module was enabled. In case you are not reading this in a Web browser, here is the original traceback:

Traceback (most recent call last):
  File "/var/www/webprog.cgi", line 51, in <module>
    main()
  File "/var/www/webprog.cgi", line 44, in main

[Code]...
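Without the full traceback this is a guess, but CGI file-creation failures under /var/www are very often permissions: Apache's CGI runs as www-data, which normally cannot write there. A hedged sketch (the data directory name is hypothetical):

Code:
# give the web server user a dedicated, writable place for the game's files
sudo mkdir -p /var/www/game_data
sudo chown www-data:www-data /var/www/game_data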

View 4 Replies View Related

General :: View A Particular Ten Lines In A Large File That Can't Be Opened In Vi

May 12, 2010

I am using RHEL 5. I have a very large test file which cannot be opened in vi. The file has some 8000 lines, and I need to view the ten lines between 5680 and 5690. How can I view these particular lines in a large file? What command and options do I need to use?
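A minimal sketch with sed, which streams the file rather than loading it all the way vi does (the filename is hypothetical):

Code:
# print lines 5680-5690, then quit so the remaining lines are never read
sed -n '5680,5690p;5690q' bigfile.txt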

View 1 Replies View Related

Ubuntu Servers :: NTP Failing To Sync - Offset Too Large

Jun 11, 2010

I'm having trouble keeping NTP synchronized and able to provide time updates to other devices on the private network. I believe this is because the server's offset is too large, which causes all NTP clients syncing through the server to fail.

I have two setups, one working, one failing, with mostly identical installation and settings.

Setup A, the working setup as I will name it, is Ubuntu Server Edition 8.04 with kernel 2.6.33.1; the ntpd version is 4.2.4p4.

ntpq -p outputs this information:
remote           refid          st t  when poll reach   delay  offset  jitter
==============================================================================
*europium.canoni 193.79.237.14   2 u   925 1024  377   101.851   0.194   0.328
ntp.conf is:

[Code]....
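When the local clock is too far off, ntpd refuses to slew it and downstream clients cannot sync. A hedged sketch of the usual remedy (the reference server is a placeholder):

Code:
# stop ntpd, step the clock once, then restart so the offset starts out small
sudo /etc/init.d/ntp stop
sudo ntpdate -b pool.ntp.org
sudo /etc/init.d/ntp start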

View 1 Replies View Related

Ubuntu Servers :: Samba In Large Scale Environment.

Mar 31, 2011

The Linux Samba server would be part of the Windows domain. What's the best way to add shares for all of the students, assuming there are 200 students per folder? Is there a way to add a wildcard like $(USER) (for whoever is logged in), e.g. //fileserver/students/classof2011/Bill_Gates? Or would a directory path have to be created for each student? Either way is fine. I'm just curious what the proper protocol is for completing that task.
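Samba can do this without one share per student: smb.conf expands %U to the session username, so a single share can map each student into their own directory. A hedged sketch (share name and paths are hypothetical):

Code:
# one share that serves each logged-in student his or her own folder
[students]
    path = /srv/students/classof2011/%U
    valid users = %U
    read only = no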

View 1 Replies View Related

Ubuntu Servers :: No Such File Or Directory Error?

May 11, 2011

rcon@li121-251:~/b3$ ./b3_run.py
: No such file or directory
rcon@li121-251:~/b3$ l;

[Code].....

As you can see up top, it gives me an error when I try to run the file. It is an Ubuntu error and I don't know what is causing it. I've had it before but don't remember how I resolved it.
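That bare ": No such file or directory" with the script name eaten is the classic symptom of a shebang line with Windows (CRLF) line endings: the kernel looks for an interpreter literally named "python\r". A sketch of the usual fix:

Code:
# strip carriage returns from the script, then retry
sed -i 's/\r$//' b3_run.py
./b3_run.py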

View 8 Replies View Related

Ubuntu Servers :: PhpMyAdmin Not Loading Large Mysql Database?

Apr 27, 2010

I am running Ubuntu 9.10 with Apache, phpMyAdmin and MySQL. Normally phpMyAdmin loads properly, but the other day I created a large database of 6,000+ tables and 1.3 GB of data in the entire database. Now phpMyAdmin loads and shows my different databases, but if I try to open the new large database, it just loads a white page with no content. Does anyone know if I can reconfigure my server so it will be able to show parts of the database?
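A blank page from PHP frequently means the script died against memory_limit (or a time limit) while enumerating the 6,000+ tables. A hedged sketch; the php.ini path assumes Ubuntu's Apache PHP, and the value is a guess:

Code:
# raise PHP's memory ceiling, then restart Apache and retry phpMyAdmin
sudo sed -i 's/^memory_limit = .*/memory_limit = 256M/' /etc/php5/apache2/php.ini
sudo /etc/init.d/apache2 restart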

View 2 Replies View Related

Ubuntu Servers :: Mdadm - Corrupt A Large Array Of Files

Sep 1, 2011

I've been using Ubuntu on my fileserver for quite a while now, and I've always really had this problem, but I want to finally address it and get it fixed. At seemingly random points (when my fileserver is under stress - typically while I'm writing lots of data to it), my fileserver will crash. It generally completely crashes, not responding to any further file requests or any of my SSH commands, and must be reset hard (typically by flipping the power switch). After such an occasion, I end up with some corrupted files. It seems to corrupt a large array of files (it's not an isolated issue - for example, it corrupts files that were not being accessed anywhere near the time it crashed, including files that had never been accessed during that period of uptime). The files don't get completely smashed, but they're definitely corrupted (artifacts in images, skips in audio and video files, often complete failure of binary files such as virtual hard drives or disc images).

I'm using Ubuntu Server 11.04, but similar issues to this happened for me in 10.04 LTS (in fact, I upgraded to try to solve them). I'm using mdadm to create an 8-drive raid6 array. The drives are 1.5 TB each, mostly Samsung HD154UI, but with a WD drive in there too (sorry, I can't find the model number at the moment). The hard drives themselves appear to be working fine - SMART reports no issues with any of them, mdadm says they're all up, and I have no reason to believe that the drives are at fault here (although I can conduct further tests if necessary). I've posted about this problem before here and here. In these cases, the issues seemed to be with XFS - in fact, I switched from XFS to ext4 on my RAID array because I simply believed XFS to be unstable. Unfortunately, this issue occurs with ext4 as well, so I'm fairly certain it's an mdadm issue. Here is the output of "cat /proc/mdstat", for those interested:

[Code]....
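One low-risk diagnostic is to have md verify the whole array and report mismatches; corruption spread across files that were never touched is more consistent with a parity/consistency problem than with a single bad disk. A sketch, assuming the array is md0:

Code:
# kick off a full consistency check of the RAID6 array
echo check | sudo tee /sys/block/md0/md/sync_action
cat /proc/mdstat                      # watch progress
cat /sys/block/md0/md/mismatch_cnt    # non-zero means blocks disagreed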

View 9 Replies View Related

Fedora Servers :: Error Trying To Add File In Svn

Sep 27, 2010

I'm trying to add a file in svn and the following error occurs:

Code:

View 3 Replies View Related

Ubuntu Servers :: Performance For Large Scale Website Or Critical Mission?

Aug 6, 2011

Could any of you share your experience if you have been using Ubuntu for a large-scale website or mission-critical project, say 500,000 secure transactions per 3 hours with 4 million users accessing the server? How does Ubuntu perform?

View 1 Replies View Related

Programming :: Stat On Cifs - Fails With Error No 75 - Error Shows "Value Too Large For Defined Data Type"

Dec 29, 2010

I have Ubuntu 10.10 (kernel 2.6.35-22-generic) installed.

struct stat StatBuff;

[Code]...

I have mounted a Windows share folder on /mnt. When I pass any directory within /mnt/ to the stat function, it fails with errno 75; perror shows "Value too large for defined data type". Example 1 fails but Example 2 works fine.
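errno 75 (EOVERFLOW) from stat() typically means some value, often a CIFS server's 64-bit inode number or a large file size, does not fit the 32-bit stat structure. Two hedged fixes (source filename and share are hypothetical):

Code:
# build with large-file support so stat() uses 64-bit fields
gcc -D_FILE_OFFSET_BITS=64 stattest.c -o stattest
# or mount the share so the client generates its own 32-bit inode numbers
sudo mount -t cifs //server/share /mnt -o noserverino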

View 7 Replies View Related

Ubuntu :: Large .tar.gz File That I'm Trying To Extract?

Jan 4, 2011

I've got a large .tar.gz file that I am trying to extract. I have had a look around at the problem, and it seems other people have had it, but I've tried their solutions and they haven't worked. The command I am using is:

Code:
tar zxvf file.tar.gz
and the error is:

[code]...
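Since the error text is elided, a hedged first step is to check whether the archive itself is intact; a truncated download is the most common cause with large .tar.gz files:

Code:
# test the gzip layer without extracting anything
gzip -t file.tar.gz && echo "gzip stream OK"
# list the contents; a truncated tarball usually dies partway through
tar ztf file.tar.gz > /dev/null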

View 9 Replies View Related

Ubuntu Servers :: Mysqld.pid File Error On 8.04.4 LTS - Unknown Variable

Feb 4, 2010

I'm running Ubuntu 8.04.4 LTS to host a Moodle server in a school. Moodle is running OK, but I'm having problems with MySQL. Any mysql command generates:

unknown variable 'pid-file=/var/run/mysqld/mysqld.pid'

I've taken a look in /var/run/mysqld/ and there is no mysqld.pid file. I followed these instructions that I found elsewhere:

If there's no mysqld.pid inside the /var/run/mysqld directory, create mysqld.pid:
# cd /var/run/mysqld
# touch mysqld.pid

[code]....

But after a reboot the mysqld.pid file is missing again. I can access MySQL via phpMyAdmin, but webmin fails with: unknown variable 'pid-file=/var/run/mysqld/mysqld.pid'
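Two hedged notes. First, /var/run is typically cleared at boot, so a hand-made mysqld.pid will always vanish; mysqld recreates the file itself when it starts. Second, "unknown variable" suggests pid-file sits in a my.cnf section that client tools read; it is a server option and belongs under [mysqld] (or [mysqld_safe]):

Code:
# /etc/mysql/my.cnf -- keep pid-file in a server-only section
[mysqld]
pid-file = /var/run/mysqld/mysqld.pid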

View 8 Replies View Related

Ubuntu Networking :: When Transferring Large Files Using Scp Between Desktop Running Maverick And Other Servers?

Nov 12, 2010

Sometimes when transferring large files using scp between my desktop running Maverick and other servers running Ubuntu, Debian or CentOS, I get the following error message:

77% 258MB 11.3MB/s 00:06 ETA
Received disconnect from xxx.xxx.xxx.xxx: 2: Packet corrupt

I've found a seemingly related bug report on Launchpad, but the provided "ethtool" fix did not help. I'd be most grateful for any ideas on how to solve this issue. Some more info:

Linux lotus 2.6.35-22-generic-pae #35-Ubuntu SMP Sat Oct 16 22:16:51 UTC 2010 i686 GNU/Linux

lspci | grep eth -i
00:19.0 Ethernet controller: Intel Corporation 82567LM-3 Gigabit Network Connection (rev 02)
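As a workaround while the underlying NIC issue is hunted down, a hedged sketch: rsync over ssh can resume after each disconnect instead of restarting the whole copy (host and paths are placeholders):

Code:
# keep partial files and re-run until complete; each retry resumes where it died
rsync --partial --progress -e ssh bigfile.tar user@server:/data/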

View 1 Replies View Related

Ubuntu :: Copying A Large File From The Network?

Feb 17, 2010

I am trying to copy a file from a network resource on my Local Area Network, which is about 4.5 GB. I copy this file through GNOME copying utilities by first going to Places --> Network and then selecting the Windows Share on another computer on my network. I open it and start copying the file to my FAT32 drive by Ctrl+C and Ctrl+V. It copies well up-to 4 GB and then it hangs.

After trying it almost half a dozen times I got really annoyed and left it hung and went to bed. Next morning when I checked a message box saying "file too large to write" has appeared.

I am very annoyed; I desperately need that file. It's an ISO image and it is not damaged at all; it copies fine to any Windows system. Also, I have sufficient space on the drive to which I am trying to copy the file.

View 8 Replies View Related

Ubuntu :: Bzip2 Compressed File Too Large

Feb 26, 2010

I have been having a recurring problem backing up my filesystem with tar, using bzip2 compression. Once the file reached a size of 4 GB, an error message appeared saying that the file was too large (I closed the terminal, so I do not have the exact message; is there a way to retrieve it?). I was under the impression that bzip2 can support pretty much any size of file. It's rather strange: I have backed up files of about 4.5 GB before without trouble.

At the same time, I have had this problem before, and it's definitely not a memory problem: I am backing up onto a 100G external hard drive.

That reminds me, in fact (I hadn't thought of this), that one time I tried to move an archived backup of about 4.5 GB to an external drive (it may have been the same one) and it said that the file was too large. Could it be that there is a maximum file size I can transfer to the external drive in one go? Before I forget: I have Ubuntu Karmic, and my bzip2 version is 1.0.5 (and tar 1.22, though maybe this is superfluous information?).
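If the external drive is FAT-formatted, it caps single files at 4 GB regardless of what bzip2 supports. A hedged sketch that streams the archive through split so no single piece reaches the cap (paths and chunk size are hypothetical):

Code:
# write the backup as 3900 MB chunks; restore with: cat backup.tar.bz2.part.* | tar xjf -
tar cjf - /home/user | split -b 3900m - /media/external/backup.tar.bz2.part.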

View 4 Replies View Related

Ubuntu :: 699MB 10.04 File Too Large For 700MB CD?

May 2, 2010

I am attempting to burn the ISO for Lucid Lynx final onto a 700MB CD. The ISO file is 699MB, but Windows reports that the size on disk is 733MB and thus CD Burner XP refuses to burn the file, stating that it's too large for a CD.

Why this discrepancy in file sizes? I've noticed this with other files as well; suddenly it's a bit of a problem, as you can see!

View 6 Replies View Related

Ubuntu :: 10.04 Hangs During Large File Transfers?

Aug 1, 2010

I recently built a home media server and decided on Ubuntu 10.04. Everything is running well except when I try to transfer my media collection from other PCs where it's backed up to the new machine. Here's my build and various situations:

Intel D945GSEJT w/ Atom N270 CPU
2GB DDR2 SO-DIMM (this board uses laptop chipset)
External 60W AC adapter in lieu of internal PSU
133x CompactFlash -> IDE adapter for OS installation
2x Samsung EcoGreen 5400rpm 1.5TB HDDs, formatted ext4

Situation 1: Transferring 200+GB of files from an old P4-based system over gigabit LAN. Files transferred at 20MBps (megabytes, so there's no confusion). Took all night but the files got there with no problem. I thought the speed was a little slow, but didn't know what to expect from this new, low-power machine.

Situation 2: Transferring ~500GB of videos from a modern gaming rig (i7, 6GB of RAM, running Windows7, etc etc). These files transfer at 70MBps. I was quite impressed with the speed, but after about 30-45 minutes I came back to find that Ubuntu had hung completely.

I try again. Same thing. Ubuntu hangs after a few minutes of transferring at this speed. It seems completely random. I've taken to transferring a few folders at a time (10 GB or so), and so far it has hung once and been fine the other three times. Now, I have my network MTU set from automatic to 9000. Could this cause Ubuntu to hang like this? When I say hang, I mean it freezes completely, requiring a reboot. The cursor stops blinking in a text field, the mouse is no longer responsive, etc.
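Since the only obvious deviation from defaults is the 9000-byte MTU, a hedged first experiment is to put it back to 1500 and repeat one of the fast transfers (interface name assumed):

Code:
# temporarily restore the standard MTU; if the hangs stop, jumbo frames were the trigger
sudo ifconfig eth0 mtu 1500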

View 4 Replies View Related

Ubuntu Servers :: Error While Loading Shared Libraries - No Such File Or Directory

Jan 25, 2010

dig has failed to run for about a month, since some upgrade. Platform: Ubuntu-Server 6.06.1 _Dapper Drake_ - Release amd64. Error:

Code:
dig: error while loading shared libraries: libdns.so.21: cannot open shared object file: No such file or directory
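A hedged sketch for pinning this down: list which shared objects dig cannot resolve, then reinstall the package that ships the utilities (the package name is an assumption for Dapper):

Code:
# show unresolved shared libraries, then reinstall the BIND client tools
ldd "$(which dig)" | grep "not found"
sudo apt-get install --reinstall dnsutils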

View 2 Replies View Related

Ubuntu Servers :: Error Occurs When We Use Move Upload File, To The New Directory Within Our PHP App

Oct 7, 2010

We are running into issues with a file upload script written in PHP. We can upload files without issues except for .*x files (such as .docx), where we get permission denied errors. The error occurs when we use move_uploaded_file() to move the upload to the new directory within our PHP app. If we give the uploads folder 777 access, it works fine without error. I don't like that. So I set it to 775 (I also don't like this), but it didn't work until I gave group ownership to www-data (I really don't like this).

This issue only happens on our production server, which is Ubuntu 9.04 running Apache 2.2 and PHP5 with all the newest updates. We also have all MIME types configured, and we are able to download the file from Apache without error. The first thing we noticed, before the file permissions error, was that the MIME type changed to .zip when we used the mime_content_type() function. Yet the FILES array still showed .docx.
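For reference, a middle-ground sketch that avoids both 777 and group write on the whole tree: hand only the upload target to www-data and keep everything else untouched (paths are hypothetical):

Code:
# only the uploads directory is writable by Apache; 750 keeps everyone else out
sudo chown www-data:www-data /var/www/app/uploads
sudo chmod 750 /var/www/app/uploads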

View 2 Replies View Related

Ubuntu :: Can't Copy A Large 30gig Image File?

Jan 3, 2010

I have some large image files that are 30 GB and more. I am running Ubuntu 9.10. Whenever I try to copy one of these files to another drive, I get an error saying the file is too large. I am copying from an external hard drive or a slave drive; it does the same thing either way. I have a friend who has expressed the same issue. This must be a widespread bug.

View 9 Replies View Related

Ubuntu :: File System Format For Mac OSX 10.5 For Large Files?

Sep 19, 2010

Is there a file system that both Mac OS X 10.5 and Linux can read/write for large files (like 4 GB files)? My desktop is Ubuntu and I run most things from there, but I want to back up my MacBook and Linux box on the same external hard drive. It seems there are some (paid) apps for Mac that will mount NTFS, but I'm wondering if there is a shared file system that will work for both.

View 9 Replies View Related






