Ubuntu :: 699MB 10.04 File Too Large For 700MB CD?

May 2, 2010

I am attempting to burn the ISO for Lucid Lynx final onto a 700MB CD. The ISO file is 699MB, but Windows reports that the size on disk is 733MB and thus CD Burner XP refuses to burn the file, stating that it's too large for a CD.

Why the discrepancy in file sizes? I've noticed this with other files as well; now it's suddenly a bit of a problem, as you can see!

View 6 Replies


Ubuntu Networking :: Transfer A Large File (Say 700MB Or So) And Wireless Shuts Down?

Jun 6, 2010

I have a problem that I can't seem to fix. When I try to transfer a large file, say 700MB or so, my wireless shuts down and I have to restart my Ubuntu computer. The other computer is Vista: Ubuntu is on a WUSB54G ver4 and Vista is going through a WRT54G-TM running DD-WRT mega. I have tried everything I know with no luck.

View 4 Replies View Related

Ubuntu :: Double Compression Technique Shrunk A 700mb File Down To 40mb?

Dec 31, 2010

I downloaded a file that was 40mb compressed and was almost 700mb when fully extracted. It was inside a .rar file that in turn was inside another .rar file. How can this be done in Ubuntu? Can this also be done with .zip and .7z files?
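
Nested archives like that can be reproduced from the command line; here is a rough sketch, assuming the rar, zip and p7zip-full packages are installed (all file names are made up):

Code:
# compress once, then compress the result again; the same pattern works for zip and 7z
rar a inner.rar bigfile.dat
rar a outer.rar inner.rar
zip inner.zip bigfile.dat && zip outer.zip inner.zip
7z a inner.7z bigfile.dat && 7z a outer.7z inner.7z

Note that the second pass only saves space when the data is extremely redundant; compressing already-compressed data usually gains nothing, so a 700MB-to-40MB result says more about the file's contents than about the double compression itself.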

View 2 Replies View Related

Ubuntu :: Compress The 699MB Ubuntu ISO Down To 50MB - 120MB?

May 13, 2010

Is there a way to compress the 699MB Ubuntu ISO down to around 50MB - 120MB? If it's possible, is there anyone holy enough to do it and upload it? I badly need 10.04, my last downloaded ISO failed to install, and now I have run out of data bundles.

View 9 Replies View Related

General :: View A Particular Ten Lines In A Large File That Can't Be Opened In Vi

May 12, 2010

I am using RHEL 5. I have a very large test file which cannot be opened in vi. The file has some 8000 lines. I need to view the ten lines between 5680 and 5690. How can I view these particular lines in a large file? What command and options do I need to use?
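
A couple of ways to do this without opening the file at all, assuming GNU sed and awk are available (the file name is a placeholder):

Code:
# print only lines 5680-5690 and stop reading once past them
sed -n '5680,5690p; 5690q' bigfile.txt
# the same with awk
awk 'NR>=5680 && NR<=5690' bigfile.txt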

View 1 Replies View Related

Ubuntu :: How Would Files Adding Up To 724MB Fit On A 700MB CD

Jul 9, 2010

Would anyone know how to solve this problem I'm having? I copied some files from a normal CD, then tried to burn them to another CD of the same size, but it says the new CD is too small by 24MB, which I couldn't work out, since they fitted on the original. Have the original files been compressed somehow? If so, how would I do the same to fit them on the new CD?

View 1 Replies View Related

Ubuntu Installation :: How To Make LiveCD Of 700MB Space

Oct 23, 2010

I want to make a live CD for Ubuntu, but upon getting the ISO I found that it was 709MB, and my disc is 700MB. How can I proceed in making this live CD? If I use some sort of overburning software, will the live CD work? Please recommend software to do so if it does indeed work.
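
For what it's worth, wodim (the cdrecord replacement shipped with Ubuntu) does have an overburn switch, though whether a 709MB image actually fits depends on the disc and the drive. A hedged sketch, assuming the burner is /dev/sr0 and the image name is a placeholder:

Code:
# attempt to overburn a slightly oversized image; this can fail on some drives/media
wodim -v -overburn dev=/dev/sr0 image.iso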

View 8 Replies View Related

Software :: Install LYX Without The 700mb+ Of Dependencies?

Jun 20, 2010

I tried to install LyX today, but because of the massive overhead I decided to install it without its 'recommended' packages. From previous experience, some of these packages were just the same font in different sizes, which made font selection in other word-processing applications tiresome as I scrolled through the list.

Code:
apt-get install lyx --no-install-recommends

Only 8mb of disk space was needed! But at a cost.

Although, AFAIK as a user who has only spent five minutes in LyX (four of which were spent gawking at how easy it is to write mathematical symbols), the part that does not work is the rendering: I cannot export as DVI, PDF, etc.

When I open a document, I get this error:

Quote:

The layout file requested by this document, article.layout is not usable. This is probably because a LaTeX class or style file required by it is not available. See the Customization documentation for more information. LyX will not be able to produce output.

What packages are absolutely necessary for LyX to work at a bare minimum? My monthly Internet usage is capped, and 700mb is a massive blow to it.

By the way, the documentation it recommends I read ('Customization') covers creating new layouts and why I might not have someone else's layout that a document requires. It is obvious, however, that 'article.layout' is a standard component, and is probably provided by one of the many recommended packages.

The download of recommended packages is ~ 400mb, but once extracted is ~700mb. That is still a large download for my internet connection currently.

Code:
The following NEW packages will be installed:
dvipng lacheck latex-beamer latex-xcolor libt1-5 lmodern luatex lyx pgf
preview-latex-style prosper ps2eps psutils texlive-base texlive-binaries
texlive-common texlive-doc-base texlive-extra-utils texlive-font-utils
texlive-fonts-recommended texlive-fonts-recommended-doc

[Code].....
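
A rough, untested middle ground: article.layout maps onto LaTeX's standard article class, which ships in texlive-latex-base, so pulling in just the core LaTeX packages alongside LyX may be enough for basic DVI/PDF export. Package names here are Ubuntu's and this is a guess, not a verified minimum:

Code:
sudo apt-get install --no-install-recommends lyx texlive-latex-base texlive-latex-recommended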

View 5 Replies View Related

OpenSUSE Hardware :: Copy A 700MB .avi To A USB Stick

Apr 16, 2011

Been using 11.4 for the last few days and I think I might really like it. I thought getting my Broadcom wireless to work would be the hard part, but I'm having real trouble with USB transfers for some reason. I tried to copy a 700MB .avi to a USB stick yesterday; I know the stick is good, as I use it regularly. I got the KDE notification that it had copied the file (it seemed far too quick, but I didn't think much of it) and then went to watch it.

It didn't show up on my telly, and when I stuck it back in the computer to check, it had only copied an arbitrary 39MB of the file. I tried again this morning, copying three albums to my phone: I get the notification that copying has completed almost instantly, I let it sit for five minutes, and of the three albums one track has been transferred. It's an Android phone, so I can use a file browser to confirm they are there, but they're all 0 bytes in size.
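
That pattern (an instant "finished" notification followed by truncated or zero-byte files) usually means the data was still sitting in the kernel's write cache when the stick was pulled. A sketch of how to make sure everything has really been written before removing the device; the mount point is just an example:

Code:
# flush all pending writes to disk, then unmount cleanly
sync
umount /media/USB_STICK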

View 8 Replies View Related

Ubuntu :: "Error Splicing File: File Too Large" - Copy To A 500G USB Drive?

Sep 10, 2010

I have seen this three times now. It's an updated Lucid install with ext4, trying to copy to a 500GB USB drive.

View 3 Replies View Related

Ubuntu :: Large .tar.gz File That I'm Trying To Extract?

Jan 4, 2011

I've got a large .tar.gz file that I am trying to extract. I have had a look around at the problem and it seems other people have had it, but I've tried their solutions and they haven't worked. The command I am using is:

Code:
tar zxvf file.tar.gz
and the error is:

[code]...

View 9 Replies View Related

Ubuntu :: Copying A Large File From The Network?

Feb 17, 2010

I am trying to copy a file of about 4.5 GB from a network resource on my Local Area Network. I copy this file through the GNOME copying utilities by first going to Places --> Network and then selecting the Windows share on another computer on my network. I open it and start copying the file to my FAT32 drive with Ctrl+C and Ctrl+V. It copies well up to 4 GB and then it hangs.

After trying it almost half a dozen times I got really annoyed, left it hung, and went to bed. The next morning when I checked, a message box saying "file too large to write" had appeared.

I am very annoyed. I desperately need that file. It's an ISO image and it is not damaged at all; it copies fine to any Windows system. Also, I have sufficient space on the drive to which I am trying to copy the file.
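
Hanging at exactly 4GB strongly suggests the destination is FAT32, which cannot store a single file of 4GB or larger. A quick way to check, assuming the drive is mounted at /media/mydrive (the path is an example):

Code:
# 'vfat' in the Type column means FAT32 and its 4GB-per-file limit applies
df -T /media/mydrive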

View 8 Replies View Related

Ubuntu :: Bzip2 Compressed File Too Large

Feb 26, 2010

I have been having a recurring problem backing up my filesystem with tar, using bzip2 compression. Once the file reached a size of 4GB, an error message appeared saying that the file was too large (I closed the terminal, so I do not have the exact message; is there a way to retrieve it?). I was under the impression that bzip2 can support pretty much any size of file. It's rather strange: I have backed up files of about 4.5GB before without trouble.

At the same time, I have had this problem before, and it's definitely not a memory problem: I am backing up onto a 100G external hard drive.

That reminds me, in fact, (I hadn't thought of this) that one time I tried to move an archived backup of about 4.5Gb to an external (it may have been the same one) and it said that the file was too large. Could it be that there is a maximum size of file I can transfer to the external in one go? Before I forget, I have ubuntu Karmic and my bzip2 version is 1.0.5 (and tar 1.22, though maybe this is superfluous information?)
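
If the external drive turns out to be FAT32, the 4GB ceiling is per file rather than total capacity. One common workaround, sketched here with made-up paths, is to pipe tar through split so no single piece crosses the limit:

Code:
# write the backup as 3.9GB pieces instead of one huge archive
tar -cjf - /home/user | split -b 3900m - backup.tar.bz2.part
# later, reassemble and extract
cat backup.tar.bz2.part* | tar -xjf -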

View 4 Replies View Related

Ubuntu :: 10.04 Hangs During Large File Transfers?

Aug 1, 2010

I recently built a home media server and decided on Ubuntu 10.04. Everything is running well except when I try to transfer my media collection from other PCs where it's backed up to the new machine. Here's my build and various situations:

Intel D945GSEJT w/ Atom N270 CPU
2GB DDR2 SO-DIMM (this board uses laptop chipset)
External 60W AC adapter in lieu of internal PSU
133x CompactFlash -> IDE adapter for OS installation
2x Samsung EcoGreen 5400rpm 1.5TB HDDs formatted as ext4

Situation 1: Transferring 200+GB of files from an old P4-based system over gigabit LAN. Files transferred at 20MBps (megabytes, so there's no confusion). Took all night but the files got there with no problem. I thought the speed was a little slow, but didn't know what to expect from this new, low-power machine.

Situation 2: Transferring ~500GB of videos from a modern gaming rig (i7, 6GB of RAM, running Windows7, etc etc). These files transfer at 70MBps. I was quite impressed with the speed, but after about 30-45 minutes I came back to find that Ubuntu had hung completely.

I try again. Same thing. Ubuntu hangs after a few minutes of transferring at this speed. It seems completely random. I've taken to transferring a few folders at a time (10GB or so), and so far it has hung once and been fine the other three times. Now, I have my network MTU set from automatic to 9000. Could this cause Ubuntu to hang like this? When I say hang, I mean it freezes completely, requiring a reboot. The cursor stops blinking in a text field, the mouse is no longer responsive, etc.
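
Jumbo frames are a plausible suspect if anything in the path mishandles an MTU of 9000. A quick, reversible test (assuming the interface is eth0) is to drop back to the standard MTU and retry the transfer:

Code:
# temporarily revert to the default MTU; this does not persist across reboots
sudo ip link set eth0 mtu 1500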

View 4 Replies View Related

Ubuntu Servers :: Large File Transfer On LAN?

Nov 11, 2010

I'm trying to create an Ubuntu Server file server that will handle large file transfers (up to 50GB) from the LAN with Windows clients. We've been using a Windows server on our LAN and the file transfers will occasionally fail... though that server is used for other services as well.

The files will be up to 50GB. My thought is to create a VLAN (or separate physical switch) to ensure maximum bandwidth. The Ubuntu server will be 64-bit with 4TB of storage in a RAID 5 config.

View 2 Replies View Related

Ubuntu :: Error "File Too Large" Copying 7.3gb File To USB Stick

Nov 24, 2010

I am trying to copy a 7.3GB .iso file to an 8GB USB stick, and I get the following error when it hits 4.0GB:

Quote:

Error while copying "xxxxxx.iso". There was an error copying the file into /media/6262-FDBB. Error splicing file: File too large

The file is to be used by a Windows user, and I'm just trying to do a simple copy, not a burn to USB or anything fancy. Using 10.04.1 LTS, AMD dual core, all the latest patches.
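
"Error splicing file: File too large" at the 4GB mark is the FAT32 per-file limit on the stick, not a problem with the ISO. Two hedged options: reformat the stick as NTFS or exFAT (both readable by Windows), or split the image and rejoin it on the other side. A sketch of the split route, keeping the original placeholder file name:

Code:
# split the ISO into 3.9GB chunks that fit on FAT32
split -b 3900m xxxxxx.iso xxxxxx.iso.part
# rejoin on Linux:   cat xxxxxx.iso.part* > xxxxxx.iso
# rejoin on Windows: copy /b xxxxxx.iso.partaa + xxxxxx.iso.partab xxxxxx.iso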

View 2 Replies View Related

Server :: NFS Large File Copies Fail - Error Writing To File: Input/output Error?

Jun 27, 2009

I recently upgraded my file/media server to Fedora 11. After doing so, I can no longer copy large files to the server. The files begin to transfer, but typically after about 1gb of the file has transferred, the transfer stalls and ultimately fails with the message:

"Error writing to file: Input/output error"

I've run out of ideas as to what could cause this problem. I have tried the following:

1. Different NFS versions: NFS3 and NFS4
2. Tried copying the files to different physical drives on the server.
3. Tried copying the files from different physical drives on the client.
4. Tried different rsize and wsize block sizes when mounting the NFS share
5. Tried copying the files via a different protocol. SSH in this case. The file transfers are always successful when I use SSH.

Regardless of what I do, the result is the same. The file transfers always fail after approximately 1gb.

Some other notes.

1. Both the client and the server are running Fedora 11 kernel 2.6.29.5-191.fc11.x86_64

I am out of ideas. Has anyone else experienced something similar?
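
For anyone wanting to repeat the rsize/wsize experiment from point 4, a sketch of the mount invocation; the export path and block sizes are examples, not a known fix:

Code:
# mount the export with explicit 32KB read/write sizes over TCP
sudo mount -t nfs -o rsize=32768,wsize=32768,proto=tcp server:/export/media /mnt/media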

View 13 Replies View Related

Ubuntu :: Can't Copy A Large 30gig Image File?

Jan 3, 2010

I have some large image files that are 30 gig and more. I am running Ubuntu 9.10, and whenever I try to copy one of these files to another drive I get an error saying the file is too large. I am copying from an external hard drive, and a slave drive does the same thing. I have a friend who has expressed the same issue. This must be a widespread bug.

View 9 Replies View Related

Ubuntu :: File System Format For Mac OSX 10.5 For Large Files?

Sep 19, 2010

Is there a file system that both Mac OS X 10.5 and Linux can read/write for large files (like 4GB files)? My desktop is Ubuntu and I run most things from there, but I want to back up my MacBook and Linux box on the same external hard drive. It seems there are some (paid) apps for Mac that will mount NTFS, but I'm wondering if there is simply a shared file system that will work for both.

View 9 Replies View Related

Ubuntu :: Large File Size With Xsane Save To PDF?

Jun 17, 2011

CanoScan LiDE 210 running under 10.10 on a Tosh Tecra M11-130 laptop. Currently trying out XSane to archive some paperwork in monochrome, as the bundled Simple Scan utility can only save in either colour or greyscale. The problem is that the same A4 page saved as monochrome has a file size about three times larger in Ubuntu than in Windows.

The scan mode options are 'Colour', 'Greyscale' or 'Lineart'. There is no 'Halftone' setting available, as shown in some of the XSane manuals; I don't know whether this is significant to the issue. XSane's main options window shows 3508 x 2480 x 1 bit for a 300 dpi A4 monochrome scan when 'Lineart' is selected, but the intermediate file size is 8.3MB instead of just over 1MB before packing into the PDF. This is consistent with each pixel being recorded not as a single 1 or 0, but as a greyscale 11111111 or 00000000, i.e. stored in an eight-bit field. How can I tweak XSane to produce true monochrome intermediate .pnm files and saved PDFs?
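
The arithmetic supports that reading: 3508 x 2480 pixels at 1 bit each is roughly 1.04MB, while the same pixels at 8 bits each come to roughly 8.3MB. As a hedged workaround, assuming ImageMagick and libtiff-tools are installed, the intermediate .pnm can be thresholded to a true 1-bit image and packed with Group 4 fax compression before building the PDF:

Code:
# force a genuine bilevel image and compress it with CCITT Group 4
convert scan.pnm -threshold 50% -type bilevel -compress Group4 scan.tif
# wrap the compressed TIFF in a PDF
tiff2pdf -o scan.pdf scan.tif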

View 5 Replies View Related

General :: Best Way To Copy A Large File Over NFS?

Aug 24, 2011

I want to transfer a huge file (60GB) over NFS on Linux. Is cp the best option?
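
cp works, but it gives no progress indication and cannot resume an interrupted copy. A common alternative for a single huge file, sketched with placeholder paths, is rsync, which keeps partial data and can pick up where it left off:

Code:
# copy onto the NFS mount with progress, resuming in place if interrupted
rsync --partial --inplace --progress /data/hugefile.img /mnt/nfs/hugefile.img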

View 1 Replies View Related

General :: Can't Copy Large File?

Mar 26, 2010

I'm trying to copy a 6GB file across from my laptop to an external USB drive, but it quits at about 4.2GB every time with a "file size limit exceeded" error. I have checked the output of ulimit -a and there is no limit there on the file size. I'm using the Slax live CD for this, as it always gets the job done.

View 8 Replies View Related

Ubuntu Servers :: NCFTPD Error Large File On X64 6.06LTS

Feb 25, 2010

I'm unable to download large files from ftp.

The ftp server is: NCFTPD.

My server is:

Version:
Distributor ID: Ubuntu
Description:    Ubuntu 6.06.2 LTS
Release:        6.06
Codename:       dapper
Linux morpheus 2.6.15-55-amd64-server #1 SMP Tue Dec 1 18:31:51 UTC 2009 x86_64 GNU/Linux

When I download, it gives error 131 (unknown).

I tried the same file on a 32-bit server, with the same versions of everything except 32-bit instead of x64.

View 1 Replies View Related

Ubuntu :: Root Filesystem Fills Up When Copying A Large File?

Mar 17, 2010

I was just copying a large (50GB) file from one mounted partition to another mounted partition (a USB drive), but before the operation completed, my root filesystem, on a separate partition, filled up. Because it filled up, I also couldn't get past the login when I rebooted; I think this is because there is no room to write temporary files. I'm expanding the root partition to fix this temporarily. How can I avoid my root file system filling up when copying a massive file between mounted partitions? It looks as if the file is being cached in root during the transfer.
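
One thing worth ruling out (a guess, not a diagnosis): if the destination's mount point exists but the drive is not actually mounted, the copy silently writes into that directory on the root filesystem. A quick check, with an example mount point:

Code:
# confirm the destination is really a separate mounted filesystem
mount | grep /media/usbdrive
df -h /media/usbdrive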

View 3 Replies View Related

Ubuntu Servers :: System Crash When Copying Large File

Jun 15, 2010

I am having a bit of a problem with my Ubuntu Server 10.04 install. I think it might be a kernel problem. Basically, what happens is when I copy a large file (a 160GB disk image) to my drive (>60GB) the system consistently crashes after about 60GB of the file is transferred. It doesn't matter if I am sending the file using cifs, or over SSH. Checking syslog (paste dump here), it seems these flush errors always appear shortly before the crash occurs. The destination filesystem is a hardware RAID 10 array with 2TB of space. It is formatted as EXT4.

View 7 Replies View Related

Ubuntu :: Large File Copy Operations Freezing Up Computer?

Sep 9, 2010

OK, so I'm running Mint (not full Ubuntu), and I'm not sure if this is a problem with the FS, the kernel, or what. I'm running Linux Mint 9, x64, kernel 2.6.35.14, and when I did a large file copy operation (9GB) it froze up my system until the copy operation was done. I couldn't even use Pidgin, Mozilla, or anything; when I tried to open another terminal it froze as well.

View 1 Replies View Related

Ubuntu :: Natty Large File Copies Cause Dramatic Slowdown?

Jul 30, 2011

If I initiate a file copy of more than a couple of GiB, the PC goes into a dramatic slowdown. Even selecting a different subdirectory through Nautilus can take 30+ seconds. Now, whilst I appreciate that file copying puts a load on the bus, DMA and, to a certain extent, the CPU, it seems unconscionable that it makes the PC effectively unusable until the copy has completed. Is there any way to (practically) lower the priority, or similar, of the copy process so that one can continue to use the platform during large file copies? Right now, I end up using a laptop adjacent to the PC whenever I have to copy a large file. This, for a mainstream operating system, is frankly ludicrous.

I'm aware that I could run Nautilus 'nicely', but I don't want to make changes which would compromise other aspects of the system. It would also be pleasant, for a change, not to have to read a couple of telephone directories of technical documentation in order to resolve the problem myself. This must be a general problem and, in my view, something which seriously compromises the usefulness of Ubuntu given the sizes of contemporary drives and files.

The i7 I'm using at the moment has 8 cores, none of which go over 15% usage while the copy is progressing, despite the OS being effectively frozen for long periods.

Environment: Intel Core i7-2600 @ 3.40GHz; 16GB RAM; Natty (11.04) fully updated as at 29th July 2011; kernel 2.6.38-10-generic; GNOME 2.32.1. Primary drive has 563.9GiB free space; running in Ubuntu 'Classic' (no effects) mode.

Disk /dev/sdc: 2000.4 GB, 2000398934016 bytes
255 heads, 63 sectors/track, 243201 cylinders
Units = cylinders of 16065 * 512 = 8225280 bytes

[code]....
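
There is no guaranteed fix here, but one low-risk experiment is to run the copy itself at idle I/O priority so that interactive applications keep getting disk time. A sketch using ionice, with placeholder paths:

Code:
# run the copy in the idle I/O scheduling class; foreground work is served first
ionice -c3 cp /media/source/bigfile.mkv /media/dest/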

View 2 Replies View Related

General :: How To Send A Large File Securely

Aug 28, 2011

I need to send large files from a Linux machine to another using cryptography. The sender machine knows the recipient IP but not vice-versa. I don't need strong cryptography and prefer higher-speed less-secure solutions.

There is no problem with pre-sharing crypto keys, but I'd prefer not to deal with creating SSH users.

I'm thinking of HTTP PUT over TLS, but I have never used it and would prefer to hear what the possible solutions are. I know it can listen as a daemon, but I don't know much about cryptography, so piping through OpenSSL may be a solution.
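
Since piping through OpenSSL is already on the table, here is a minimal sketch of that idea with a pre-shared passphrase file and netcat. Host, port and file names are placeholders, netcat flags vary between versions, and this gives confidentiality only, with no authentication of either end:

Code:
# receiver: listen, decrypt with the shared key, write to disk
nc -l -p 9000 | openssl enc -d -aes-256-cbc -pass file:shared.key > bigfile

# sender: encrypt with the shared key and stream to the receiver's IP
openssl enc -aes-256-cbc -pass file:shared.key < bigfile | nc 192.0.2.10 9000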

View 2 Replies View Related

Software :: Use Csplit To Split Up A Large File?

Jan 11, 2010

I use csplit to split up a large file and I get files xx01, xx02 and so on. I then use a for loop to loop through the files.

Code:
for f in xx*   # unquoted glob so the shell expands it to the csplit output files
do
echo test
echo "$f"

[Code].....
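
For reference, a hedged example of the csplit step itself that produces the xx-prefixed pieces the loop above expects; the file name and pattern are made up:

Code:
# split bigfile.log at every line matching START, repeating for the whole file (GNU csplit)
csplit -k bigfile.log '/^START/' '{*}'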

View 3 Replies View Related






