Ubuntu :: Libreoffice Writer Or Natty Produce Huge Files
May 3, 2011
I do monthly reports by copying the previous document, updating the text and changing the images. The images are the same size and number each month. Since upgrading my laptop to Natty last month, my document has jumped from 942 kB to 10.1 MB as .odt. When exporting to PDF, the usual size of 472 kB went up to 1.9 MB. I have searched the net and the forums but haven't found anything about a similar issue.
I'm not sure whether the issue comes from the previous document having been produced in OpenOffice.org and now being updated and saved in LibreOffice, or whether it is somehow related to the upgrade from Maverick to Natty. I would hope I don't have to uninstall LibreOffice and install OpenOffice.org as a solution (which I understand is not entirely easy in Natty, since OpenOffice.org is now a transitional package pointing to LibreOffice). I can't email customers simple documents that are over 10 MB.
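An .odt file is just a ZIP archive, so a quick way to see which embedded parts are inflating the document is to list the archive contents by size (a sketch; report.odt is a placeholder name):
Code:
# list the parts inside the .odt, largest last (report.odt is a hypothetical file name)
unzip -l report.odt | sort -n -k1 | tail -n 15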
View 1 Replies
Sep 8, 2015
I use SCIM to input Chinese in LibreOffice Writer. It worked on one of my computers until a couple of days ago.
Normally, to start SCIM I press Ctrl and the space bar at the same time, then use the Cangjie input method to type Chinese.
Now when I do so in LibreOffice Writer, "English/European" appears in the SCIM panel. The only other choice is "English/Keyboard", and neither lets me input Chinese.
In all other software SCIM works as before.
The computer runs Debian sid. Another computer that also runs Debian sid does not have this problem. I have LibreOffice version 5 on both computers.
View 1 Replies
View Related
Jun 21, 2011
I'm running Kubuntu 11.04 and LibreOffice Writer 3.3.2. I'm trying to print a #10 envelope on an HP LaserJet 5M printer. When I create the envelope and attempt to print it, LibreOffice sets the paper size to #10 Envelope, as it should. I put the envelope into the manual feed tray and initiate the actual printing operation. The printer goes through the motions of processing the output, and then goes back to the READY state without printing anything. I've tried all possible Paper Tray settings and also the CUPS drivers rather than the Foomatic drivers. None of that makes a difference. What does make a difference is setting the paper size to Letter, but then, of course, the information prints in the wrong place on the page.
View 2 Replies
View Related
Jun 4, 2011
I wrote a document using LibreOffice Writer. It has heading styles and a generated table of contents. I saved the document as ODT and as PDF, and loaded the PDF into Calibre as an e-book.
I converted the PDF to EPUB format for Nook(tm) devices. I copied the EPUB to my 'droid phone and loaded it into an EPUB reader. Voilà! I can see the table of contents as page #1, but I cannot see any other pages in the EPUB. I can read the original PDF just fine.
Q1. What must I do along the way so that I have a useful EPUB? (I don't need fancy, just complete and readable.) Q2. Are there things that I need to do to the PDF in Calibre? Q3. Are there things I need to do in LibreOffice Writer, or to the generated PDF, before I convert it to EPUB with Calibre?
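One route worth trying is to skip the PDF entirely and convert straight from the ODT, since PDF is a poor source for reflowable EPUB. A sketch assuming Calibre's command-line tool ebook-convert is installed and the original ODT is at hand (report.odt is a placeholder name):
Code:
# convert the original ODT directly to EPUB instead of going ODT -> PDF -> EPUB
ebook-convert report.odt report.epub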
View 3 Replies
View Related
Jul 31, 2011
Using openSUSE 11.4 64-bit, KDE.
I have been having problems with LibreOffice Writer crashing every time I try to save a file that contains any kind of multimedia (usually images), so I decided to completely remove the app and any trace of it and start again.
After complete removal I reinstalled 3.3.1 from the openSUSE repo.
The problem I have now is that when I start typing in Writer, absolutely nothing happens. After roughly a minute the text I typed appears. If I try to backspace to delete, or type some more, the behaviour persists.
Have I missed installing some package?
View 3 Replies
View Related
Apr 4, 2010
I recently bought a new laptop, a Compaq Presario CQ41-109AU. It does not come with a pre-installed OS, so I decided to put Ubuntu on it (Karmic Koala 9.10). I completed the installation successfully; however, it does not produce sound when logging in or when playing music files. I checked whether the settings were set to mute, but they were not. I tried installing MS Vista on the other partition and the sound worked.
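As a first diagnostic, the standard ALSA tools (which Karmic ships) can confirm whether the card is detected and whether a channel is muted; a minimal sketch:
Code:
aplay -l                   # list the playback devices ALSA detected
alsamixer                  # check Master/PCM levels; the 'M' key toggles mute
speaker-test -c 2 -t wav   # play a short test sound on both channels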
View 1 Replies
View Related
Sep 11, 2010
I have a hard drive with several thousand photos. These photos are in different formats: some are TIF, some JPG, some raw (CR2). The files are spread across dozens of directories. What I want to do is produce a list of all the files, in all of the directories, sorted by file name (not by path), listing the location, file name, size and date created.
For instance I may have a file called photo1.jpg in /photos/pics/
I may also have a file called photo1.cr2 in /photos/misc/ and a file called photo1.tif in /photos/processed/summer/.
I would like a text file that would look like this:
/photos/misc/photo1.cr2 2536658 2010-07-09 13:17
/photos/pics/photo1.jpg 320046 2010-07-07 14:47
/photos/processed/summer/photo1.tif 234456689 2010-07-10 09:22
Of course I want it to do this for all of the photos. I'm pretty sure there is a way to do this with a minimal amount of work. I have no problem with using the command line.
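There is indeed a short way to do this with GNU find. A sketch that sorts on the base file name and prints path, size in bytes and modification time (Linux filesystems generally don't record a creation time, so mtime stands in for it):
Code:
# print "basename<TAB>path<TAB>size<TAB>mtime", sort on the basename, then drop that column
find /photos -type f \( -iname '*.jpg' -o -iname '*.tif' -o -iname '*.cr2' \) \
  -printf '%f\t%p\t%s\t%TY-%Tm-%Td %TH:%TM\n' | sort -f -t$'\t' -k1,1 | cut -f2- > photolist.txt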
View 3 Replies
View Related
Jul 7, 2011
So I noticed today that my machine's root hard drive had almost no space left on it.
I ran the Disk Usage Analyzer and found out that my /var/log folder is 95 GB.
The large logs are:
Can I just delete them? Also, how can I stop this from happening again?
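If the space is needed right now, it is safer to empty the live files than to delete them, because rsyslog keeps them open and rm alone won't free the space until the daemon is restarted. A sketch (adjust the names to whichever logs are actually huge):
Code:
sudo truncate -s 0 /var/log/syslog /var/log/kern.log   # empty the live files in place
sudo rm /var/log/*.1 /var/log/*.gz                     # rotated copies are closed and can simply go
To stop it happening again, the message that is flooding the logs needs to be identified and fixed; logrotate only caps the damage.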
View 1 Replies
View Related
Nov 19, 2010
How can I delete my log files? They are 131.2 GB, and I need the space on my PC. Is it OK to delete them?
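Deleting the rotated copies (syslog.1, *.gz and so on) is safe; the live files are better emptied than removed, since the logging daemon keeps them open. To keep them small in the future, the existing logrotate stanza for these files (on Debian/Ubuntu, /etc/logrotate.d/rsyslog) can be given a size cap; a sketch of the relevant directives rather than a complete file:
Code:
/var/log/syslog
{
    rotate 4
    size 100M     # rotate once the file exceeds 100 MB (checked each time logrotate runs)
    compress
    missingok
    notifempty
}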
View 6 Replies
View Related
Nov 8, 2010
I am going through a multi-step process to produce output files, which involves 25,000 greps at one stage. While I do achieve the desired result, I am wondering whether the process could be improved (sped up and/or decluttered). Input 1: a set of dated files called ids<yyyy><mm> containing numeric IDs, one per line, 280,000 lines in total:
Code:
123456
999996
[code]....
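One common way to declutter this sort of pipeline is to replace the 25,000 individual greps with a single fixed-string pass over the data. A sketch under assumed names: all_ids.txt is the combined ID list and datafile stands for whatever the IDs are currently being grepped out of:
Code:
cat ids?????? | sort -u > all_ids.txt          # every ID once, duplicates removed
grep -Fwf all_ids.txt datafile > matches.txt   # -F fixed strings, -w whole words, -f patterns from file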
View 14 Replies
View Related
Jul 25, 2010
I have a 60 GB partition with / and home on it. I logged on yesterday and it gave me a warning saying that I had only 1.9 GB of disk space left. I ignored it for a day and assumed that I had too many videos and pics. But the next day, although I had not added any files or downloaded any software, I had 0 B left. I used the Disk Usage Analyser and found that 33 GB came from /var/log. It was from two log files, syslog and daemon.log, 16.5 GB each! I opened them up and found that this line of text was repeated hundreds of thousands of times.
Code:
Jul 22 19:32:36 aulenback-desktop ntfs-3g[5315]: Failed to decompress file: Value too large for defined data type
[code]...
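Deleting the two files frees the space, but the real cause is the corrupt compressed file on the NTFS volume that ntfs-3g keeps complaining about. As a stop-gap, rsyslog can be told to discard that particular message before it is written anywhere; a sketch in the old-style filter syntax, placed near the top of /etc/rsyslog.conf (or a file under /etc/rsyslog.d/) and followed by an rsyslog restart:
Code:
# drop the ntfs-3g flood instead of writing it to syslog and daemon.log
:msg, contains, "Failed to decompress file" ~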
View 3 Replies
View Related
Jul 27, 2011
I have about 2 TB of 700 MB AVI files as data on disc and want to spread it across two 2 TB external USB drives (3.5" SATA inside the housing). Obviously I have to rip them to the laptop and then move them to the external HDDs (omg, laborious little task). Am I better off doing the ripping in Meerkat or on a Windows machine? The files need to be accessible from W7, XP and Meerkat with VLC player. What should I format the drives to?
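For drives that have to be writable from W7, XP and Ubuntu alike, NTFS is usually the pragmatic choice (FAT32 is ruled out by its 4 GB file-size limit, and ext formats need extra drivers on Windows). A formatting sketch, with /dev/sdX1 as a placeholder for the actual partition:
Code:
sudo mkfs.ntfs -Q -L movies1 /dev/sdX1   # -Q quick format, -L sets the volume label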
View 5 Replies
View Related
Oct 29, 2010
I'm looking for a fast way to verify a copy of a folder with 150 GB of data in 33 files. Some of the files are a few kB, while a few are 20-30 GB. I've done a file count, which is quick, but doesn't verify that all the files are intact. I tried running md5sum on them, which works, but will probably take as long as copying the files in the first place. diff works too, but is also slow.
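A middle ground is rsync in dry-run mode: with default options it only compares size and modification time (fast), and with --checksum it reads and verifies the contents (thorough, but roughly as slow as the original copy). A sketch, with /source/ and /backup/ as placeholder paths:
Code:
rsync -ain /source/ /backup/             # quick pass: lists files whose size or mtime differ
rsync -ain --checksum /source/ /backup/  # thorough pass: compares actual file contents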
View 1 Replies
View Related
Jun 24, 2011
I have a WAV file bigger than 8 GB. I recorded it on a Windows PC. Unfortunately WAV files can't be bigger than 2 GB, yet somehow I got a file that is almost 9 GB. I tried to chop the file into smaller pieces under Ubuntu so I could open it part by part. I used GNOME Split to divide the file and made 10 parts out of it. Now I have these parts of the data, which I can't read with any program except GNOME Split, and that would only merge them together again, which brings me back to the beginning of my problem. So my question is: is there any other way to open, or split and then open, a WAV file of that size, or maybe a way to open the split file partially?
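One other route worth a try: instead of splitting the raw bytes, cut the audio itself into valid half-hour WAV files. A sketch assuming ffmpeg is installed and manages to parse the oversized header (it often does, even when the RIFF size field has overflowed):
Code:
# stream-copy the PCM data into 30-minute segments: part000.wav, part001.wav, ...
ffmpeg -i big.wav -f segment -segment_time 1800 -c copy part%03d.wav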
View 3 Replies
View Related
Jan 29, 2016
I installed Debian on my laptop and now I have a warning that my log files inside /var are getting out of hand... The following files:
36G daemon.log
48G daemon.log.1
41G kern.log
55G kern.log.1
31G messages
42G messages.1
8.2G syslog
17G syslog.1
How can I clear these and set things up properly so they don't take up so much space?
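Before emptying them, it is worth finding out what is flooding the logs, otherwise they will just fill up again. A rough frequency count of the biggest file (a sketch; it strips the timestamp and hostname columns and shows the most repeated messages):
Code:
sudo awk '{ $1=$2=$3=$4=""; print }' /var/log/kern.log.1 | sort | uniq -c | sort -rn | head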
View 6 Replies
View Related
Feb 12, 2011
I've started using the huge.s kernel, and when I try to compile packages Slackware complains about kernel headers, but all I see on the Slackware discs are the SMP header files?
View 2 Replies
View Related
May 18, 2010
I examined the problem and determined that a huge amount of disk space is being taken up by files in /var/log/. The following files:
Code:
/var/log/messages.1
/var/log/kern.log.1
/var/log/daemon.log.1
/var/log/messages
/var/log/kern.log
/var/log/daemon.log
/var/log/syslog
are all over 1 GB in size. The largest is 18 GB. Together, they total 48.3 GB. I restarted the system, forcing a fsck.
View 3 Replies
View Related
May 19, 2010
Is there any software in Linux to view huge .txt files, say, over 10 MB? I'm now using the default gedit, version 2.28.0, which doesn't seem to be able to open huge .txt files. It's the same with the default .txt viewer in Windows, though WinWord seems to work fine there. Is there software under Linux to browse huge .txt files?
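For plain viewing, a pager copes with files far bigger than gedit can handle, and split can break a file into chunks an editor will accept. A quick sketch (huge.txt is a placeholder name):
Code:
less huge.txt                    # pages through the file without loading it all into memory
split -l 100000 huge.txt part_   # break it into 100,000-line pieces: part_aa, part_ab, ...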
View 5 Replies
View Related
Oct 9, 2010
syslog, messages and kern.log are incredibly huge files that are taking up a lot of space on my hard drive. Is it safe to remove them and/or to reduce logging so it doesn't take up such an enormous amount of disk space? If so, how can I reduce the logging so it doesn't produce logs that are tens of GB in size? Also, mounting a drive places it in the folder /media. Will it become problematic if the size of the mounted drive exceeds the amount of free space available on my Ubuntu partition?
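The files are safe to remove (or better, to empty, since rsyslog holds them open), and the logging level can also be lowered so kern.log and friends only receive warnings and errors. A sketch of the idea for Ubuntu's rsyslog setup, where the selectors live in /etc/rsyslog.d/50-default.conf (restart rsyslog afterwards); this is an illustration, not the full file:
Code:
# original selector logs everything from the kernel facility:
#   kern.*          -/var/log/kern.log
# restricting it to warning and above cuts the volume considerably:
kern.warning        -/var/log/kern.log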
View 4 Replies
View Related
Feb 2, 2011
I started getting errors about running out of disk space on root this morning. I hunted down what's taking all the space: /var/log is 39 GB (Ubuntu is installed on a 50 GB partition). It's specific files that live in that directory, not subfolders. The files are:
kern.log = 11.6 GB
messages (plain text file) = 11.4 GB
kern.log.1 = 6.1 GB
[code]...
View 9 Replies
View Related
Jul 4, 2011
I streamed video through my computer with MediaTomb yesterday. The problem is that now I have these huge log files. I am running out of disk space (less than 1 GB left) as we speak. They're filled with ufw entries, but my question is:
I read somewhere about a program called logrotate that is supposed to keep logs from getting too big. Is that wrong, or should MediaTomb really generate three separate log files with 5 GB of data each for just two hours of streaming?
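logrotate does exist and normally caps these files, but it typically runs only once a day, so a two-hour flood can fill the disk before it gets a chance. Since the entries are ufw's, the quickest relief is to turn its logging down or off; a sketch using ufw's own commands:
Code:
sudo ufw logging low   # keep only a modest amount of firewall logging
sudo ufw logging off   # or silence ufw log entries entirely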
View 2 Replies
View Related
Dec 22, 2010
I am facing a strange problem on my server. One of my filesystems shows as 3.1 GB when I execute the df -h command and the utilization shows as 83%, but when I cd to the directory /usr/local I cannot find any huge files in that filesystem, and I have searched for hidden files as well.
groupserver:~ # df -h
Filesystem Size Used Avail Use% Mounted on
/dev/sda9 3.1G 2.5G 532M 83% /usr/local
groupserver:/usr/local # du -sh *
0 bin
93M abinav
[Code]...
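Two usual suspects when df and du disagree like this: deleted files that some process still holds open, and files hidden underneath a mount point. A sketch for checking the first, plus a du invocation that includes dot-files:
Code:
sudo lsof +L1 | grep /usr/local                           # open-but-deleted files still consuming space
sudo du -shx /usr/local/* /usr/local/.[!.]* 2>/dev/null   # include hidden entries, stay on this filesystem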
View 2 Replies
View Related
Feb 17, 2011
I've recently converted two Ubuntu 10.04 machines (one 64-bit, one 32-bit) to LibreOffice. We've used OpenOffice.org since the early 2.x days. The issue is a database file created in OOo that now needs to be accessed in LO. As far as I know, there shouldn't be any issue with file compatibility. On the 64-bit machine (an Ubuntu server installation, not headless) the file opens with no trouble and is fully accessible (forms, reports, tables, etc.). However, on the 32-bit machine, trying to access the file over the network on the server via Samba (because we also have Windows machines on the network, so I serve it all up via Samba), I am able to open the database but get the following error when trying to access any table, form or report: The connection to the data source "Mattress Prices" could not be established.
[Code]...
View 4 Replies
View Related
Sep 7, 2010
My /var partition keeps filling up on all my servers, because the logs in /var/log/apache2 or /var/log/mysql are being deleted during log rotation but their file handles are being held open. Thus "du -sh /var/log" shows the correct values, but "df | grep /var" shows something much different.
It seems that the log files rotate; however, if I run "lsof | grep deleted" it returns lots of files that are no longer visible in the directory yet refuse to clear themselves off the disk.
The only way I have found to make these log files go away (and thus reclaim the disk space on the partition) is to restart either Apache or MySQL, depending on which process is holding the huge log files open.
Is it just me, or is this a big flaw in the way Linux works, that it can't figure out how to release the file handle for a log so the disk space can be reclaimed? This has been happening to me a lot lately.
Here is some output from one of my web servers so you can see what I am seeing...
root@web49:~# df -h | grep /var$
Filesystem Size Used Avail Use% Mounted on
/dev/sda8 9.2G 6.1G 2.7G 70% /var
root@web49:~# du -sh /var
[Code]....
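The handles are released as soon as the daemon is told to reopen its logs, which is why the restart works. Two gentler options: have logrotate reload the service after rotating (the stock Debian/Ubuntu configs do this in a postrotate block, so it is worth checking why that isn't taking effect), or use copytruncate so the original file is never unlinked. A sketch of the latter for the Apache logs, at the cost of possibly losing a few lines written during the copy:
Code:
/var/log/apache2/*.log {
    weekly
    rotate 4
    compress
    copytruncate   # copy the log aside and truncate it in place, so the open handle stays valid
}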
View 9 Replies
View Related
Apr 12, 2010
How do I create the huge.s kernel files on the Slackware discs? Or at least point me to a post if the same question has already been asked. I currently rsync my files with Alien BOB's script, and I use syslinux to install from my USB stick. I was wanting to install using a later kernel just for testing purposes (i.e. 2.6.34-rc3 as of this writing).
View 9 Replies
View Related
Aug 9, 2011
I've had a problem with LibreOffice: it won't open files. That is, File/Open doesn't bring up a dialog box. File/New works, and so does giving a filename on the command line when starting LibreOffice. Now there's good news and bad news. The good news is that, with help from some libreoffice-users folks, I now know what caused this and I have a workaround. The bad news is that it's a bug in LibreOffice and I need some help testing other versions, please. So here are the steps to reproduce:
(1) Login to a Gnome session (KDE may also work, dunno)
(2) Start libreoffice
(3) Select Tools/Options/LibreOffice/General
(4) Observe in that page a checkbox 'Use LibreOffice dialogue boxes'
(5) Ensure the checkbox is blank (not selected)
(6) Save settings and check that File/Open pops up a dialog box.
(7) Quit libreoffice and log out of the session
(8) Login to an LXDE session
(9) Start libreoffice
(10) Select File/Open
*(11) Check whether a dialog box appears
(12) If not, select Tools/Options/LibreOffice/General
*(13) Check whether there is a checkbox 'Use LibreOffice dialogue boxes'
If your system does NOT open a dialog box in step 11 and does NOT have a checkbox in step 13, you are seeing the same bug as me. I'm using openSUSE 11.3.
View 6 Replies
View Related
Oct 26, 2009
I need to transfer a massive amount of data (2.5 terabytes, many files, a directory structure) to an embedded RAID box which has a minimal Linux on it (some custom distro from Western Digital). We tried rsync (version 2.6.7), but it crashes because the file list is too big for the available RAM (fixed in later versions of rsync, but I don't know how to update; it's not Debian based and there are no compiler tools). We tried NFS, but the maximum bandwidth achieved is around 1 MB/sec (CPU bound?), so it would take around three weeks this way. Samba has problems with big files (and we have some 20 GB files in there).
SCP isn't installed, and would probably also be CPU bound due to encryption, I think. So the only option left would be FTP. We're currently trying ncftp with the command "put -R /path/to/data/", but it's been running for over an hour, eating up most of the RAM and not using any bandwidth. I think it is still building a file list or something. FTP already worked for a single 20 GB file with acceptable bandwidth of about 12 MB/sec. Does anyone know a better FTP program (for the console) that can start transferring data right away, or at least display an estimated time for the copy preparation?
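If both ends have tar and netcat (many minimal embedded distros ship BusyBox versions of both, though that is an assumption here), a tar stream over a raw TCP connection avoids the per-file overhead, the file-list build and the encryption cost entirely. A sketch, with the box's hostname, the port and the paths as placeholders; note that netcat option spellings vary between the traditional and BSD variants:
Code:
# on the receiving RAID box: listen on port 9000 and unpack the incoming stream
nc -l -p 9000 | tar -xf -
# on the sending machine: stream the whole tree across
tar -cf - /path/to/data | nc raidbox 9000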
View 8 Replies
View Related
Sep 30, 2010
I need to figure out how to arrange for the fastest-possible read-access of a large or huge memory-mapped file. I'm writing high-speed real-time object-chasing software for a NASA telescope (on earth). This software must detect images of fast moving objects (across arbitrary fields of fixed stars), estimate what direction and speed the object image is traveling (based on the length and direction of a streak on the detection image), then chase after the object while capturing new 4Kx4K pixel images every 2~5 seconds, quickly matching its speed and trajectory, then continue to track and capture images until the object vanishes (below horizon, into earth shadow, etc).
I have created two star "catalogs". Both contain the same 1+ billion stars (and other objects), but one is a "master catalog" that contains all known information about each object (128 bytes per object == 143GB) while the other is a "nightly build" that only contains the information necessary to perform the real-time process (32 bytes per object == 36GB) with object positions precisely updated for precession and proper-motion each night. Almost always the information in the "nightly build" catalog will be sufficient for the high-speed (real-time) processes.
[Code]...
View 8 Replies
View Related
Mar 4, 2011
I received some MS PowerPoint files and cannot open them. One file was from PowerPoint 2010, and another was saved with the 2000-2010 option. These were rejected by Impress with the error "Version Incompatibility. Incorrect file version". When a file was saved with the PowerPoint 95 option, Impress tried to open it, but when RAM usage ramped up to over 1 GB, Impress automatically shut down. The only other information about the files is that they very likely contain graphics generated with MS Visio. Has anyone had similar experiences? Are there any workarounds?
[Code]....
View 7 Replies
View Related
Jul 14, 2011
Whenever I save a text file edited with LibreOffice, I get a new lock file, recognisable by its ".~lock." name prefix. This lock file prevents access (probably for safety reasons) to the original file. However, this is a serious impediment, as it forbids any further editing of the original file or document as long as the lock file has not been removed. Though erasing this lock file (which can be made visible with Ctrl-H if hidden) should free the original, in fact it does not. How do I unlock these locked files?
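When LibreOffice has genuinely exited (or crashed) and still refuses to edit the document, deleting the stale lock file by hand normally clears it. A sketch, with report.odt as a placeholder file name; only do this when no LibreOffice instance actually has the document open:
Code:
rm '.~lock.report.odt#'                      # remove the stale lock sitting next to the document
find ~/Documents -name '.~lock.*#' -delete   # or sweep away all stale LibreOffice locks under a folder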
View 9 Replies
View Related