Ubuntu :: OpenOffice Spreadsheet Slowdown When Copying Multiple Cells
Dec 13, 2010
Does anyone else find that OpenOffice 3.2's spreadsheet app becomes unresponsive when copying and pasting multi-cell selections? I've had this problem ever since I started using Ubuntu 10.04, which comes with OO.o 3.2. The problem becomes noticeable whenever I make a selection more than about 8-10 columns across. Once the flashing border appears around the selected cells, the spreadsheet app becomes very sluggish. The mouse still moves normally, but after a click it takes several seconds for the highlight to move to another cell, for the cells to be deselected, or for them to be copied. These slowdowns happen both in old spreadsheets and in freshly created ones. They didn't happen with OpenOffice 3.1 or earlier. I haven't tried it under Windows, so I'm not sure whether this is specific to OpenOffice 3.2 on Linux. Other apps don't seem to be affected.
I have a spreadsheet I have been using for years now, and as I go I hide some of the cells. Today I unhid those cells and they came back, but very thin and hard to read. I have tried highlighting the "thin" cells and applying various formats, but I cannot seem to get them back to their normal size.
I'm having some trouble with hyperlinks in the OpenOffice 3.2 spreadsheet, though it may be a beginner's problem.
I wrote a script to execute a certain task (in my case, to open a DNA alignment file with BioEdit) and then, in OpenOffice, I assigned a hyperlink button to the path needed to execute the script, which is in my /home ("./script_file"). When I do this the first time, clicking the hyperlink button runs the script perfectly. However, whenever I close OpenOffice and re-open it afterwards, the hyperlink doesn't work anymore. I've checked, and this happens because the original URL, which was "./script_file", changes to /home/user/etc... or /temp/etc... and, obviously, the file isn't found. If I manually correct the URL, the hyperlink is functional again. How do I fix the URL of a hyperlink so that I won't have to correct it every time I close and re-open the program?
I know this is more of an OpenOffice question, but I felt I would find some solution here. At my work I use a lot of PDF files which are created from OpenOffice Calc (spreadsheet) documents. I want to convert those PDFs back to Calc files for editing. Currently I am using a Windows machine with the able2extract PDF converter to convert PDFs to .xls files, but that is not an ideal solution for the Linux and OpenOffice users here.
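One hedged possibility, assuming the PDFs contain real text rather than scanned images: poppler-utils' pdftotext can preserve the column layout, and the result can then be pulled into Calc with the import dialog. A minimal sketch (file names are hypothetical):
Code:
# extract text while preserving the tabular layout (requires poppler-utils)
pdftotext -layout report.pdf report.txt
# open report.txt in Calc's Text Import dialog and use fixed-width
# (or space-separated) fields to rebuild the columns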
I have 60+ directories, each containing multiple .doc files. I need to move them to a single directory and keep their file names intact. I don't think cp will do that without listing all the file names. I was thinking of something like: cp -r /dir/*.doc /newdir . Or should I use a combo like find -type *.doc|cp /newdir?
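A minimal sketch of the find-based approach, assuming the .doc files live somewhere under /dir and should all land flat in /newdir (both paths stand-ins); note that files sharing the same name would overwrite each other:
Code:
# find every .doc file in the tree and copy each one into /newdir
# (cp -t is GNU coreutils: target directory first, so find can batch the names)
find /dir -type f -name '*.doc' -exec cp -t /newdir {} +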
Regularly I find myself cloning a machine using rsync. I find it understandable, reliable and fast, faster than dd, and I don't have to worry about different partition sizes etc. However, usually I partition my hard disk in a number of partitions:
Code:
/
/home
/usr
/var
When I start with a new, empty machine, I boot from a USB stick or live CD, and my new, empty hard disk becomes /dev/sdb. After creating the 4 partitions I have /dev/sdb1, /dev/sdb2... etc. My root directory is on the disk I used for booting, usually /dev/sda. So, in order to access my newly created partitions, I mount them under /mnt on my root filesystem:
Code:
mounted now     later
/mnt/sdb1       /
/mnt/sdb2       /home
/mnt/sdb3       /usr
/mnt/sdb4       /var
In other words, I mount /dev/sdb1 on /mnt/sdb1 now, while after copying /dev/sdb1 will become my root directory, /dev/sdb2 my /home directory, etc. When I start the rsync process to copy the image from a remote machine, I have to copy all 4 partitions separately: first the root directory, excluding /home, /usr and /var, then /home, then /usr, then /var, each with its own rsync command and excludes.
That is a lot of typing and waiting. Sometimes I have a different partition scheme, so it is not really feasible to write a script to use every time. Now the question: is there a smarter way of mounting the newly formatted disk (/dev/sdb1, /dev/sdb2... etc.) in my root tree so I can perform the rsync copy in a single pass, without all the excludes, while ensuring that the correct source partitions end up on the correct destination partitions?
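One approach that seems to fit, sketched under the assumption that the four new partitions are /dev/sdb1 through /dev/sdb4 and the source is reachable as "remote" (both placeholders): mount the new partitions nested, the way they will be mounted later, instead of flat under /mnt. A single rsync of the whole remote tree then lands each directory on the correct partition automatically, because the destination mount points do the routing:
Code:
mkdir -p /mnt/target
mount /dev/sdb1 /mnt/target                    # future /
mkdir -p /mnt/target/home /mnt/target/usr /mnt/target/var
mount /dev/sdb2 /mnt/target/home               # future /home
mount /dev/sdb3 /mnt/target/usr                # future /usr
mount /dev/sdb4 /mnt/target/var                # future /var
# one pass copies everything; exclude pseudo-filesystems if the source is live
rsync -aHAX --numeric-ids \
    --exclude=/proc --exclude=/sys --exclude=/dev \
    remote:/ /mnt/target/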
I am working with a DM355 target board. Here we record the video coming from IP cameras. Now I have to write a C program to copy the recorded AVI files, with date and time, to a NAS server using scp. I have written a script that copies a single file to the NAS server.
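Until the C version exists, the single-file script can be generalized in shell. A sketch, with the recording directory, user name, and NAS address all stand-in assumptions; it stamps each upload with the date and time and removes the local copy only after a successful transfer:
Code:
#!/bin/sh
SRC=/media/recordings                  # hypothetical recording directory
DEST=nas@192.168.1.100:/volume1/video  # hypothetical NAS account and path
for f in "$SRC"/*.avi; do
    [ -e "$f" ] || continue            # skip if no .avi files are present
    stamp=$(date +%Y%m%d-%H%M%S)
    # -p preserves timestamps; delete the original only if scp succeeded
    scp -p "$f" "$DEST/$stamp-$(basename "$f")" && rm -f "$f"
done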
Suppose I have a tree structure like this:
Code:
/home/mahmood/sim/a/b/file1.cpp
/home/mahmood/sim/a/b/file2.h
/home/mahmood/sim/a/c/file3.txt
/home/mahmood/sim/d/file4.txt
How can I copy all of them to /home/mahmood/sim, so that when I run "ls" in /home/mahmood/sim I see all the files: file1.cpp file2.h file3.txt file4.txt
Can 'cp' search for all the files and copy them into another folder?
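cp itself does not recurse and flatten, but find can feed it the files. A minimal sketch for the tree above; -mindepth 2 skips files already sitting directly in sim, and name clashes would overwrite silently:
Code:
# copy every regular file below sim/ flat into sim/ itself
find /home/mahmood/sim -mindepth 2 -type f \
    -exec cp -t /home/mahmood/sim {} +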
I have a list of approximately 50 words that I'd like to search documents for and delete. I was wondering whether there is some kind of automated process for removing multiple words, rather than manually putting each word into 'find and replace'.
On that note, I guess I could write the macro in Python if there isn't anything out there that does this. However, I read that OpenOffice only works with Python 2.3.5 or something of that nature, and I have already installed 3.1. Is that still going to be an issue?
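For plain-text files there is a simple shell route, sketched below with the word list assumed to be in words.txt, one word per line (a hypothetical file). It would not apply directly to .odt documents, which are zipped XML, so a macro would still be needed for those:
Code:
# delete every whole-word occurrence of each listed word (GNU sed)
# assumes the words contain no regex metacharacters
while read -r word; do
    sed -i "s/\b$word\b//g" document.txt
done < words.txt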
Description: I am a newly appointed system engineer taking care of Linux servers. We have a new set of data coming in which needs the configuration below. How do I write a script with a function to do this?
For files ending in ".txt" in sm, copy each of the files to the folders sm1 and sm2 (log every copy).
If successful: remove the original and record it in the log file.
If not successful (copying one particular file to all the folders failed): retain the file and retry, record it in the log file, and mail the admin with that particular file name.
I have already tried a bit:
Code:
cd /export/home/
for dir in sm1 sm2; do
    cp -p sm/*.txt $dir/
done
Is my start right? How do I do the rest?
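The start is right. A sketch of one way to fill in the rest, with the log path and admin address as placeholders; it removes the original only when the copy succeeded for every folder, retries once on failure, and mails the admin otherwise:
Code:
#!/bin/bash
SRC=/export/home/sm
LOG=/export/home/copy.log
ADMIN=admin@example.com                 # placeholder address

copy_file() {   # copy one file to all folders, logging each attempt
    local f=$1 ok=1 dir
    for dir in /export/home/sm1 /export/home/sm2; do
        if cp -p "$f" "$dir/"; then
            echo "$(date): copied $f -> $dir" >> "$LOG"
        else
            echo "$(date): FAILED $f -> $dir" >> "$LOG"
            ok=0
        fi
    done
    return $((1 - ok))                  # 0 = all copies succeeded
}

for f in "$SRC"/*.txt; do
    [ -e "$f" ] || continue
    if copy_file "$f"; then
        rm -f "$f"                      # successful: remove the original
    else                                # failed: retain, retry once, then alert
        copy_file "$f" ||
            echo "copy failed for $f" |
                mail -s "copy failure: $(basename "$f")" "$ADMIN"
    fi
done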
Maybe this isn't the right place, BUT: how can I create an OO Writer template with about 20 text input areas? I can make one, and I thought I could use copy and paste to duplicate it in the other places, but it didn't work. I hate to spend all that time creating the other 19 identical boxes.
I have a MacBook Pro with Snow Leopard 10.6.6 (it's Unix-based, don't look at me like that). My question is about Linux software, so that's why I'm asking here (on the Mac forums nobody answered me). I want to know how I can open documents in OpenOffice 3.2.1 in tabs instead of multiple windows (like Firefox)?
This is supposed to be possible, right, and better than naming each single cell? How do you do it? Also, maybe my approach is wrong, as it's going to lead to a very big nested IF statement. There is a list of numbers which are price ceilings, and another list containing the fees applied to them. I want a formula that checks the item price against the ceilings and, if it's less than or equal to a ceiling, uses the fee in the adjacent row (which is the other data range). The only way I know how to do this is a big nested IF statement, but does anyone know better?
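A hedged Calc sketch, assuming the ceilings are in D2:D10 sorted in descending order, the fees sit alongside in E2:E10, and the item price is in A2 (all hypothetical ranges): MATCH with type -1 returns the position of the smallest ceiling that is greater than or equal to the price, and INDEX pulls the adjacent fee, replacing the nested IFs entirely.
Code:
=INDEX($E$2:$E$10; MATCH(A2; $D$2:$D$10; -1))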
I recently caved and upgraded to Ubuntu 11.04, despite my hatred of the Unity interface. Anyway, I seem to be having completely random, massive lag spikes! It seems to happen when I do certain things, such as spend too much time on videos or stream HD videos. Playing games such as Minecraft and using Skype give me the same issue. I don't know what to do! I haven't found anything similar to this in my Google searches.
I am having a bad slowdown issue when I enable Visual Effects with the ATI proprietary drivers that come with the installation. Everything runs very slowly: maximizing and minimizing take about 3-5 seconds per click, and the effects get all choppy and don't look right like they did on my X1950 video card. I have tried downloading the current drivers from the ATI site, but they don't seem to be initialising correctly after I perform the install.
I also know that the graphics card is not the culprit, as it works perfectly fine on my Windows 7 64-bit install. There is no reason my video card cannot handle the Visual Effects; a 1 GB 4890 should have more than enough horsepower.
My machine is:
AMD Phenom II 965 Black Edition
8 GB G.Skill 1600 DDR3 memory
1 TB Western Digital Caviar hard drive
1 GB ATI 4890 graphics card
Anyone seeing a dramatic slowdown and "page not found" errors after the latest updates?
It even happens on LAN activity. Peak speed is excellent, but the ability to resolve both local and WAN addresses is very spotty.
The machine is a newer clone with an AMD Phenom II dual core and 2 GB of RAM, running 10.04 32-bit with GNOME. The network is a Gigabyte motherboard's 1000-T Ethernet, hardwired to a D-Link router into a cable modem.
XP and Win7 aren't affected.
It was ripping this morning, but right now it is pretty much crippled.
I have Ubuntu 10.04. I installed the KDE libraries to use Okular. Then Ubuntu took almost 30s more to boot (20s of black screen between login and desktop). I uninstalled KDE, booted on another kernel, rebooted, and the boot time was OK again (25s).
I reinstalled KDE: 55s to boot again. I rebooted a couple of times, always the same. Booting on another kernel and then again on mine seems to fix the slow boot times.
The boot time was then 35s (I thought that was OK). After launching Okular, the boot time was 55s again...
I removed KDE and Okular, and the boot time has been normal again for the last 10 reboots.
Does anyone else have problems when loading an animated GIF in Firefox? After upgrading to Natty, Firefox eats 100% of the CPU each time an animated GIF is displayed in the tab title, or when the favicon is an animation. I have the fglrx driver, but OpenGL animation doesn't slow the PC down as much as favicons do in Firefox.
So I found one issue: when the torrent upload speed reaches its peak (160-200 KB/s), a huge read slowdown happens. The server becomes almost unreachable... It still accepts connections via PuTTY, but logging in takes a long time.
I checked top during those lags (Deluge, Transmission): 10-15% CPU usage.
So I think the problem is in LVM and not in the CPU.
How can I find the weak spot in the system to avoid those lags? Because while a torrent is seeding, it's impossible to watch movies over the network from that server.
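A quick, hedged way to check that theory from the shell while a torrent is seeding (iostat needs the sysstat package installed):
Code:
iostat -x 2   # watch %util and await on the LVM device during the lag
vmstat 2      # a high "wa" (I/O wait) column with idle CPU points at the disk, not the CPU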
I have noticed that when iptables is active on my router, I get a noticeable slowdown when browsing. This is untraceable through traceroute:
Code: traceroute -n www.a-site-here.com
finishes immediately, yet at the same time, loading the site in Firefox gives me timeouts.
I use Firewall Builder to configure iptables. I removed rules one by one while trying to find the one causing the problem, but even disabling all rules does not solve it. So, leaving only an "accept-all" rule in Firewall Builder (so I don't lock myself out), I still get slowdowns. In this case
Code: iptables --list-rules
gives:
Code:
-P INPUT DROP
-P FORWARD DROP
-P OUTPUT DROP
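Worth noting from that listing: all three chain policies are DROP, so any traffic the accept-all rule fails to match, DNS replies being the classic case, is silently dropped, which looks exactly like a browsing slowdown. A minimal sketch of rules (not the Firewall Builder setup, just a generic illustration) that keep loopback, established connections, and DNS flowing:
Code:
iptables -A INPUT  -i lo -j ACCEPT
iptables -A OUTPUT -o lo -j ACCEPT
iptables -A INPUT  -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A OUTPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A OUTPUT -p udp --dport 53 -j ACCEPT   # outgoing DNS queries
iptables -A OUTPUT -p tcp --dport 53 -j ACCEPT   # TCP fallback for large replies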
I am seeing a 0.3% clock slowdown in kernel 2.6.18-238.9.1. It is giving ntpd conniptions. The problem does not occur in the previous 2.6.18-238.5.1 kernel.
Code:
# 2.6.18-238.9.1
[root@blue ~]# uname -a
Linux blue 2.6.18-238.9.1.el5 #1 SMP Tue Apr 12 18:10:13 EDT 2011 x86_64 x86_64 x86_64 GNU/Linux
I have a home server based on Ubuntu Linux 10.04.2.
Hardware:
Motherboard - Asus AT4NM10-I (Intel NM10, PCI)
CPU - integrated Intel Atom D410
RAM - 2 GB
LAN - D-Link DGE-528T Gigabit adapter
Provider gives 8/2 Mbit ADSL connection.
So I have tried Deluge and Transmission, and both the integrated and an external network card, with no luck.
When a torrent file is being seeded at top speed, the network starts freezing: the server becomes almost unreachable, video freezes when watching it over the LAN from the server, etc.
When I pause the upload, everything starts working OK!
The network is based on a gigabit switch and copper UTP cables.
If I initiate a file copy of more than a couple of GiB, the PC goes into a dramatic slowdown. Even selecting a different subdirectory through Nautilus can take 30+ seconds. Now, whilst I appreciate that file copying puts a load on the bus, DMA and, to a certain extent, the CPU, it seems unconscionable that it makes the PC effectively unusable until the copy has completed.

Is there any way to (practically) lower the priority, or similar, of the copy process so that one can continue to use the platform during large file copies? Right now, I end up using a laptop adjacent to the PC whenever I have to copy a large file. This, for a mainstream operating system, is, frankly, ludicrous.

I'm aware that I could run Nautilus 'nicely', but I don't want to make changes which would compromise other aspects of the system. It would also be pleasant, for a change, not to have to read a couple of telephone directories of technical documentation in order to resolve the problem myself. This must be a general problem and, in my view, something which seriously compromises the usefulness of Ubuntu given the sizes of contemporary drives and files.
The i7 I'm using at the moment has 8 cores, none of which go over 15% usage while the copy is progressing, despite the OS being effectively frozen for long periods.
Environment: Intel Core i7-2600 @ 3.40GHz; 16 GB RAM; Natty (11.04) fully updated as at 29th July 2011; kernel 2.6.38-10-generic; GNOME 2.32.1; primary drive has 563.9 GiB of free space; running in Ubuntu 'Classic' (no effects) mode.
Code:
Disk /dev/sdc: 2000.4 GB, 2000398934016 bytes
255 heads, 63 sectors/track, 243201 cylinders
Units = cylinders of 16065 * 512 = 8225280 bytes
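One practical lever, sketched with hypothetical paths: run the copy in the idle I/O scheduling class so interactive processes keep disk priority (the idle class requires the CFQ scheduler, the default on these kernels):
Code:
# copy at idle I/O priority and minimum CPU priority
ionice -c3 nice -n 19 cp /path/to/bigfile /media/backup/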
This is not the common problem from the stickied thread. I'm able to connect to my wireless network perfectly. I switch between Windows for gaming and Linux for desktop use, coding, and hosting, so I can tell it has something to do with Ubuntu 10.10. When I'm connected to my wireless network, at random times my internet begins to slow down, then I get disconnected from the internet, but the taskbar still says I'm connected. This problem only happens in Ubuntu and not Windows, for some strange reason.
The applications that are usually open do not cause it, because it still happens even when I have few applications open (Google Chrome, gedit). When I'm on TeamSpeak 3, I notice my ping rises into the 1000s at random, and I get disconnected. Once again, that only happens on Ubuntu.
I'm running 32-bit Debian Squeeze (2.6.32-5-686) on an old IBM ThinkCentre piece-meal system that runs great otherwise. I have a LITEON iHAS424 burner attached. As soon as I insert any disc (CD, DVD, blank, commercial, etc.) the system either locks up entirely or crawls to a near stop. The drive is about 3-4 months old; I experienced the problem a couple of times in the past, but now it happens just about every time I try to use it. The CD/DVD drive seems to work fine on other systems, but I haven't had the luxury of leaving it in one for any length of time to be 100% sure.
I have been able to pull out of the problem in the past by opening up the case and disconnecting/reconnecting both ends of the drive cable, but this doesn't always work and never "feels right" as a fix. Here is what I think are the significant portions of my messages log file:
Quote:
Mar 1 20:03:46 bugs kernel: [ 1.415289] ata_piix 0000:00:1f.2: PCI INT A -> GSI 18 (level, low) -> IRQ 18
Mar 1 20:03:46 bugs kernel: [ 1.415299] ata_piix 0000:00:1f.2: MAP [ P0 P1 IDE IDE ]
I recently added an external hard drive through an IEEE 1394 interface. I'm finding that during large file transfers the system slows to a crawl. It's still running: routing, for example, seems fine. But running applications are pretty much unusable: Apache is unusably slow, SSH login is very slow, etc. Currently I'm using the IEEE 1394 drivers from Axel's ATrpms, but I'm pretty sure I saw this with the default kernel IEEE 1394 drivers too.