When running Linux Mint a year or so ago, I encrypted a load of sensitive documents. This was done using a PGP tool for Nautilus (something like right-clicking the file and choosing Encrypt). I made the keys and obtained an encrypted file. Everyone knows what's coming next: since then, I have lost the key I retained for opening this file. I understand that I have no chance at all of getting the data out without the secret key. My question is, where would the secret key have been put? It was a GNOME program which did the work, and it also guided me through making the keys. I cannot find this program in GNOME under Ubuntu 9.10.
Would I be silly in thinking that the same program with exactly the same parameters would make the same secret key file? I know the passphrase exactly; I have just lost the secret key.
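For reference, the GNOME encryption plugin (Seahorse) is a front end to GnuPG, which keeps per-user keyrings under ~/.gnupg; assuming a stock setup of that era, the secret key would be in secring.gpg there. A sketch of how to check (filenames apply to GnuPG 1.x/2.0):

```shell
# GnuPG keeps its keyrings in the user's home directory:
ls -l ~/.gnupg/
# pubring.gpg = public keys, secring.gpg = secret keys (GnuPG 1.x/2.0)

# List any secret keys still present on this machine:
gpg --list-secret-keys

# If the key is found, the passphrase alone is enough to decrypt:
gpg --decrypt document.gpg > document
```

As to the second question: no — key generation draws on random data, so running the same wizard with identical parameters and passphrase produces a completely different key pair. Only a backup of secring.gpg (or an exported copy of the key) can recover the secret key.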
For no known reason, a few subdirectories have been added to my GNOME Locations menu, both in the traditional panel menu and in the new Computer menu. The Computer menu has an option to remove items, but it does not work. Is there any way to remove those subdirectories from the menu? And/or where can I find that list so I can edit it manually? (The menu editor alacarte does not edit Locations or System.) (SUSE 11.1)
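For what it's worth, in GNOME 2 the user-added entries in the Places/Locations menu are usually plain bookmarks stored one URI per line in ~/.gtk-bookmarks; assuming SUSE 11.1 follows that convention, they can be removed by editing that file directly:

```shell
# Each bookmark is one URI per line; just delete the unwanted lines:
cat ~/.gtk-bookmarks
# e.g. file:///home/user/some/folder

# Remove an entry with any text editor, or (path is an example):
sed -i '\#file:///home/user/unwanted#d' ~/.gtk-bookmarks
```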
I have built GNOME 2.28.1 from scratch (source); along with GNOME I compiled GDM and many other programs. GDM starts fine, and I can enter my user name at the login prompt and log in without problems. Once in the desktop, however, I can't access the Computer or Trash locations, whether by clicking the icons or through Nautilus; the same error appears each time: Error: Operation not supported.
I tried debugging by hand, running the following command: gvfs-ls trash:// which tells me exactly the same thing: Error: Operation not supported.
On my Ubuntu system, lsof | grep gvfs returns a lot of gvfsd* daemons, like gvfsd-computer and gvfsd-trash. On my scratch-built system, lsof | grep gvfs doesn't return anything. The prefix I used when building GNOME was /opt/gnome, with sysconfdir at /etc/gnome/2.28.1. lsof shows me that D-Bus is running; gdm actually starts GNOME with dbus-launch --exit-with-session gnome-session.
And inside my /etc/dbus-1/session-local.conf I have:
Code:
<!DOCTYPE busconfig PUBLIC "-//freedesktop//DTD D-BUS Bus Configuration 1.0//EN" "[URL]">
<busconfig>
  <!-- Search for .service files in /usr/local -->
  <servicedir>/opt/gnome/share/dbus-1/services</servicedir>
</busconfig>
Well, I have tried many tweaks and still can't display gvfs locations. What I find very odd is that lsof doesn't return any running gvfsd* daemon.
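One thing worth checking: gvfs backends are not long-running daemons you start yourself; they are spawned on demand through D-Bus service activation, so if the .service files are missing from the bus's search path, every gvfs URI fails with "Operation not supported". A sketch, assuming the /opt/gnome prefix from above (the libexec path is a guess for this build):

```shell
# The session bus must be able to find gvfs' activation files:
ls /opt/gnome/share/dbus-1/services/
# expect gvfs-daemon.service, gvfs-metadata.service, ...

# Try starting the master daemon by hand inside the session to surface
# any error messages, then retry the failing URI:
/opt/gnome/libexec/gvfsd &
gvfs-ls trash://
```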
Is there any method to add a location to the list in the GNOME weather applet/clock? [EDIT] To be more precise: my location is not there. How do I extend the list with my location so I can have the weather displayed for my city?
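Some background that may help: the applet's city list comes from libgweather's Locations.xml database, so "adding a location" means adding an entry there. A sketch, assuming the common install path (it may vary by distro):

```shell
# First check whether the city is simply named differently in the database:
grep -i "mycity" /usr/share/libgweather/Locations.xml

# If it is genuinely absent, a new <location> entry with the city name,
# coordinates and the nearest METAR station code can be added to that
# file, after which the applet will offer it in its list.
```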
I have two storage drives that I will be sharing by FTP. One is an internal 1 TB ext4 HDD and the other is an external USB 1 TB NTFS HDD. Both drives get mounted under /media, and I am trying to set an additional mount point for each. For the internal HDD everything works perfectly: I simply went to /etc/fstab and copied the line related to it. Now I have:
Code:
/dev/sdb5 /home/eugene/.MOUNT/sdb5 ext4 defaults 0 0
/dev/sdb5 /media/sdb5 ext4 defaults 0 0
which does exactly what I need.
I tried doing the same for the USB drive, which produces unexpected results. The lines are
Code:
/dev/sdc1 /home/eugene/.MOUNT/sdc1 ntfs defaults 0 0
/dev/sdc1 /media/sdc1 ntfs defaults 0 0
This has the following results:
- in /media/Y (Y is the label of this HDD) I have the HDD and can access all its contents, which is good
- in /home/eugene/.MOUNT/sdc1 I don't have anything, which is bad
- in /media/sdc1 I have only one folder from this HDD, and this folder is empty (on the HDD it is not empty), which is somehow weird.
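Listing the same device on two fstab lines is unreliable, especially for NTFS. A more robust pattern is to mount the filesystem once and expose it at the second path with a bind mount (and to use the ntfs-3g driver for read-write NTFS). A sketch using the paths from the post:

```
/dev/sdc1     /media/sdc1               ntfs-3g  defaults  0 0
/media/sdc1   /home/eugene/.MOUNT/sdc1  none     bind      0 0
```

The bind line makes the already-mounted tree at /media/sdc1 appear at the second location as well, so both paths always show identical contents.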
So I've got a home server hosting a website for my restaurant, but I'd like to get another server up to get some redundancy going.
I have another machine I'd like to set up at another location to take over serving requests for the website whenever my home server goes down. I've got my domain through [URL], but the domain is hosted through [URL] for their dynamic DNS service (because I'm not using a static IP).
So I'm guessing having another server set up is just a matter of setting up DNS records; however, I don't know where to begin with that. Any words of wisdom out there?
I am trying to view a share I have on a Windows computer, but Nautilus claims it can't handle that. I tried it in Dolphin and found the file, but when I tried to open it, gedit claimed it cannot handle smb locations either!
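A common workaround when individual applications can't open smb:// URIs is to mount the share at a real filesystem path with mount.cifs, so every program sees ordinary files. Host, share, user and file names below are placeholders:

```shell
sudo mkdir -p /mnt/winshare
sudo mount -t cifs //winbox/share /mnt/winshare -o username=me,iocharset=utf8
gedit /mnt/winshare/somefile.txt
```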
Last night I used Nautilus to FTP a couple of files to my web-site, it worked fine and I created a bookmark for it. Today, with no changes to my system, it refuses to do it showing the error message in the title of the post. I thought that maybe my bookmark was corrupted so I tried File>Connect to Server using FTP with login. Just after entering my password I get the error message. I'm using Maverick. My Internet connection is working well. What the heck is going on?
My company has 2 locations. I have a server running BIND, Apache, and MySQL. I'm setting up a second server just in case the primary goes down. I'm sure it's bad form to do it the way I'm doing it, but how might I go about configuring my backup? Should I set it up as a secondary DNS server?
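For the DNS part at least there is a standard answer: configure the zone as a slave (secondary) on the backup box so it transfers automatically from the primary. A minimal named.conf fragment, with placeholder zone name and address:

```
// on the backup server
zone "example.com" {
    type slave;
    file "slaves/example.com.zone";
    masters { 192.0.2.10; };   // address of the primary
};
```

Then list both servers in the zone's NS records. For Apache/MySQL failover you would additionally need replication plus either a low-TTL record you repoint or a heartbeat/VIP setup.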
Is there a way to set Firefox to place downloaded files in different folders based on the file type? E.g., in my Downloads folder, place all .doc files in a sub-folder called ".doc", all .jpgs in a sub-folder called ".jpg". I'd assume there's probably a rule or a script that can be used to accomplish this, but being a graduate student, I don't have a lot of free time to poke around and figure it out myself.
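Firefox itself has no per-type download folders, but a small script run from cron (or a file watcher) after the fact gives the same result. A sketch; the Downloads path is an assumption:

```shell
#!/bin/sh
# Move every regular file in a downloads folder into a subfolder named
# after its extension, e.g. report.doc -> doc/report.doc.
sort_downloads() {
    dir=$1
    for f in "$dir"/*; do
        [ -f "$f" ] || continue           # skip subfolders
        name=${f##*/}
        ext=${name##*.}
        [ "$ext" = "$name" ] && continue  # skip files with no extension
        mkdir -p "$dir/$ext"
        mv "$f" "$dir/$ext/"
    done
}

sort_downloads "$HOME/Downloads"
```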
We have approximately 100 retail locations that will have split VPN tunneling. Intranet traffic will flow over the VPN to the corporate headquarters, VoIP traffic will tunnel to a regional hub, and internet-bound traffic will go over the local ISP. The retail locations are small, with 1-8 users and no enterprise-grade equipment (servers, etc.). This setup will in effect render our current content-filtering solution useless.
The locations will be equipped with Cisco ASA 5505 firewalls. The original plan was to use a Websense server and the URL-filtering feature to act as a content filter. I just found out that Websense pricing was not included in the budget, which will be a show stopper. There may also be some performance issues with this method. Putting a proxy server at each location is not really an option: we do not have the resources to place a server at each location, plus the users could simply unplug an inline device or go around it. There is minimal supervision at most of these locations.
Ideally, I would like to find a way to use something like DansGuardian with an LDAP interface and the URL-filtering feature of the ASA firewalls. I found a program called n2h2p, but I can find no documentation for it, and it is also 2 years old with no updates. I also need to be able to centrally manage this, as trying to keep up with 100 different configurations for 400 users would be virtually impossible in the amount of time I will have available.
Is there a way to sync Tomboy notes to multiple locations? I would like to be able to sync them to my UbuntuOne account and at the same time to my local NFS server, but from the looks of it Tomboy only lets you choose one location for syncing. Maybe there's a workaround for this or something?
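Tomboy does indeed support only a single sync target, but the notes themselves are plain files, so a second location can be covered with an ordinary copy job. A sketch; the note directory path varies between Tomboy versions (older ones used ~/.tomboy):

```shell
# Let Tomboy sync to Ubuntu One as usual, and mirror the raw notes to
# the NFS server from cron, e.g. nightly:
rsync -a --delete ~/.local/share/tomboy/ /mnt/nfs/tomboy-backup/
```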
I have an assignment as follows: I have to install 2 versions of FFmpeg (0.6.1 and Alexander Strange's version) in non-default locations with all required dependencies, check the dependencies for each of these, and make sure to install the required dependencies in the local folder of the app rather than at the system root.
The flags to set for the above must include --enable-mp3lame --enable-gpl --disable-vhook --disable-ffplay --disable-ffserver --enable-a52 --enable-xvid --enable-faac --enable-faad --enable-amr_nb --enable-pthreads --enable-x264
Now, how do I install them, and how do I run them simultaneously? Do let me know if any other info is required.
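The usual pattern for "non-default location" installs is a per-application --prefix, which also lets both builds coexist and run at the same time. A sketch; the prefix path is an example, and note that some of the flags listed above have different names in newer FFmpeg releases (e.g. --enable-libmp3lame rather than --enable-mp3lame), so check ./configure --help for the exact spellings:

```shell
cd ffmpeg-0.6.1
./configure --prefix="$HOME/apps/ffmpeg-0.6.1" --enable-gpl \
            --enable-pthreads            # plus the remaining flags
make
make install

# Dependencies built the same way (their own local prefixes) can be
# found at configure time via, e.g.:
#   PKG_CONFIG_PATH=$HOME/apps/somedep/lib/pkgconfig ./configure ...

# Each copy is then run by its full path, so both can run at once:
"$HOME/apps/ffmpeg-0.6.1/bin/ffmpeg" -i input.avi output.mp4
```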
I am running Ubuntu 10.10 64. I have a RAID array consisting of two 1 TB HDD's, controlled by my on-board RAID controller. I have a dual-boot of Ubuntu 10.10 and Windows. The RAID array is mapped in /dev/mapper, and here is the output of sudo dmraid -ay
Code:
RAID set "pdc_dedfhcfdee" already active
RAID set "pdc_dedfhcfdee1" already active
RAID set "pdc_dedfhcfdee2" already active
RAID set "pdc_dedfhcfdee3" already active
I've switched my non-maximized window button locations back over to the right with gconf-editor, but I'd like to change the global menu buttons over to the right as well. I've had 20-odd years working that way. I can't seem to find an option in gconf-editor, and if Google has the answer I'm not using the correct search terms.
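For the record, the titlebar buttons are controlled by a single GConf key; listing the buttons after the colon places them on the right. Whether the maximized-window (global menu) buttons follow this key depends on the panel/shell in use, so treat this as a sketch:

```shell
gconftool-2 --set /apps/metacity/general/button_layout \
    --type string "menu:minimize,maximize,close"
```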
What can you do when your linux system "can't find" dynamically linked libraries that are indeed installed in their correct locations? Case in point, I'm trying to run a program called 'ucanvcam':
oliver@human ~/installed/ucanvcam-0.1.6/bin $ ./ucanvcam
./ucanvcam: error while loading shared libraries: libgd.so.2: cannot open shared object file: No such file or directory
oliver@human ~/installed/ucanvcam-0.1.6/bin $ locate libgd.so.2
/usr/lib64/libgd.so.2.0.0
/usr/lib64/libgd.so.2
oliver@human ~/installed/ucanvcam-0.1.6/bin $ ldd ./ucanvcam
linux-gate.so.1 => (0xf7706000)
[...]
libgd.so.2 => not found
[...]
librt.so.1 => /lib32/librt.so.1 (0xf6b1e000)
How can I tell it to look for libgd.so.2 in /usr/lib64? And more importantly, why isn't it looking there, and where is it looking?
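Part of the answer is visible in the ldd output: the other libraries resolve to /lib32/..., so ucanvcam is a 32-bit binary, and the 64-bit /usr/lib64/libgd.so.2 can never satisfy it - a 32-bit libgd is needed. The linker's search order is roughly: the binary's rpath, LD_LIBRARY_PATH, the ld.so cache (built from /etc/ld.so.conf*), then the default directories. A sketch of the two usual fixes, assuming a 32-bit libgd gets installed under /usr/lib32:

```shell
# One-off: add a directory to the search path for this run only:
LD_LIBRARY_PATH=/usr/lib32 ./ucanvcam

# Permanent: register the directory with the dynamic linker cache:
echo /usr/lib32 | sudo tee /etc/ld.so.conf.d/lib32.conf
sudo ldconfig
```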
I am writing a C program in Linux, and in the program I am mounting 2 USB devices attached to 2 specific ports on the computer (e.g. I have to mount the device attached to the left port on /mnt/left and the one in the right port on /mnt/right), and the attachment order of those devices may differ (e.g. the left-port device may or may not be attached before the right one). What should I do in this case?
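One way to make this deterministic is to let udev name the devices by physical port before the C program runs, so the program always mounts stable names. A sketch of a rules file - the KERNELS port paths are examples and must be read off your own machine with `udevadm info -a -n /dev/sdX`:

```
# /etc/udev/rules.d/99-usb-ports.rules
SUBSYSTEM=="block", KERNELS=="1-1.1", ENV{ID_FS_USAGE}=="filesystem", SYMLINK+="usb_left"
SUBSYSTEM=="block", KERNELS=="1-1.2", ENV{ID_FS_USAGE}=="filesystem", SYMLINK+="usb_right"
```

The program can then mount(2) /dev/usb_left on /mnt/left and /dev/usb_right on /mnt/right regardless of plug-in order.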
I am sitting here at my computer at 2:30 am on a Saturday morning with no idea what the hell I'm doing, but here's my goal: I have a KML file for Google Earth. This file is generated with the GPS locations of certain sites.
I have a text file with extra information about certain sites. What I'm trying to figure out is how, using Perl, to open the text file, grab the first location on line one, and search for it in the KML file. When that line is found, I want to edit the Description field in the KML file, add something to the description, and change the placemark from yellow to blue.
Can anyone give me advice on the best way to handle this and where I can start? I can understand if nobody wants to give me the answer directly, but I would like to be pointed in the right direction.
I've got an external HDD from Iomega that I've been using for over a year and lately it's been doing this thing where it suddenly changes from /dev/sdg to /dev/sdh while the partition on it is still mounted, seemingly at random. I'm often (not always) able to look at the contents of the partition, but it gives me I/O errors followed by a list of files and directories.
I am using slackware64 13.0 running kernel 2.6.29.6. Below I've provided the relevant lines from dmesg (edited out irrelevant lines to keep post under max character limit.) I'm beginning to suspect NFS has something to do with this, but I can't even begin to imagine how.
When this happens I'm able to umount -l the partition and remount it using the new /dev/sdh1 but eventually the same thing just happens and the whole drive switches back to /dev/sdg. This problem persists between reboots. It's also happened with an ext3 filesystem, in fact I switched to ext2 because it kept telling me "Aborting journal" and I was afraid I would get a corrupted journal and perhaps a destroyed file system.
About 8 months ago I was doing some work at my brother's place, and while down on the floor fiddling with cables I yanked this whole drive off the desk and it smashed into the floor (probably while spinning; I'm pretty sure this model is disk-based). I've been expecting to see some strange behavior since that happened, but it took a while, and I can't be certain this is related to that incident, although I'm rather convinced it is.
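Two notes that may help: referring to the filesystem by UUID instead of device name at least makes remounting independent of the sdg/sdh flip, and the pattern of I/O errors plus "Aborting journal" followed by the disk re-registering under a new name is what a drive dropping off the bus (failing hardware or cabling) looks like, so a backup soon would be prudent. A UUID-based sketch with placeholder values:

```shell
# Find the filesystem's stable identifier:
blkid /dev/sdg1
# /dev/sdg1: UUID="0a1b2c3d-example" TYPE="ext2"

# Then mount by UUID in /etc/fstab (mount point is an example):
# UUID=0a1b2c3d-example  /mnt/iomega  ext2  defaults  0 2
```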
I have 2 Linux servers in different locations and need to set up an IP tunnel between them. I followed these steps on both servers:
Server1:
ip tunnel add tun0 mode ipip local IP_Server1 remote IP_Server2 dev ethX
ip l s tun0 up
ip a a 10.10.10.1 peer 10.10.10.2 dev tun0

Server2:
ip tunnel add tun0 mode ipip local IP_Server2 remote IP_Server1 dev ethX
ip l s tun0 up
ip a a 10.10.10.2 peer 10.10.10.1 dev tun0
After creating the tunnel everything is OK, but after a while (maybe a few hours) I can't ping the other end of the tunnel (pinging IP_Server1 and IP_Server2 directly works all the time; the internet connection is very reliable). I have tried both "ipip" and "gre" modes, with the same result. If I ping the other end of the tunnel from both servers, the connection is re-established for some hours and ping works in both directions (pinging from only one side does not work). How can I resolve this issue so that I no longer have to log on to both servers to ping the other end of the tunnel? If I use a cron job to ping the other end every 2 hours, everything works fine for weeks, but I need another solution.
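The symptom - the tunnel dies when idle and only revives after traffic from both sides - is typical of a stateful firewall or NAT between the sites timing out its entry for the ipip/GRE flow. Since plain ipip/gre tunnels have no built-in keepalive on Linux, the practical fix is a permanent low-rate keepalive started from an init script on each server, so no one has to log in. A sketch (use 10.10.10.1 as the peer on Server2):

```shell
#!/bin/sh
# tunnel-keepalive.sh - started at boot on each server; one probe per
# minute keeps the intermediate state tables from expiring.
PEER=10.10.10.2
while true; do
    ping -c 1 -W 5 "$PEER" >/dev/null 2>&1
    sleep 60
done
```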
1. Where do I post reports of minor bugs? 2. I think I found a minor bug. When you first install Ubuntu and log in, if you right-click the clock applet on the top bar and click the Edit button next to 'Locations', the clock preferences pop up. If you click on the General tab you will see that by default the clock applet should show the time, temperature and weather, yet looking at the applet there is no sign of the temperature or weather. I tried un-checking them, closing and reopening the preferences, and re-checking them, but that has no effect. I think this is a minor bug in Ubuntu and/or GNOME, and I would like an update and/or for Ubuntu Maverick Meerkat to correct this.
When I right-click on files and folders, "Copy to" and "Move to" options are there in my system, but only the home folder and desktop are shown under them. I want to add some of my own directories too. How do I do that?
I am completely new to compiling the kernel. Trying to compile on an old Dell C610 laptop that has Debian 6.01 installed and working. Here is what I have done so far:
Downloaded linux-3.0.tar.bz2 to my home directory. Also downloaded patch-3.0-git13 to my home directory. Ran tar xjvf linux-3.0.tar.bz2, which uncompressed the tarball and created the linux-3.0 directory in my home directory.
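The usual next steps from that point are to apply the incremental patch inside the unpacked tree, then configure and build; a sketch, assuming the patch file was downloaded uncompressed:

```shell
cd linux-3.0
patch -p1 < ../patch-3.0-git13   # -p1 strips the leading dir from paths

make oldconfig     # reuse the running kernel's config, or: make menuconfig
make -j2           # build; match -j to your CPU count
sudo make modules_install
sudo make install  # installs the kernel and updates the bootloader
```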
I have a workstation dedicated to monitoring. The goal is to have multiple web sessions and other applications running across dual screens on multiple virtual desktops. I have a nice Perl/TK/wmctrl script that will automatically rotate the desktops. All is working great.
The problem is, I need a solution to automatically start the applications on the correct desktop, with the correct window size, in the correct location on the desktop. That way we can start the monitor boxes in the morning and have everything start in the correct place. It is a really cool effect to have wall mounted monitors with cube rotation showing off multiple graphs and more.
Do any of you pros know how to start an application with a specific window size and define where on the desktop it is placed? The box is running OpenSUSE 11.4 KDE. Is that kind of control possible?
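wmctrl itself can do the placement once the window exists, so a start-up script per application is one answer (KDE's own window rules in System Settings are a declarative alternative). A sketch - the application, window title, geometry and delay are all examples:

```shell
firefox &
sleep 5                               # crude: wait for the window to appear

# -e gravity,x,y,width,height : move/resize the window
wmctrl -r "Mozilla Firefox" -e 0,0,0,1280,1024

# -t N : send the window to virtual desktop N (counted from 0)
wmctrl -r "Mozilla Firefox" -t 1
```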
I am writing a script to install a program (a GUI interface) and would like to check whether the required software is already installed. This made me think of the command whereis. I was curious how whereis works but didn't know where to look. Is it equivalent to a find over the most common locations?
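Not quite: whereis does not crawl the filesystem; it checks a fixed, built-in list of standard binary, manual and source directories (plus directories from $PATH), which is why it is fast and why it can miss software installed elsewhere. For an install script, testing the command directly is the more portable approach:

```shell
# command -v succeeds (and prints the path) iff the program can be run
# from the current PATH; "gimp" is just an example name:
if command -v gimp >/dev/null 2>&1; then
    echo "gimp is installed"
else
    echo "gimp is not installed"
fi
```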
I have hundreds of directories in various subdirectories that I need to remove. I want to remove all of these dirs, but can only find solutions on how to remove files (or how to remove subdirectories from within the current directory).
I think I need something like
find -iname 'testfile*' | xargs rm -i
where I want to remove every directory that contains the word 'testfile' in its name. I know xargs won't work for dirs,
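find can actually do this safely on its own: -type d restricts matches to directories, -prune stops find from descending into a directory it is about to delete, and -exec ... {} + hands the names straight to rm, avoiding xargs' trouble with spaces in names:

```shell
# Delete every directory whose name contains "testfile" (any case),
# anywhere under the current directory:
find . -type d -iname '*testfile*' -prune -exec rm -r {} +
```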