I have 2 Linux systems - both running Fedora 14 and fully updated. They are sitting side by side on the same network. I have Firefox 3.6 installed on both, and running on one (call it system A). If I ssh to system B and run firefox there, I get a new window opened, but the browser is running on system A. If I close the instance of Firefox running on system A before trying to start one on System B, then the new window is actually running on system B.
This doesn't happen when I run Thunderbird on different systems. What is Firefox doing that makes it behave this way?
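What probably bites here is Firefox's X remoting: when launched, it looks for an instance already owning the (forwarded) X display and hands the request to it instead of starting a new process. A minimal sketch of the usual workaround, assuming that is indeed the cause:

    ssh -X systemB
    # -no-remote skips the check for an existing instance on the display,
    # so the browser really runs on system B
    firefox -no-remote &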
I am connecting to remote Linux PCs via ssh to update software and do other tasks. I want to send a notification to the remote PC's screen (e.g., "Do not run program X, it is being updated now"), so the users know what is happening.
Is there a reverse of ssh -X host, so that I can connect to a remote Linux machine, run notify-send there, and have the notification appear on the remote display?
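No reverse flag should be needed; it is usually enough to point the remote notify-send at the remote machine's own display. A sketch, assuming the user's X session is on display :0 and that you log in as (or can read the .Xauthority of) the user who owns it:

    # run notify-send on the remote box, aimed at its local display
    ssh user@remotepc 'DISPLAY=:0 notify-send "Maintenance" \
        "Do not run program X, it is being updated now"'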
I have two computers running Ubuntu. I want to run the copy of Firefox that is on the remote machine. I connect to the remote in the usual way, "ssh -X fermi", which brings up a console on the remote machine, fermi. When I type "firefox", however, it starts the copy of Firefox that is on the local machine. On the remote machine, "ps aux | grep firefox" confirms that Firefox is not running there. Is there a way to force execution of the remote copy? Or can I start a local copy of Firefox using the remote bookmarks, preferences, etc.?
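Both options look feasible, sketched here under some assumptions: -no-remote makes the remote binary start instead of signalling the local one, and sshfs (if installed) can expose the remote profile to a local Firefox. The profile directory name is a placeholder, and no other Firefox instance may be using that profile at the same time:

    # option 1: actually run the remote copy over the forwarded display
    ssh -X fermi firefox -no-remote

    # option 2: run the local copy against the remote profile via sshfs
    mkdir -p ~/fermi-mozilla
    sshfs fermi:.mozilla ~/fermi-mozilla
    firefox -no-remote -profile ~/fermi-mozilla/firefox/xxxxxxxx.default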
I am trying to use the Apple Remote Control that came with my iMac 4.1. I did some research, and the Ubuntu site said it is possible with an application called Infrared Remote Controller. It detected from the hard drive which version I had, etc., but then an error message appeared at the bottom saying:
How do I do this in 10.04? I need to be able to remote-desktop into my server over SSH, but can't because I don't have a spare monitor on hand... and Ubuntu won't start a desktop if it doesn't detect a monitor.
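One common workaround, assuming a VNC session is acceptable: vncserver brings up its own virtual X display, so no physical monitor is needed, and the whole thing can be tunnelled over SSH. A sketch, with the package and port numbers as usually shipped on Ubuntu:

    # on the server, over SSH
    sudo apt-get install vnc4server
    vncserver :1 -geometry 1280x800     # creates a monitor-less X display

    # from the client: tunnel the VNC port and connect through it
    ssh -L 5901:localhost:5901 user@server
    vncviewer localhost:1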
I'm trying to access a VMware server via Lab Manager using Firefox 3.6.12 on Ubuntu 10.10. Lab Manager comes up, but when I double-click on a VM console to open it I get: "VMware Remote MKS Plugin not running". So I install the plugin and restart Firefox, as directed. The plugin list shows that VMware Remote MKS Plugin 2.2.0.0 is installed. I go back to Lab Manager, double-click again, and get the same message. Lab Manager does not find the plug-in.
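A first check worth doing, assuming the standard plugin locations on Ubuntu: confirm the MKS .so actually landed in a directory Firefox scans, and that Firefox registered it.

    # typical plugin directories Firefox 3.6 scans on Linux
    ls ~/.mozilla/plugins /usr/lib/mozilla/plugins 2>/dev/null
    # then open about:plugins in Firefox and look for the MKS entry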
I want to log in remotely as a non-root user and then run a command under the root account. I have set up ssh/scp for the non-root user and this configuration works fine. What I don't know is how to run a command under root once remotely logged in as the non-root account. I have to run this command as root; that cannot be changed.
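A minimal sketch using sudo, assuming sudo is available and that the command path below (hypothetical) is the one you need; the sudoers line restricts the non-root account to exactly that command:

    # one-shot over ssh (prompts for a password unless NOPASSWD is set)
    ssh user@host 'sudo /usr/sbin/the-command'

    # in /etc/sudoers (edit with visudo), allow only that command:
    # user ALL=(root) NOPASSWD: /usr/sbin/the-command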
I am running rtorrent on a remote server mainly for seeding torrents. However, I am also using it to download new media.
I wish to share the downloaded torrent media on the web server. This works great for torrents that have finished downloading; however, for torrents that are still in progress, such as large multi-GiB torrents with slow seeders or no seeders, it creates an issue.
The problem is that I have set the rtorrent default download directory to be visible on the web server, which is what I want, so that people can get the downloaded media via the web server. When I drop a new torrent into the directory that rtorrent watches, it starts downloading the media into the web server's visible directory. Once the download finishes, it works the way that I want: the media is available both as a web download and as a torrent download.
The issue is that when a download is in progress, media that is available on the web server is incomplete.
I can manually close a torrent, manually move the media to a different location, and then manually tell rtorrent to use the new location for the torrent (^o)... but that is doing it by hand, and if I need to do this with a large number of torrents, it takes a very long time.
I want to automate this so that I don't have to mess with it at all. A lot of other software, such as uTorrent, can do this automatically, but that feature seems to be missing from rtorrent.
rtorrent does offer a scripting hook in its config file that can start a script on completion of a torrent, but I have not been able to figure out how to automatically stop a seeding torrent, move its media to a different location, tell rtorrent the files are in the new location, and then start the torrent again. Moving a torrent's associated media to a new location also triggers a hash check when the torrent is restarted (see the sketch below).
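A commonly cited ~/.rtorrent.rc hook for the 0.8.x syntax, assuming /srv/www/downloads is the web-visible directory (a hypothetical path). Because the move and the directory update happen in one hook while the torrent stays open, rtorrent keeps its state and no re-hash is triggered:

    # ~/.rtorrent.rc: on completion, move the data and re-point the torrent
    on_finished = move_complete,"execute=mv,-u,$d.get_base_path=,/srv/www/downloads/ ;d.set_directory=/srv/www/downloads/"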
I am not going to use different software. Since rtorrent is shell based and uses curses, it doesn't require GUI features. That saves something like 150 megs of memory for disk caching, etc. rtorrent kicks MUCH ass for this purpose and it uses a tiny amount of system resources.
I use screen with rtorrent so that it keeps running without my having to stay logged into the system.
I also need to limit the bandwidth on Apache so that people can still download over the web if they absolutely must, but I would prefer that they used the torrents. So I would like to slow web downloads down quite a bit, leaving more bandwidth for rtorrent (see the sketch below).
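One hedged option: mod_ratelimit ships with Apache 2.4; on older 2.2 installs, third-party modules such as mod_bw filled the same role. A sketch, with a hypothetical download location and a roughly 400 KiB/s cap:

    # Apache 2.4, in the vhost config (enable with: a2enmod ratelimit)
    <Location "/downloads">
        SetOutputFilter RATE_LIMIT
        SetEnv rate-limit 400
    </Location>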
As with most Linux issues, there is more than one way to do something. The problem is, I can't even figure out one of those ways right now.
I have Ubuntu 10.10 running on a PC at work. On a Windows PC I can use a free utility called TeamViewer, which you just run (no install needed) and then connect without any need to VPN into my work network.
So, what would be the best way to connect from home and remotely connect to my Ubuntu system at work?
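One approach that gives the same no-VPN behaviour, assuming you control some publicly reachable host (example.com below is a placeholder): keep a reverse SSH tunnel open from the work machine, then ride it from home.

    # on the work machine (behind NAT): publish its sshd via the relay host
    ssh -N -R 2222:localhost:22 user@example.com

    # from home: hop through the relay into the work machine
    ssh -p 2222 workuser@example.com
    # from there, VNC or X forwarding works as usual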
How do I obtain information about the running X server from a remote shell session? I want to know things like resolution, color depth, etc. My xorg.conf is basically empty. The only thing I can think of doing is to read the Xorg.0.log file, which seems inefficient.
I thought that 'xrandr' displayed some text output, but that behavior seems to have changed (?); it requires an X display. Is there another way, something I could incorporate into a shell script? (This is Fedora, but that shouldn't really matter.)
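The usual trick from a shell script is to point the X clients at the console display explicitly. A sketch, assuming the session is on :0 and is owned by a user whose .Xauthority you can read:

    # query the running X server from an ssh session
    DISPLAY=:0 xdpyinfo | grep -E 'dimensions|depth of root'
    # xrandr works the same way; add XAUTHORITY when querying another user's session
    DISPLAY=:0 XAUTHORITY=/home/user/.Xauthority xrandr --query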
I suspect this has been posted before, but I'm new enough to the OS to lack the vocabulary for a proper search.
In any case, I'm basically trying to run programs remotely at work (from home) that will run for a long time. I first ssh into the appropriate network, then ssh into the individual machine. This works great and I can run programs no problem. The issue is that these programs can run for days or weeks, and the network kicks me out after some period of time due to inactivity.
Is there a way to start a program on a remote machine then terminate the connection but have it keep running?
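The standard answer is a detachable terminal session; screen (or tmux) keeps the program's terminal alive after you disconnect. A sketch, with a hypothetical job script:

    # start a named session on the remote machine
    screen -S longjob
    ./run_simulation.sh          # hypothetical long-running program
    # detach with Ctrl-a d, then log out; the job keeps running

    # days later, from a fresh ssh login:
    screen -r longjob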
Does anyone know of any way (preferably fully documented/tutorialed, but even theoretical would be great) to remotely install Ubuntu to a machine currently running Windows...
We have 8 machines powering a digital signage system. The machines are not physically accessible (without extreme difficulty) and are currently running Windows XP with a VNC server for control.
I want them to run Ubuntu instead. Is there any way anyone can think of that I can do this? My only thought so far is Wubi... but once it boots into Ubuntu, SSH isn't installed by default and VNC isn't enabled by default, so I wouldn't be able to control it.
Also I'd really like to completely wipe out Windows and use only Ubuntu.
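One hedged avenue for the "no SSH on first boot" problem: if the install is driven by debian-installer (e.g. a netboot or otherwise automated install rather than plain Wubi), a preseed late_command can bake openssh-server in, so the box is reachable the moment it comes up:

    # preseed.cfg: runs inside the target system at the end of the install
    d-i preseed/late_command string in-target apt-get install -y openssh-server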
I discovered something odd while running a script on a remote server. Take this little demonstration script:
[Code]...
This runs as expected (it does nothing) on both my local desktop and the remote server. However, if I log in via ssh to the remote server and run this script, everything on the server freezes. I do mean everything: ftp/ssh/nfs/login-requests/... Even if I physically plug in a monitor and keyboard, there is no response. It basically needs a full restart to clear. I have repeated this several times. Something is clearly wrong here, but does anyone know what? I didn't think a non-admin user should be able to lock up system processes.
What do you do if a job takes a long time to finish and you don't want to wait? Say I ssh to a remote server from my laptop and start a long-running job. Then, a few hours later, I ssh in again and inspect how the job ran, its output, etc.
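A minimal sketch with nohup, assuming the job writes to stdout/stderr (the make invocation is just a stand-in for any long job):

    # start the job so it survives the ssh session ending
    nohup make -j4 > build.log 2>&1 &

    # hours later, from a new ssh session, inspect how it went
    tail -f build.log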
In the past, I've installed Internet services as daemons and as xinetd.d with no problems. Those approaches do not meet my needs. And, perhaps, nothing will.
- The service was converted from VB6 to wxPython. It has a GUI which is accessed with either "remote desktop" or VNC.
- The wxPython service works on Windows and can be accessed from other hosts on my LAN.
- The wxPython service works on CentOS and Fedora, but can only be accessed from within the server host (even from other user-ids). I cannot get to it from other hosts.
- ipchains (AKA firewall) ports are marked for INPUT.
- The server host uses autologin to bring up a user-id in group "user"; I do not want it running as "root". The .bash_profile fires the service up.
- The service is heavily multi-threaded, and supports devices connected to serial ports asynchronously via the ephemeral port threads (all this works).
There are some programming solutions that I would rather not develop:
- a proxy service that runs under xinetd.d;
- separating the GUI code from the Internet and serial port code, and allocating a "control" port for remote GUI control, a la Samba & SWAT.
Is there any hope that I can run it as is, just by doing some network configuration?
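A first thing to check, offered as a hedged guess: a service reachable from the same host but not from others is very often bound to the loopback address only. The port below is hypothetical:

    # what address is the service actually listening on?
    netstat -tlnp | grep 8000
    # 127.0.0.1:8000 means loopback-only: the wxPython code must bind
    # to 0.0.0.0 (all interfaces) instead of localhost.
    # If it already shows 0.0.0.0:8000, open the port in the firewall:
    iptables -I INPUT -p tcp --dport 8000 -j ACCEPT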
Running Ubuntu 10.04 to 10.04 over a LAN: I can ssh -X into my server just fine and am able to launch various applications (Nautilus, Gnome-Terminal, Disk Utility, etc.), but I'd really like a full GNOME desktop. I've tried the various startx commands and gnome-session, but something just isn't clicking. Is there a way to have a 'second' GNOME session running on the client over the first?
To pre-empt some obvious solutions: I don't wish to use VNC (the lag drives me nuts), and I'd like to keep my GNOME session running on the client machine if possible. If it's not possible, that's fine; it just seems like there'd be a way to do it.
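One technique that matches these constraints is a nested X server: Xephyr opens a window on the client that is itself a full X display, and the remote session renders into it over ssh. A sketch, assuming the xserver-xephyr package is installed on the client:

    # on the client: nested X display :2 in a window
    Xephyr :2 -screen 1280x800 &

    # start the remote session aimed at the nested display
    DISPLAY=:2 ssh -X user@server gnome-session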
I've experienced the fairly annoying Firefox error where it refuses to start the GUI. If there were no active Firefox processes, nothing happened when I started it, but it did "start", in the sense that I got "Firefox is already running" when I tried again. I found a fix, which was to delete some or all files in the ~/.mozilla/firefox directory. I now have two different problems. Number one is that the problem keeps recurring and I'd like to know what's causing it. The second is that even when I've "fixed" the issue, I still can't run Firefox from scripts or click links; I get the "Firefox is already running" error instead of a new tab or window.
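A likely culprit, offered as a hedged guess: stale profile lock files, which Firefox leaves behind after a crash and which make it think another instance owns the profile. With Firefox fully stopped (the profile directory name is a placeholder):

    # inspect and remove the two lock files instead of wiping the profile
    ls -l ~/.mozilla/firefox/*.default/lock ~/.mozilla/firefox/*.default/.parentlock
    rm -f ~/.mozilla/firefox/*.default/lock ~/.mozilla/firefox/*.default/.parentlock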
My problem is that Firefox only shows https:// pages, not regular pages like Google. I'm stumped; I have no clue why. I've tried other versions of Linux and different computers, with the same result. If I turn quiet off, I see all the traffic in my terminal.
I am a novice in the world of cloud computing and recently managed to configure Ubuntu 9.04 cloud (using KVM, Eucalyptus and other packages) successfully at my college for my project work. The problem is that I can only manage to view the running instance using rdesktop from a remote machine. Is there any way to do this other than rdesktop/logs? Secondly, I want to develop an application along the lines of Google Docs as part of my project. Is it possible to install the Apache server on this virtual instance and host a website? How will the client access this website? Which frameworks would be required, or do I have to develop one?
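On the Apache part, a hedged sketch: install it on the instance as usual, then open port 80 in the Eucalyptus security group so clients can reach the instance's public IP:

    # on the instance
    sudo apt-get install -y apache2

    # on the cloud front end, with euca2ools: allow HTTP into the default group
    euca-authorize -P tcp -p 80 -s 0.0.0.0/0 default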
I'm looking for a program that lets me monitor a server remotely, check whether Apache is running, and send out a mail if it's not; pretty much what the Big Brother project was before it went commercial. I found a few projects, but most haven't been updated for years. Does anyone have a clue about an active project that does this?
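Until a maintained tool turns up, a cron'd shell check covers the basic case; the hostnames and addresses below are placeholders:

    #!/bin/sh
    # cron this every few minutes; mails only when the check fails
    if ! curl -sf --max-time 10 http://www.example.com/ > /dev/null; then
        echo "Apache is not answering on www.example.com" \
            | mail -s "apache DOWN" admin@example.com
    fi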
I'm running Fedora 14 and I've just updated from Firefox 3.6.15 to Firefox 4 (rpm from the remi repo). Overall it's faster, sleek, and running great on KDE. However, I've hit a minor issue.

In order to run Firefox with a plugin installation free of nspluginwrapper (which seems to break partially whenever there is a major Flash update), I switched from Firefox 3.6.15 64-bit to 32-bit and got the 32-bit versions of all the plugins. All was running mostly fine. When I upgraded to Firefox 4 32-bit, the two versions (32- and 64-bit) of xulrunner2 somehow got installed simultaneously. With the previous version that didn't pose a problem. However, since Firefox 4 is launched by xulrunner2, the 64-bit version takes precedence over the 32-bit one; consequently, Firefox 4 32-bit is run as a 64-bit app (?!) and no 32-bit plugins are loaded.

By removing the 64-bit xulrunner2, everything runs as intended. Nevertheless, if xulrunner2 is later used to run other XUL-based apps, then as long as the 64-bit one is installed it'll take precedence (in a 64-bit environment), which is fine, but anyone in my situation will need a way to have both versions installed at the same time while specifying that Firefox should use the 32-bit one.
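For anyone checking which copies are present, a small sketch of the multilib inspection and the removal that fixed it here (package name as shipped by remi; adjust if it differs):

    # list installed xulrunner2 packages with their architectures
    rpm -q --qf '%{NAME}-%{VERSION}.%{ARCH}\n' xulrunner2

    # drop the 64-bit copy so the 32-bit Firefox picks up its own runtime
    yum remove xulrunner2.x86_64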
I have a 100% Slackware 13.37 network on both server and clients, with roaming profiles using NIS and NFS. Currently I'm debugging the whole thing, and I have a strange error that I can't quite explain. Sometimes when my girlfriend logs in, she gets a strange "Firefox is already running" error. The strange thing is: ps aux | grep firefox returns absolutely no Firefox process. The only "solution" that convinces Firefox to start again is to wipe her ~/.mozilla directory, but by doing that she loses all her bookmarks and settings.
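On NFS homes this is classically a leftover profile lock rather than a live process; offered as a hedged guess. The lock is a dangling symlink encoding the IP and PID of the last owner, so it can be inspected and then removed without touching bookmarks (the profile directory name is a placeholder):

    # see which host/PID last held the profile
    ls -l ~/.mozilla/firefox/*.default/lock

    # remove only the lock files; bookmarks and settings stay intact
    rm -f ~/.mozilla/firefox/*.default/lock ~/.mozilla/firefox/*.default/.parentlock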
I am at a loss. I cannot access my work remote desktop via the Terminal Server Client on my wired box running Ubuntu 10.10. My wireless laptop is able to connect right away once I establish the VPN connection. The VPN connection comes up on both boxes with no problems.
When I try the Terminal Server Client on my wired box, it says it cannot establish a connection. Yet my wireless box connects immediately!
I checked /etc/dhcp3/dhclient.conf and /etc/resolv.conf to see if there were any differences, but they are essentially the same. When I have the vpnc connection up, both boxes recognize it and I am able to ping the IP address shown when I do an "ifconfig" in the terminal.
What can be the problem? Anything I need to configure on a wired computer versus a wireless one? What else can I check?
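A few hedged checks that usually narrow this down, run on both boxes for comparison (the 10.0.0.5 address and the standard RDP port 3389 below are placeholders/assumptions):

    # does traffic to the RDP host actually go via the VPN interface?
    ip route get 10.0.0.5

    # is the RDP port reachable at all from this box?
    nc -zv 10.0.0.5 3389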
More often than not, my OpenSUSE 11.3 box completely freezes when surfing the internet with Firefox, forcing me to stop the computer the hard way (pushing the on/off button) and restart it. Does anyone encounter the same issue? Does anyone have the start of a clue to explain why? Could anyone tell me where to start looking for the reason? Note that the last time the problem occurred (i.e. 10 min ago) was on this site (no other tab opened).

My configuration:
- SONY Vaio VPCF11S1E: Core i7 720QM, 6GB RAM
- OpenSUSE 11.3 64-bit. I did not install the official NVIDIA driver, as it causes the screen fonts to be extremely big.
- OpenBox session + tint2
- Firefox has the Adobe Flash plugin installed
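Two hedged starting points: the system log right before the hang, and ruling Flash out, since hard freezes while browsing are frequently Flash- or graphics-driver-related (the plugin path below is a guess; openSUSE may also use /usr/lib64/browser-plugins):

    # what got logged just before the last freeze?
    tail -200 /var/log/messages | grep -iE 'error|fail|nouveau|nvidia'

    # temporarily disable Flash to see whether the freezes stop
    mv ~/.mozilla/plugins/libflashplayer.so ~/libflashplayer.so.off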
I was really looking forward to Firefox 4, especially to trying the hardware acceleration. I'm using the proprietary nvidia driver, which is the combination that *should* work, according to what I've read. However, it doesn't. Firefox doesn't even recognize my graphics card; when I type about:support, there is no information about my graphics. So I wonder: has anyone got this working?
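One thing worth trying, hedged since driver blacklisting may be exactly why Firefox reports nothing: force layers acceleration on in about:config and re-check about:support afterwards.

    # about:config: create/toggle this boolean, then restart Firefox
    layers.acceleration.force-enabled = true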