Ubuntu :: Running Garmin Mapsource (6137) With GPS Access?
May 15, 2010
Just posting this as it may be useful for other people. There are some other topics related to running Garmin MapSource under Ubuntu. I've got Garmin MapSource (version 6137) working under Lucid with a working GPS link to a Garmin 60CSx handheld. Actually, now that I have this running, I have completely erased Vista and Win7 from my hard disk, as MapSource was the one and only application that kept me tied to Windows. Here's a brief instruction. First install wine:
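A rough sketch of the shape of it (assuming the stock Lucid Wine package; the MapSource installer filename below is only a placeholder for whatever you download from Garmin, and the original post's full walkthrough had more steps than this):
sudo apt-get install wine          # install Wine from the Ubuntu repositories
wine MapSourceSetup.exe            # then run the MapSource installer under Wine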
I've been trying for a while now to find an effective track and waypoint manager on Ubuntu so that I can download tracks and waypoints via USB from my Garmin GPS-60 and GPS-60 CSX handheld GPS field units. I have tried a bunch of different programs, but still no luck. Has anyone been able to find a solution? I even tried to install dnrgarmin through Wine, but that didn't work either.
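For what it's worth, gpsbabel on its own can pull both tracks and waypoints over USB from these units; a minimal sketch, assuming the unit is in native Garmin USB mode (not mass-storage) and gpsbabel is installed:
gpsbabel -t -w -i garmin -f usb: -o gpx -F gps60-dump.gpx
# -t = tracks, -w = waypoints, usb: = first Garmin USB device; output is written as GPX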
I cannot get map updates for my Garmin nuvi 1350LMT because Linux is not supported. I was able to get the Windows version of Firefox through PlayOnLinux, and that was enough to fool the site and get a lot further, but Garmin's website can't recognize that the GPS unit is hooked up to a USB port, even though I can see its icon on the desktop and access the files.
I am having trouble connecting to my Garmin GPS unit, an eTrex Venture HC. I gather from all the resources I've checked that my system is finding the GPS just fine, but I am unable to communicate with the device. Using gpsbabel, I get the following error:
I noticed that the gpsd service was disabled, so I enabled it, but that didn't seem to change anything.
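One thing worth ruling out is gpsd itself grabbing the device and locking gpsbabel out; a quick check, assuming the stock Ubuntu init scripts (usb: assumes the unit shows up as a native Garmin USB device; substitute the serial device path if yours differs):
sudo service gpsd stop                                   # stop gpsd so it releases the device
gpsbabel -t -w -i garmin -f usb: -o gpx -F etrex.gpx     # then retry the transfer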
I can use gpsbabel to download data from my Garmin 60csx using:
# gpsbabel -t -i garmin -f /dev/ttyUSB0 -o gpx -F Garmin-2Mar2011.gpx
But I cannot connect to my Garmin 60csx using QLandkarteGT. I have set the QLandkarteGT Device & Xfer config as follows:
Garmin Serial Port: /dev/ttyUSB0
Type: GPSMap60CSx
When I click on Track/Download, I get:
Failed to download tracks.
Failed to configure USB: Device or resource busy
The kernel driver 'dummy' is blocking. Please use 'rmmod dummy' as root to remove it temporarily. You might consider to add 'blacklist dummy' to your modprobe.conf, to remove the module permanently.
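In most reports of that exact message the module being blamed is garmin_gps rather than a literal 'dummy', so the usual remedy looks like this (the module name is my assumption; check lsmod and dmesg for the real one):
sudo rmmod garmin_gps                                                        # unload the driver for this session
echo "blacklist garmin_gps" | sudo tee -a /etc/modprobe.d/blacklist.conf     # keep it from loading on boot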
I just want to tell everyone how good the Garmin/Asus A10 mobile phone is as an internet modem. I am in Australia, and mobile internet is an expensive thing to have; the cheapest is Dodo, but I had a few problems trying out a friend's USB modem. Anyway, I was in the market for a GPS phone and discovered this one. It is an Android phone and pretty cheap; so far the GPS, camera, MP3 player and phone are all excellent. BUT here is the clincher: when you plug it into the computer, the phone screen lights up and gives you 3 options.
Is there any possible way to hide currently running processes from a user? I do not want him to know which programs/processes any other user is running. In short, if that user runs 'ps -aux' he should see only his own processes.
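On reasonably recent kernels (3.3 and later) the simplest approach I know of is the hidepid mount option on /proc, which makes ps and top show each unprivileged user only their own processes; a minimal sketch:
sudo mount -o remount,hidepid=2 /proc      # hide other users' /proc/PID entries immediately
# to make it permanent, add a line like this to /etc/fstab:
# proc  /proc  proc  defaults,hidepid=2  0  0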
How can I get access to my NTFS partitions in VirtualBox running Windows XP Pro? I saw something somewhere about setting up shared folders, but it was confusing and did not work.
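The feature is called Shared Folders and it needs the Guest Additions installed inside the XP guest; a rough sketch from the host side (the VM name, share name and mount path are just examples, and the NTFS partition must already be mounted on the Ubuntu host, e.g. via ntfs-3g):
VBoxManage sharedfolder add "WinXP" --name ntfsdata --hostpath /media/ntfs
# inside the XP guest, after installing Guest Additions, map it with:
#   net use X: \\vboxsvr\ntfsdata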
I have SSH running on a computer I use as a server at home and log in to it for my own purposes, but I now need to share access to this server with someone else, and I'd like to do it in a way so that when they sign in, all they see is the contents of one folder and nothing outside of it. So I'd like them to have full access to this folder and be able to do anything they want with it, but not be able to browse outside of it at all via something like WinSCP (they're using Windows). I'm thinking I need to create a new account for them to sign in with, but beyond that I'm not sure what I need to do. The only other special thing is that the folder I'd like them to be presented with is actually on an external hard drive. We're going to be doing a lot of online music collaboration and I need to give him lots of free space to drop files, and the internal hard drive doesn't have a lot to spare right now.
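OpenSSH (5.x and later) can do this on its own with ChrootDirectory plus the built-in SFTP server, which works fine with WinSCP; a sketch for /etc/ssh/sshd_config, assuming a new account called "collab" and the external drive mounted under /media/external (names and paths are examples):
Match User collab
    ChrootDirectory /media/external/collab
    ForceCommand internal-sftp
    AllowTcpForwarding no
# note: the chroot directory itself must be owned by root and not group/world-writable;
# give the user a writable subdirectory inside it (e.g. /media/external/collab/share)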
I'm about to have a web server at home for the first time. I've always missed having full control and not having to contact my hosting company when I need some specific change made - and some changes they won't do for you at all. I've chosen the non-GUI Ubuntu Server with LAMP, and nothing more is really installed except for a couple of command line tools from the repository. The LAMP software has been locked down as well as I can by following some guides on the net and using common sense: Apache 2 doesn't have access to the file system outside the www folder, and ServerTokens is set to Prod. MySQL has skip-networking enabled and I've commented out the bind line for localhost. PHP has a truckload of functions that I've disabled in php.ini, again by following some guides on the net, among other security-related php.ini edits.
The only thing the server will serve is a well-known PHP forum and some HTML docs, and that's all. Nothing advanced or complicated, and I'm definitely not programming PHP myself or letting anyone do it for me. But I do want to sleep well at night knowing that my server is always on, sitting on the edge of my home network - and can I do that? I've heard that you don't need to be worried about your Linux server box getting hacked so much as about anyone getting root access to it. But is it really that simple? Ubuntu is shipped without a root account and you must have the sudo password, right? What are the odds of anyone getting full access to my system? One issue: I've heard that Apache must never run as root. When I do a ps -ef, I see several www-data processes running apache, but there is also one root process running apache. Is this normal, and is it safe? Another issue: I've heard that PHP can fail pretty easily. But isn't PHP running under Apache 2 and limited by www-data's filesystem access? Another issue: MySQL is running as a mysql user, and I guess that's an unprivileged user, right?
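For reference, the sort of lockdown described above usually comes down to a handful of directives like these (a sketch using the stock Ubuntu file locations of that era, not anyone's actual config):
# /etc/apache2/conf.d/security
ServerTokens Prod
ServerSignature Off
# /etc/mysql/my.cnf
[mysqld]
skip-networking
# (or instead of skip-networking: bind-address = 127.0.0.1)
# /etc/php5/apache2/php.ini
disable_functions = exec,passthru,shell_exec,system,proc_open,popen
expose_php = Off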
Before the other day, I'd copied a live CD to ramdisk and run it from there before, but the disk was INX (INX is not X), a live CD based on Ubuntu that runs entirely in text mode, no GUI. INX is a terrific product: colorful, educational, light, agile, fun to use, and often damned useful, but when an OS only uses text, you may not notice how much running from RAM speeds it up. Previously, I'd assumed that the best reason to run a live CD from RAM was to free up the CD-ROM drive. When I started running a full KDE 3.5.10 desktop from RAM, it didn't take me long to notice the awesome boost in speed and performance.
The computer has the fastest access to data that's in RAM. (The "A" in "RAM" stands for "access", right?) So the machine is faster. As RAM sizes get larger, I'm sure more and more live CDs are going to offer the ramdisk option. Right now both INX and SLAX share the characteristic of being exceptionally small CDs, and that makes them well suited for this kind of use. The "minimal" version of Slax, the basic CD without any modules added, is less than 200 MB, which fits very nicely in my 1024 MB of RAM. I now use the minimal Slax CD to initiate the system, and I keep a collection of the modules on my hard drive to copy to RAM and activate at will.
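For anyone trying the same thing, Slax 6 modules are plain .lzm files and can be merged into the running system with the activate script; a small sketch (the paths and module name are just examples):
cp /mnt/sda1/slax-modules/kde-extras.lzm /tmp/     # copy the module off the hard drive
activate /tmp/kde-extras.lzm                       # merge it into the live union filesystem
# deactivate /tmp/kde-extras.lzm                   # and back it out again when done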
Here are a couple of screenshots: [URL] I'm using Wine here to run my one and only favorite Windows program, a text-to-speech program called READ PLEASE. Note that I am also running KTorrent, which is uploading from and downloading to my external MyBook hard drive. [URL] Here's a shot of Yakuake, which is sort of like Konsole with superpowers. I just upgraded my hard drive KDE system to 4.2.3, and they still haven't fixed Yakuake yet. I know the bug has been reported, so I'm sure it'll be taken care of.
I set up my first web server and it works flawlessly - when accessed from an external network or from other computers on my LAN. However, I cannot access it from the computer the server runs on. I have found numerous people with similar problems, but the flavour I am experiencing is somewhat different and no solutions I have found apply to it.
I have two network interfaces on my server: eth0 (public static IP, connected to the internet directly) and eth1, connected to the LAN 192.168.1.0/24 range. The server is 192.168.1.1. It is connected directly to the internet and acts as SNAT for the other computers on the LAN. I added "192.168.1.1 www_server_com" to /etc/hosts on the server and also on the other machines on the LAN. All the other machines can open the website without any problem. However, the server itself only opens the website if the address is localhost. The internal IP, i.e. 192.168.1.1, gets a timeout and so does www_server_com (I can't use dots as I do not have more than 15 posts on the forum). Here is the firewall script I am using.
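Whatever is in that script, the classic cause of exactly this symptom (works from everywhere except the server itself) is a firewall that never accepts traffic on the loopback interface, since locally generated connections to 192.168.1.1 arrive via lo; a minimal check with iptables, run as root:
iptables -I INPUT 1 -i lo -j ACCEPT      # accept loopback traffic before any DROP/REJECT rules
iptables -I OUTPUT 1 -o lo -j ACCEPT
iptables -L -v -n                        # watch the packet counters on the lo rules while retrying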
I am a novice in the world of cloud computing and recently managed to configure an Ubuntu 9.04 cloud (using KVM, Eucalyptus and other packages) successfully at my college for my project work. The problem is that I can only manage to view the running instance using rdesktop from a remote machine. Is there any way to do this other than rdesktop/logs? Secondly, I want to develop an application along the lines of Google Docs as part of my project. Is it possible to install an Apache server on this virtual instance and host a website? How will the client access this website? Which frameworks would be required, or do I have to develop one?
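On the second part: yes, you can install apache2 inside the instance like on any other Ubuntu box, and clients then reach it over the instance's public IP, provided the security group allows port 80; a sketch, assuming euca2ools and the default group:
euca-authorize -P tcp -p 80 -s 0.0.0.0/0 default     # open HTTP in the security group
# then, inside the running instance:
#   sudo apt-get install apache2
# clients browse to http://<instance-public-ip>/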
I run Fedora 13 on my HP Pavilion dv6000. I am looking for a way to remotely access my workstation running Windows XP from my Fedora machine. Most of the information I found explained how to access a Linux box from a Windows environment, and the one place I found that showed the other way around was incomprehensible.
Our company servers run Windows XP, and we have successfully installed remote desktops on other laptops running Windows. I thought of running the remote desktop from a virtual machine Windows session, but that seems a cumbersome way to do it.
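The Linux-to-Windows direction is actually the easy one: rdesktop (or GUI front-ends built on it) speaks RDP directly to an XP box that has Remote Desktop enabled; a minimal sketch, with the hostname, user and domain as placeholders:
rdesktop -u myuser -d MYDOMAIN -g 1280x800 workpc.example.com
# -u = Windows login, -d = domain, -g = window geometry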
I'm trying to access data from a TFTP server running on my Fedora 15 box. When I try to read a file from the TFTP path, the response is a timeout. I even tried to get the data on localhost itself, and there I get the same timeout. I have tried every permission mode.
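Two things commonly cause TFTP timeouts on Fedora: the firewall not passing UDP port 69 and SELinux confining the server to its default directory; a sketch of the basic checks ("testfile" is a placeholder, and the served files are assumed to live under the default /var/lib/tftpboot):
sudo iptables -I INPUT -p udp --dport 69 -j ACCEPT   # temporarily allow tftp through the firewall
sudo chkconfig tftp on                               # make sure the xinetd-managed tftp service is enabled
sudo service xinetd restart
tftp 127.0.0.1 -c get testfile                       # quick local test against /var/lib/tftpboot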
In the past, I've installed Internet services as daemons and under xinetd with no problems. Those approaches do not meet my needs. And, perhaps, nothing will.
- The service was converted from VB6 to wxPython. It has a GUI which is accessed with either "remote desktop" or VNC.
- The wxPython service works on Windows and can be accessed from other hosts on my LAN.
- The wxPython service works on CentOS and Fedora, but can only be accessed from within the server host - even from other user ids. I cannot get to it from other hosts.
- ipchains, AKA the firewall, has the ports marked for INPUT.
- The server host uses autologin to fire up a user id in group "user"; I do not want it running as "root". The .bash_profile fires the service up.
- The service is heavily multi-threaded, and supports devices connected to serial ports asynchronously along with the ephemeral port threads (all this works).
There are some programming solutions that I would rather not develop:
- a proxy service that runs under xinetd;
- separating the GUI code from the Internet and serial port code, and allocating a "control" port for remote GUI control, a la SAMBA & SWAT.
Is there any hope that I can run it as is, by doing some network configuration?
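Possibly. The two things to pin down are what address the service actually binds to and whether the port is open in the firewall; a quick sketch, with PORT as a placeholder for whatever the service listens on:
netstat -tlnp | grep PORT     # 127.0.0.1:PORT means the code binds to loopback only and no
                              # firewall change will help; 0.0.0.0:PORT means it should be reachable
sudo iptables -I INPUT -p tcp --dport PORT -j ACCEPT    # otherwise, open the port for this session
sudo service iptables save                              # persist it (CentOS/Fedora init-script style)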
I am a newbie to Linux of 4 weeks. I set up my first web server and it works flawlessly - when accessed from an external network or from other computers on my LAN. However, I cannot access it from the computer the server runs on. I have found numerous people with similar problems, but the flavour I am experiencing is somewhat different and no solutions I have found apply to it. I have two network interfaces on my server: eth0 (public static IP, connected to the internet) and eth1, connected to the LAN 192.168.1.0/24 range. The server is 192.168.1.1.
I added "192.168.1.1 www.server.com" to the /etc/hosts on the server and also on the other machines on the LAN. All the other machines can open website without any problem.
However, the server itself only opens the website if the address is localhost. The internal IP, i.e. 192.168.1.1, gets a timeout and so does www.server.com. I do not understand why the record in /etc/hosts doesn't point it in the right direction. It seems that when I open the address 192.168.1.1 it still gets routed to the external network. I have seen DNAT used to deal with this problem, but it didn't work in my case (maybe I didn't do it correctly). I have spent the whole evening/night trying to sort it out; it's 4 AM now, going to bed frustrated and angry (at myself hahaha). Still like Linux very much, won't be going back to Windows. Please help.
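If anyone else hits this, the quickest way to see where the request actually goes is to watch the loopback interface while making it; roughly:
sudo tcpdump -n -i lo port 80 &    # locally generated traffic to 192.168.1.1 should show up here
curl -v http://192.168.1.1/        # if tcpdump on lo stays silent, routing/firewall is sending the
                                   # packets elsewhere; SYNs with no reply point at an INPUT rule on lo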
I am at a loss. I cannot access my work remote desktop via the Terminal Server Client on my wired box running Ubuntu 10.10. My wireless laptop is able to connect right away once I establish the VPN connection. The VPN connection is established on both boxes with no problems.
When I try the Terminal Server Client on my wired box, it says it cannot establish a connection. Yet my wireless box connects immediately!
I checked /etc/dhcp3/dhclient.conf and /etc/resolv.conf to see if there were any differences, but they are essentially the same. When the vpnc connection is up, both boxes recognize it and I am able to ping the IP address shown when I do an "ifconfig" in the terminal.
What could the problem be? Is there anything I need to configure on a wired computer versus a wireless one? What else can I check?
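To narrow it down, compare what the VPN actually pushes to each box and test the RDP port directly; a sketch, with the host name as a placeholder:
ip route                             # compare this on the wired and wireless boxes with the VPN up
nc -zv rdp-host.example.com 3389     # checks whether the RDP port is reachable at all from the wired box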
I'm trying to get vsftpd running with both anonymous and local user access to the same folder. The directory I'm using is /tftp with the following permissions:
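Whatever the directory permissions end up being, the vsftpd side of this usually comes down to a few directives; a sketch for /etc/vsftpd.conf, assuming a shared /tftp tree (adjust write access to taste):
anonymous_enable=YES
local_enable=YES
write_enable=YES
anon_root=/tftp          # where anonymous logins land
local_root=/tftp         # where local users land
chroot_local_user=YES    # keep local users inside /tftp as well
anon_upload_enable=NO    # flip to YES only if anonymous uploads are really wanted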
Hi. I have Ubuntu 10.04; nm-applet is running in the background, and my battery icon and sound icon are showing, but my network icon has been missing for the past 2 days. It was working fine before, but now it's not. How can I fix this issue if I don't have an Ethernet cable? Is there a way to roll back the recent updates, or do I need to reinstall my network manager?
I've tried restarting the system and I've tried killing nm-applet and reloading it using Alt+F2. I get some debug error.
When I try to run nm-applet --sm-disable
It says an instance is already running and then gives me a warning.
I tried removing "iface eth0 inet dhcp" from /etc/network/interfaces and then tried restarting with "sudo /etc/init.d/networking restart".
It says:
What can I do to connect to the internet? I have a flash stick, if it's possible to download a .deb package on this Mac and transfer it over to my other laptop to fix the problem.
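Before reinstalling anything it is worth simply restarting the daemon and the applet; a sketch for 10.04 (and yes, a .deb carried over on the flash stick can be installed offline with dpkg):
killall nm-applet                        # stop the stuck applet
sudo service network-manager restart     # restart the NetworkManager daemon
nm-applet &                              # relaunch the tray icon
# offline install of a downloaded package:  sudo dpkg -i package-name.deb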
I can't figure out why, but my processor is running at 100% on all four cores and the fan is running at max speed. All I did was double-click an a.out file created by g++, and it has been running at full speed ever since.
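If the program is just spinning, it can be found and stopped from a terminal; for example:
top              # confirms which process is eating the CPU (press q to quit)
pkill a.out      # kill every process named a.out
# or, more surgically: kill <PID> using the PID shown by top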
Sometimes I connect to my Debian box from another computer (using SSH from Cygwin or Linux), and once in a while I want to run some console apps. Sometimes one of these apps complains about another instance: "Error: an instance of newsbeuter is already running (PID: 2496)". Is there a workaround for this issue at all, without killing the original instance? The reason I do not want to kill the app is that there might be two users connected to the same machine who are using the same app.
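One workaround (an assumption on my part rather than something I've tested on that box): newsbeuter's lock is tied to the cache file, so pointing a second instance at its own cache lets both run at once:
newsbeuter -u ~/.newsbeuter/urls -C ~/.newsbeuter/config -c ~/.newsbeuter/cache-ssh.db
# -c picks an alternative cache file, so this session does not trip over the "already running"
# lock held by the first one; note the two sessions then keep separate read/unread state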
Is suid disabled for running all home-made bash scripts, or just for running them as root, or something else?
Who would know for sure?
I googled several combinations of "Mandriva Linux how-to suid disabled setUID" etc.; so far all I found was "many distributions are disabling suid for security reasons" - nothing specific.
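For what it's worth, on Linux the kernel simply ignores the setuid bit on interpreted scripts regardless of distribution, which is easy to verify; a small sketch:
printf '#!/bin/bash\nid -u\n' > suid-test.sh   # a trivial script that prints the effective uid
chmod 4755 suid-test.sh                        # set the setuid bit (ls -l now shows -rwsr-xr-x)
./suid-test.sh                                 # still prints your own uid: setuid is ignored on scripts
# the usual workarounds are a small setuid C wrapper or a sudoers rule for the script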