I need to share some space on my hard disk with my wife over our home network. I'm dual-booting Ubuntu and Windows.
My Ubuntu partition is limited to 30GB, and the Windows partition gets all the remaining disk space. Does /host suffer any size limitation when accessed through Ubuntu, or is its size the same as the Windows partition?
EDIT: For the record, she's using Windows XP on her machine, only... no dual boot. Couldn't get her off Windows yet, but I'm still trying.
I recently converted an .avi file to a DVD .iso, and when I try to play it in my DVD player I get this message: "playback prohibited by area limitations". I was wondering what that means and how I can get this movie to play.
Coming to the point, as per the subject: I've been playing (read also as "messing up") with mldonkey for the last few weeks and have managed to get it working correctly on my home network (behind a router and firewall). The next step in my plan was to enable remote access to the mldonkey web interface from any external network, for example a friend's PC. I'm aware of the "IP Access Restriction" in the mldonkey configuration file (downloads.ini), but there I can only specify an IP (or IP range) to allow access.
So the question is: how can I disable* IP restrictions on access, so that with a DNS-aliasing service I can reach the mldonkey web page from virtually anywhere?
(*) = maybe this is not the correct word but it explains the concept.
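Just to show what I'm working with, the relevant block in my downloads.ini currently looks roughly like this (the IPs are placeholders); what I'm after is an entry, or some other option, that effectively whitelists everyone:
Code:
allowed_ips = [
  "127.0.0.1";
  "192.168.1.10";
]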
I'm considering setting up a virtual machine running Windows, with Ubuntu 10.10 as the host OS, for those cases where I have a Windows-only program. I understand that using a VM will cost some performance, but are there other limitations to what the OS in a virtual machine can do compared to "running on bare metal"?
For example:
Can a VM play games like Dragon Age: Origins or Civilization V? (Possibly with poorer framerates and/or lower resolution, but does it play at all?)
Can a VM rip DVD/Blu-ray discs using AnyDVD or a similar Windows program?
Can a VM handle new hardware that requires dedicated drivers, when the drivers are only available for the OS running inside the VM? (E.g. graphics card, digital camera, card reader for smart card authentication.)
Is it possible to say anything about "general limitations" of VMs, or is this wholly dependent on the specific VM?
I noticed that firefox sometimes uses a lot of memory. Can something like setrlimit be used to control it? I tried to use it on the command line, but it didn't work.
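To make the question concrete, this is roughly what I mean (the 2 GB figure is just an arbitrary number I picked); as I understand it, ulimit is the shell front-end to setrlimit and only applies to processes started from that shell:
Code:
# cap the virtual memory of this shell and its children, then start firefox from it
ulimit -v 2097152   # value is in kilobytes, so this is 2 GB
firefox &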
I have a network camera, with a linux OS. We need it to be really precise in its timing for the specific application we want it for. But it's not!
After killing some of the garbage processes on it (which helped, but not enough), it seems that there is some kind of bandwidth limiting applied to the outgoing traffic.
My question: What are all the things that should be done on a machine, running linux, to remove all the bandwidth limitations?
Notes:
1- With my very basic knowledge about traffic shaping in linux, I have made these observations:
# tc qdisc ls dev eth0
qdisc pfifo_fast 0: root bands 3 priomap 1 2 2 2 1 2 0 0 1 1 1 1 1 1 1 1
# tc qdisc del dev eth0 root
RTNETLINK answers: No such file or directory
2- I am trying to download the images from an HTTP link, and I currently get about 14Mbps. I want it to reach at least 50Mbps. I can download at speeds higher than this when downloading from another PC on a Windows network, so I assume this is not a problem with the cables. Also, the eth0 on the camera is said to be a 100Mbps device.
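For reference, these are the generic checks I intend to run on the camera (assuming eth0 is the right interface); as I understand it, the pfifo_fast shown above is just the kernel's default queueing discipline and does no rate limiting by itself:
Code:
# check the negotiated link speed/duplex; a 10Mb/s or half-duplex link would cap throughput on its own
ethtool eth0
# list the queueing disciplines; anything other than the default pfifo_fast could be shaping traffic
tc qdisc show dev eth0
# remove any custom root qdisc ("No such file or directory" just means there was none to delete)
tc qdisc del dev eth0 root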
Are there limitations on what can be started from /etc/pm/sleep.d after system wakeup? I wrote a Perl script to randomly change my GNOME desktop background (showthread.php?p=10654538) and I want to run that script when my laptop wakes up. I followed this guide: and wrote this code
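For comparison, my understanding of the general shape of a resume hook under pm-utils is below (the file name, username, and script path are made up); I gather these hooks run as root without the user's DISPLAY or session D-Bus, which may be part of my problem:
Code:
#!/bin/sh
# /etc/pm/sleep.d/99_wallpaper  (hypothetical name; the file must be executable)
case "$1" in
    resume|thaw)
        # run the wallpaper script as the desktop user, pointed at the running X display
        su - myuser -c "DISPLAY=:0 /home/myuser/bin/random_background.pl" &
        ;;
esac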
I am working on Fedora, and I am trying to figure out whether there is any way to set limitations/rights on the Linux desktop, so that we can control which things a user can access. For example, I don't want the user to have access to the KDE Control Center or to be able to change its settings, and I'd like to disable changing the wallpaper or the screen saver. How can that be done in Linux?
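One avenue I've been reading about is the KDE Kiosk framework, which locks features down through immutable config groups; my understanding of the format is below, though the file location and the exact action names would need to be checked against the KDE version Fedora ships:
Code:
# system-wide kdeglobals (e.g. under /etc/kde/); [$i] marks the group immutable
[KDE Action Restrictions][$i]
shell_access=false
action/options_configure=false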
I have a network of 100 machines, all running Ubuntu Linux. Is there a limit to the number of machines that can connect to one single machine at the same time? For example, can 99 of my machines maintain continuous SSH connections to the 100th machine? Can every one of the 100 machines maintain a continuous SSH connection to all of the other 99? How much memory does each such connection take?
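For context, the only explicit per-host knobs I've found so far are on the sshd side of the target machine, e.g. in /etc/ssh/sshd_config (the values below are examples, not recommendations); beyond that, I gather each accepted connection forks its own sshd process, so the practical ceiling is memory and file descriptors rather than a fixed connection count:
Code:
# /etc/ssh/sshd_config (excerpt)
# caps concurrent *unauthenticated* connections (a start:rate:full form also exists)
MaxStartups 10
# caps multiplexed sessions per network connection, not the total number of connections
MaxSessions 10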
I need to put together a solution for my home network. I have a Linux server running CentOS 5.5. What I need to do is create a virtual interface for my clients and cap its bandwidth at 1Mb/s shared between them, while my real bandwidth is 2Mb/s. Beyond that, I have two questions:
1. How do I set this rate limit on that interface?
2. How do I configure this interface so that it works and routes the client data to my ADSL router?
I have already generated the virtual interface using the Webmin management tool, so now I need to set its rate and route the data.
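For the rate-limiting part, from what I've read in the traffic-control docs a simple token-bucket filter is the usual starting point, something like the sketch below; I'm assuming the clients are reached through eth1, and my understanding is that tc attaches to the real device rather than to an alias like eth1:1:
Code:
# cap egress toward the clients at 1 Mbit/s with a small burst allowance
tc qdisc add dev eth1 root tbf rate 1mbit burst 10kb latency 70ms
# check the counters to confirm traffic is actually passing through the qdisc
tc -s qdisc show dev eth1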
I am working on a cluster for a molecular dynamics class and I have to edit my FORTRAN code (only the newest and best for me!). In order to get through to the cluster I have to ssh in. The network on which the cluster resides is behind a firewall, so I have to ssh through the firewall into the network first.
This is fine: I can log in and move files and folders as needed, including sftp-ing into Host 1 and then into the cluster, so I can transfer files from the cluster to the host and then from the host to me. This gets rather tiresome, so it would be nice to edit the files in place.
The problem is that when I access my code with emacs it launches the emacs client on Host 1, with no mouse support. I know the purists will howl about how I should be using keyboard shortcuts, but I am a chemist and not a programmer, so the mouse is very nice for me. Is there any way I can perhaps mount the cluster using sshfs so that when I open my code it launches a local instance of emacs? Sorry if this is the wrong forum, but I thought it was network related.
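To be clear about what I'm imagining: a jump through the firewall host configured in ~/.ssh/config, plus an sshfs mount, roughly like this (hostnames and paths are placeholders):
Code:
# ~/.ssh/config -- tunnel connections to the cluster through the firewall host
Host cluster
    HostName cluster.internal.example.edu
    User me
    ProxyCommand ssh firewall.example.edu nc %h %p

# then mount the cluster's home directory locally and edit with a local emacs
mkdir -p ~/cluster-home
sshfs cluster:/home/me ~/cluster-home
emacs ~/cluster-home/md_code.f90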
Many of the mails sent from my mail server are sitting in the queue. The main reason is that they are deferred by domains like Yahoo, AOL, etc., but there is one more error I keep getting, and that is "Host Unknown". Below is an example from the mail log. The catch is that a test mail sent to the same email ID from my personal mail account on the same server (i.e. URL) was delivered. However, another mail containing client information, sent from customercare@mycompanysdomain, ended up in the queue.
There are more examples of the same; around 20 domains have the same problem.
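The first thing I plan to compare is what the mail server itself resolves for those recipient domains, since as far as I know "Host Unknown" means the MTA could not find any host to deliver to; a quick check from the server (the domain below is a placeholder):
Code:
# look up MX records for an affected recipient domain, as seen from the mail server
dig MX clientdomain.example +short
# fall back to the A record if no MX record exists
dig A clientdomain.example +short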
I've got a bunch of machines (~10) that I share with my co-workers. I have the appropriate .ssh files set up so I don't get prompted for a password when I ssh. Currently I ssh into these hosts and then run top to check the load before I start using a machine, because I don't want to be on a busy host. Can someone show me how to write a script that finds the least-busy host from a list of hosts to check? (Hardcoded is fine.)
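To be concrete, something along these lines is the shape I'm imagining (host names are placeholders, and I'm assuming the 1-minute load average from /proc/loadavg is a fair measure of "busy"):
Code:
#!/bin/bash
# print the host with the lowest 1-minute load average from a hardcoded list
hosts="node01 node02 node03"
for h in $hosts; do
    load=$(ssh -o ConnectTimeout=5 "$h" cat /proc/loadavg | awk '{print $1}')
    echo "$load $h"
done | sort -n | awk 'NR==1 {print $2}'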
I have two servers: one has an empty /, and the other has a subdirectory holding a large amount of data (about 4 GB) in many, many files. I need a way to transfer the files en masse from the server with the large number of files to the one that is essentially blank. I don't have space on the full host to simply gzip all the files. I've googled this and see that there may be some combination of tar and/or gzip that will let me do this with some sort of redirection.
I really need an example line of how this can be accomplished. If my explanation seems rather sparse, I can supply more details.
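From my googling, this is the sort of pipeline I think people mean, though I haven't dared run it yet; the paths and hostname are made up, and the idea is that nothing ever gets staged on the full server's disk:
Code:
# run on the full server: pack the subdirectory and unpack it directly on the destination
tar czf - -C /data/bigdir . | ssh user@newserver 'tar xzf - -C /data/bigdir'
# (the target directory must already exist on the destination, or point -C somewhere else)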
Bit of an odd one, this. I've migrated a website from my old server to a new machine. Both servers run Ubuntu + Apache2, and both serve only a single site apart from the default site. I've pointed the domain name at the new IP address. The trouble is that after moving the virtual host config into sites-available, with the necessary link in sites-enabled, Apache attempts to serve from the default web root (/var/www) rather than the actual site content (in /var/www/technology). So, for example, an attempt to browse.
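For what it's worth, my working theory is that the requested hostname isn't matching any ServerName/ServerAlias, so Apache falls back to the first (default) vhost; the block in sites-available should, as far as I understand it, look roughly like this (the domain is a placeholder):
Code:
<VirtualHost *:80>
    # must match the hostname the browser requests, or Apache serves the default site instead
    ServerName www.example.org
    ServerAlias example.org
    DocumentRoot /var/www/technology
</VirtualHost>
# enable and reload:  sudo a2ensite technology && sudo /etc/init.d/apache2 reload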
I'm trying to get Synergy up and running between my Windows 7 (server) host and my Arch Linux (client) host. With rare exceptions, Synergy works perfectly on my Windows host; however, every time I try to run Synergy on my Linux machine I get the following error in messages.log:
[code]...
I'm running Arch with a barebones Xorg install and SLiM with LXDE. I'm not sure what in the world is causing the problem and haven't been able to find anything of substance in a search.
I am a bit of a n00b when it comes to Linux, but I am setting up a test environment where I have an appliance monitoring network traffic. Part of my test requires me to copy a file via RCP from one host to another. I have two Ubuntu boxes. I have allowed the subnet in /etc/hosts.allow for ALL, and I have installed rsh-server.
When I try to copy the file, it looks like it uses SCP instead of RCP, because it connects to port 22 instead of 514. Also note that the traffic must be unencrypted, which is why I'm trying to use rcp. Is there any way to make Ubuntu go old school and allow me to use rcp instead?
Code:
testuser1@ubuntu:~$ rcp /home/testuser1/test.txt testuser1@10.46.41.38:/home/testuser1
ssh: connect to host 10.46.41.38 port 22: Connection refused
lost connection
testuser1@ubuntu:~$ rcp
usage: scp [-12346BCpqrv] [-c cipher] [-F ssh_config] [-i identity_file] ...
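From what I've read, on Ubuntu the rcp name is managed through the alternatives system and, with only openssh-client installed, ends up pointing at scp; is this the right track? The package and commands below are as I remember them, so please correct me:
Code:
# install the classic r-command clients (the server side already has rsh-server)
sudo apt-get install rsh-client
# see what the rcp name currently resolves to, and switch it to the netkit rcp if needed
update-alternatives --display rcp
sudo update-alternatives --config rcp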
I have the impression that not that many people understand the scope and limitations of the GNU General Public License. This is roughly my basic understanding of it: if I take a program covered by the GNU license, first of all I have the right to get the source code. Second, I can modify it at will. Third, I can redistribute it at will too, but the new code will necessarily carry the same GNU license. This made me wonder how people can actually charge for software derived from Linux, for instance Red Hat. Well, my impression is that they really make a profit only out of services. In this thread [URL] I think I found a lot of confusion, even from a moderator (no offense intended). Red Hat is based on Linux and is necessarily covered under the GNU license. Somebody could presumably buy the program from Red Hat and make it available at no cost.
Nevertheless, the moderator decided to warn the user. In this article [URL] it says the following: "Our training is not designed to promote vendor lock-in. Though these courses are based on Red Hat Enterprise Linux, the source code for [RHEL] is available to the community via the GPL [GNU General Public License]," said Red Hat spokeswoman Leigh Day. This thread [URL] shows yet more confused people. Is there a glitch in this type of license that prevents programs like RHEL from being redistributed for free? Why doesn't their license page mention the GNU license? Or is the problem just that people get overwhelmed by this license, are afraid of being penalized, and get paralyzed? By the way, RHEL is just the example; the key question is about the license!
A client brought in a 160GB external HDD and wanted to get the files off it. There appeared to be no partitions on the disk, but I thought it might have been formatted to use the whole disk. I tried to mount it as the various FS types the client thought it might have been, to no avail.
I ran testdisk on it, which told me that it previously had a Mac partition table and a 210GB partition on it (which is larger than the disk). Could anyone enlighten me as to whether or not this is even possible, and if so, how I could retrieve the data?
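My current plan, unless someone advises otherwise, is to image the disk before experimenting any further; roughly this, assuming the drive shows up as /dev/sdb (the device and file names are placeholders):
Code:
# take a read-only image of the drive first (GNU ddrescue keeps a log so it can resume)
sudo ddrescue /dev/sdb /srv/recovery/disk.img /srv/recovery/disk.log
# photorec (ships with testdisk) can carve files out of the image even without a valid partition table
photorec /srv/recovery/disk.img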
Ubuntu 10.04, xsane 0.996, Brother MFC 240c scanner.I just finished writing a long dissertation on my problem with this scanning environment (which I will spare you). In a nutshell the resulting image, when printed, is smaller than the original document. In writing my dissertation for this post I determined that the cause of the issue is that xsane believes I am scanning an 8.5 x 14 inch document when I am in fact scanning an 8.5 x 11 letter. So the question is... can I change the size to 8.5 x 11? and if so, how? I have not found anything in the xsane Preferences.
Today I upgraded GNOME to version 3.18 via the official testing repository. After this, the icons on the desktop and in Nautilus are bigger than before. Also, the gaps between icons are smaller than before. I tried changing the theme to the default (Adwaita) and then running gtk-update-icon-cache, but without result.
Normal view - icons are big for this view. URL....
Small view - icons are still big for this view. URL...
How can I change the icon size and the gap size? Or is it a bug in this version?
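One thing I did find is that Nautilus keeps its default zoom level in GSettings, so I can at least force it smaller from a terminal, though I don't know whether 3.18 changed the accepted values:
Code:
# show the current icon-view zoom level and set it one step smaller
gsettings get org.gnome.nautilus.icon-view default-zoom-level
gsettings set org.gnome.nautilus.icon-view default-zoom-level 'small'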
Does lvresize with the --resizefs option resize the logical volume and then resize the filesystem? I mean, we don't need to use resize2fs separately? I looked at the man pages but they don't explain this option.
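My reading (please correct me) is that --resizefs makes lvresize call fsadm, which in turn runs the appropriate tool such as resize2fs, so a single command like this should grow both the LV and an ext4 filesystem on it (the VG/LV names are made up):
Code:
# grow the logical volume by 10G and the filesystem inside it in one step
lvresize --resizefs -L +10G /dev/vg0/lv_home
# short form: lvresize -r -L +10G /dev/vg0/lv_home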
I have just installed a Lexmark S605 printer on a wireless network. The printer works OK, but when I print a document, even though it shows the correct size in the print preview, the printed output on the page is tiny and rotated 90 degrees. I've tried various drivers from the Lexmark website and also messed about in the printer settings, but nothing seems to make any difference.
I'm trying to ssh from my laptop to my desktop (both Fedora 14) over a local network. I can ping my desktop and get responses, but if I ssh to it, I receive
ssh: connect to host 192.168.100.xxx port 22: No route to host
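For context, since ping works but ssh gets an immediate "No route to host", I suspect the desktop's firewall is rejecting port 22; these are the checks I'm planning to run on the desktop (assuming the stock iptables service on Fedora 14):
Code:
# confirm sshd is actually running and listening
sudo service sshd status
# look for a REJECT rule covering tcp/22 in the INPUT chain
sudo iptables -L INPUT -n --line-numbers
# temporary test only (not persistent): allow ssh ahead of any reject rules
sudo iptables -I INPUT -p tcp --dport 22 -j ACCEPT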