General :: Yum Local Install - Package And All Dependencies In Local Directory?
Dec 9, 2009
I am trying to install tilp, a program for the link between a computer and a Texas Instruments calculator. I have downloaded all the packages to a local directory. I tried telling yum to install all the packages at the same time, but the dependencies still fail to resolve (even though they are all in the directory). I don't know if it would be safe to force the install without the dependencies (even though I would install them later).
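If all the .rpm files are already in one directory, yum's localinstall can take them in a single transaction and resolve the dependencies among them; a minimal sketch (run as root in the download directory; --nogpgcheck is only needed if the packages are unsigned):
Code:
cd /path/to/downloads
yum --nogpgcheck localinstall *.rpm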
Well, I am facing a problem while doing lab questions.
I must use DLXLinux bundled in Bochs (bochs.sourceforge.net).
I am required to use the /usr/local directory.
In the /usr directory there is no directory named 'local', but there is one entry called 'local@'. So when I try to use the mkdir command to create a 'local' directory in /usr, I get an error: "cannot make directory.....".
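The trailing '@' is how ls -F marks a symbolic link, so /usr/local already exists as a symlink (which is why mkdir refuses to create it); checking where it points should clarify things:
Code:
ls -ld /usr/local      # shows the link and its target
readlink /usr/local    # prints just the target path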
I have a local repository and have also declared a remote one. I want to tell apt-get to install a package from the local repo if it exists there, but it seems to start with the remote one. Here are my sources:
deb file:/home/CD1 squeeze main
deb file:/home/extra6 /
deb http://ftp.fr.debian.org/debian squeeze main non-free contrib
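One hedged approach is apt pinning: plain file:/ archives usually carry no Origin field, which apt treats as origin "", so their priority can be raised above the remote mirror. A sketch of /etc/apt/preferences:
Code:
Package: *
Pin: origin ""
Pin-Priority: 1001
A priority above 1000 even allows downgrades to the local version; use something like 990 if that is not wanted.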
One of Konqueror's unique features is that I can name a local process as the action in a form. When I submit that form, the local process is executed. Very helpful for certain offline tasks. What would make it even better is if I could find a way to pass some data to that local process from the HTML page. This could be the content of a hidden input item, etc. Alternatively, if there is a way for Konqueror to create or update a local file with data from the HTML page, that would achieve the same end.
What command would you use to read about the sync system call (not the sync command)? How would you read a local man page for sync that was kept in /usr/local/share/man?
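Presumably the exercise is after the manual section number and man's -M flag:
Code:
man 2 sync                          # the sync(2) system call, not sync(1)
man -M /usr/local/share/man sync    # read the copy kept under /usr/local/share/man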
How should I remove a package from a local cache? I tried removepkg packagename.txz and it deleted the package's files. But when I tried to install it back, it said the package was in the local cache, so instead of downloading it again it installed it directly. I want it to download the package once more, because the package isn't working.
I am not able to access the directory /usr/local. But when I do ls I am able to see it.
Code:
[root@indra ~]# ls -ld /usr/local
drwxr-xr-x 2 root root 0 Feb 9 12:11 /usr/local
[root@indra ~]# cd /usr/local
-bash: cd: /usr/local: No such file or directory
[root@indra ~]#
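A directory of size 0 that ls can stat but cd cannot enter often points at a dead automount or stale NFS mount rather than a real directory; a quick diagnostic sketch:
Code:
mount | grep /usr/local    # is something mounted (or supposed to be) there?
grep local /proc/mounts    # check the kernel's view as well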
I am new both here and in Linux. As the subject says, I would like to learn how to copy a directory (not a file) from the terminal with a progress bar showing. The copy is local, i.e., not to another computer. My distro is CentOS 5.5. I know that if I did it with Nautilus I would see the progress, but I want to learn how to do it from the terminal. I know that the pv command can show a progress bar, but from what I saw, it works well for files, not for directories (recursive copies).
Is it possible to use pv for directories? If yes, could you please show me the syntax? I also saw some people mention that rsync can show a progress bar; I tried it, but it didn't work out, perhaps because I got the syntax wrong. If rsync can really be used to copy directories with a progress bar, could you show me the syntax? Any other ideas on how to do it? I would like ideas that do not involve a script, i.e., just something I can do with the regular commands.
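rsync's --progress shows a bar per file, and for one overall bar the tree can be piped through pv with the total size precomputed; a sketch with placeholder paths:
Code:
# Per-file progress (note the trailing slash on the source):
rsync -a --progress /path/to/src/ /path/to/dst/

# One overall bar: tar the tree through pv, sized from du's byte count:
tar cf - -C /path/to src | pv -s $(du -sb /path/to/src | awk '{print $1}') | tar xf - -C /path/to/dst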
I have a directory on my server at /home/dave/www/images/site (ext3) which I want to mount directly on my Windows computer so that I can transfer data easily via a command line tool. Is that possible?
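The usual route is to export the directory with Samba so Windows can map it as a network drive and copy files from its own command line; a minimal smb.conf fragment (the share name is an assumption):
Code:
[site-images]
    path = /home/dave/www/images/site
    valid users = dave
    read only = no
After smbpasswd -a dave and a Samba restart, the Windows side can map it with: net use Z: \\server\site-images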
I am currently interning at a place and my job is essentially to learn UNIX. My supervisor gives me problems here and there to help guide my learning, but for the most part I am teaching myself. Needless to say I have run into a few obstacles. For instance: create a *one* line command that, using tar, will collect the full /usr/local directory (you need to run this as root again) and copy the whole /usr/local structure under /opt. For example, /usr/local/bin/hello will become /opt/local/bin/hello, etc. I want this as follows:
1. /usr/local is collected by tar, but the output of this tar command is its stdout
2. what you get from the previous stdout, you compress with gzip and send to stdout again
3. get this output and decompress with gzip
4. get this output and pipe it to tar in a way that will extract the tree under /opt
If anyone knows how I could go about doing this, please let me know, or at the very least point me in the right direction. What I've got so far (which could be completely wrong) is:
tar cvf - usr/local/ | gzip -c - | gunzip -c - | tar xvf -
In theory I feel like this should work (except for extracting the tree under /opt... I'm kinda stuck there).
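A hedged correction of that attempt, assuming GNU tar: use -C so the archive stores paths relative to /usr, which then land under /opt as /opt/local/...:
Code:
tar cvf - -C /usr local | gzip -c | gunzip -c | tar xvf - -C /opt
The gzip/gunzip pair is redundant in practice, but it satisfies steps 2 and 3 of the exercise.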
I'm working with a dual-boot laptop running Ubuntu 10.0/Windows 7 and a Debian 5 VPS, though the OSes shouldn't have much impact on my question.
What I would like to do is create an HTML page that I can upload to my VPS which lists all of the files/folders on my local 2TB hard drive (specifically media such as movies, music, and TV shows). The media obviously will not reside on the server, but I would like at least a list which will allow me to select, for instance, a band's folder so that it redirects me to the albums in the directory below it.
Ultimately, I'm looking for open directory browsing without actually having the media on my server. I have been attempting to create something to this effect using lynx; however, I'm not sure if it can be done with that command, or if it's even possible for that matter.
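One hedged option is GNU tree's HTML mode, which writes a static index page that can be uploaded to the VPS as-is (the mount point is an assumption):
Code:
# Run locally against the 2TB drive, then upload index.html to the VPS:
tree -H '.' -T 'Media Library' -o index.html /media/2tb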
I want to update all the machines on the network from a central repository, which is on my master server and whose archive directory is shared through Samba. I searched the man page of sources.list and found that there is an option for this, but I haven't been able to implement it. Can anybody kindly tell me the way to do this?
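One hedged way is to mount the Samba share on each client and point sources.list at it with a file: entry (host, share, and path names are assumptions):
Code:
mount -t cifs //master/archives /mnt/repo -o guest
# then in /etc/apt/sources.list:
# deb file:/mnt/repo ./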
I have a Postfix mail server on Ubuntu 10.04 LTS behind a router, so all local users fetch and send mail through MS Outlook using the local IP. Sometimes when the internet goes down and a mail is sent, it bounces back immediately saying the domain was not found. Can you please tell me how to configure the Postfix server to hold all mail rather than bounce it when the internet fails, and to send it through once the connection is restored (outages last around 15-30 minutes)?
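One hedged approach is Postfix's soft_bounce setting, which turns hard failures (such as the DNS "domain not found") into deferrals, so mail stays queued and is retried:
Code:
# /etc/postfix/main.cf (fragment) -- then run: postfix reload
soft_bounce = yes
The trade-off is that genuine hard failures are also deferred until the queue lifetime expires, so some admins prefer to fix the underlying DNS resolution instead.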
I'd like a way to see all of the devices on my local network and what their local IP addresses are. I recall that I used Wireshark to troubleshoot a similar problem a while back, but it doesn't seem to have a way to see all of the devices, only the traffic. (I'd like to do this without having to physically interface with my router if possible, and I am on an encrypted network, if that matters.)
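A ping scan of the subnet with nmap lists live hosts and their IPs (adjust the range to your network; older nmap versions spell -sn as -sP):
Code:
nmap -sn 192.168.1.0/24
arp -a    # or: what the kernel's ARP cache has already seen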
I have installed a web server on my local network. Everything is well configured, and web pages are served correctly from the Internet (outside the local network) using the domain or the public IP. The issue is when I try to view those web pages (using the domain or the public IP) from inside the local network. In that case the router config page (192.168.1.1) is shown instead of the web pages. From inside the local network I am only able to see the web pages using the internal IP address (192.168.1.XX).
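That is the classic symptom of a router without NAT loopback (hairpinning). A common workaround is to map the domain to the server's internal address on each LAN client (the domain and IP below are placeholders):
Code:
# /etc/hosts on each LAN client:
192.168.1.20   example.com www.example.com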
I've got an Ubuntu server hosting our websites and other various things here in our own home. We recently switched to a router that doesn't support loopback (abomination), so I've set up hosts files on our computers so we can access our own sites when on our home LAN.
However, we often take our laptops with us as we travel about, and I'm guessing that, due to the hosts files, when we try to access our sites the laptops will look for our server on whatever local network we're connected to, which obviously won't work.
Is there a way to set up something like a hosts file that only resolves the server to its local IP when we're on a specific network (our home one)? Or one that tries the local IP first, then falls back to resolving the domain name and using the external IP if the local IP doesn't work?
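There is no conditional hosts file as such, but on Ubuntu a NetworkManager dispatcher script can swap prepared copies whenever the connection changes; a sketch, assuming the home SSID is "HomeWifi" and two pre-made hosts files exist:
Code:
#!/bin/sh
# /etc/NetworkManager/dispatcher.d/99-hosts
IFACE="$1"; ACTION="$2"
if [ "$ACTION" = "up" ]; then
    if [ "$(iwgetid -r 2>/dev/null)" = "HomeWifi" ]; then
        cp /etc/hosts.home /etc/hosts    # maps the sites to the LAN IP
    else
        cp /etc/hosts.away /etc/hosts    # no overrides; normal DNS applies
    fi
fi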
I am working on a project that sets up packages on the cloud.
For example, if I want to set up Drupal, I need separate machines to handle separate tasks: MySQL on one machine (cloud instance/node 1), the Apache server on another machine (cloud instance/node 2), etc.
So if drupal.rpm has the dependencies apache.rpm (which has dependencies of its own) and mysql.rpm (likewise),
is it possible for the rpm package manager (yum) to handle such customizations and the above requirement?
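For reference, rpm dependencies are declared in the spec file, and yum resolves them onto the single host where the install runs, so splitting them across cloud nodes needs orchestration outside yum itself (kickstart, puppet, custom scripts). A spec fragment showing how drupal.rpm might declare its needs (package names are assumptions):
Code:
# drupal.spec (fragment)
Name:     drupal
Requires: httpd, mysql-server
With that, yum install drupal would pull httpd and mysql-server onto the same machine, not onto separate instances.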
Is there a way to install packages stored on your HD with apt-get, like "apt-get install ./package.deb"? If not, how do you handle the dependencies in a very, very easy way?
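The classic route is dpkg followed by apt's fix-broken pass, which fetches whatever dependencies the .deb declares:
Code:
dpkg -i ./package.deb
apt-get -f install    # pulls in the missing dependencies
Newer apt releases (1.1 and later) also accept a path directly: apt-get install ./package.deb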
I currently have Kernel Linux 2.6.24-26-generic loaded (according to System->Administration->System Monitor->'System' tab) [I'm using 8.04 LTS].
I'm hesitant to remove them because I assumed that the Synaptic Package Manager would have removed them when doing the upgrades. My suspicion is that they are still needed by later versions and that removing them would cause a few problems, to say the least.
Am I safe to either 'Remove' or 'Completely Remove'?
It would be great to save all that space if these can be removed.
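A hedged sketch for checking what is installed and removing one old kernel at a time (the version below is a placeholder; keep whatever uname -r reports as the running kernel):
Code:
uname -r                    # the kernel you are booted into -- keep it
dpkg -l 'linux-image-*'     # everything installed
sudo apt-get remove linux-image-2.6.24-24-generic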
I have recently installed Fedora 12 on a desktop PC, and as my first experience of Linux, I am really impressed. I have now installed several packages and have reached a point where I would like to share the PC with other users (family members in the same house). My question seems so basic I am almost embarrassed to ask it, but could someone explain the best way to create a local shared directory that could be used to store files accessible to everyone (e.g. music, photos, videos, documents, etc.)? There will be three users, and as it is a family PC, they will all have full access.
Reading posts from various forums, I am a little confused about what is the best way to proceed (i.e. what is Linux best practice). The simpler of the two methods is to make the directory using the mkdir command, followed by the chmod command to assign full access rights, for example if the local shared directory is called 'share'. The alternative approach creates a group, a group administrator, etc., and then adds users to the group.
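A sketch of the group-based approach, which is generally considered cleaner than a world-writable directory (the group and user names are assumptions):
Code:
groupadd family
mkdir -p /home/shared
chown root:family /home/shared
chmod 2775 /home/shared        # setgid bit: new files inherit the group
usermod -aG family alice       # repeat for each family member
The setgid bit is the detail that makes this work: files created inside keep the 'family' group, so everyone retains access.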
I've recently had to update my own package blah.cls in the texlive distro on Ubuntu. I duly put it in /usr/local/share/texmf and ran texhash and mktexlsr (the latter just in case). The database has updated: checked with kpsewhich.
Now the problem: I'm able to compile my LaTeX file using that package blah.cls only when I run latex (or Kile, or gedit) under sudo. When I run it without sudo, the error is "can't find the package blah.cls". Obviously other files compile nicely, sudo or no sudo.
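Working only under sudo usually means the files or the ls-R database ended up readable by root alone; a hedged fix is to open up the tree and rebuild the database:
Code:
sudo chmod -R a+rX /usr/local/share/texmf    # read for all, execute on directories
sudo texhash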
I tried to set up an NFS server on my Ubuntu guest in VirtualBox, but I failed when trying to mount a local directory via NFS. I used Ubuntu 11.04. First I ran the following two commands.
I bought a server two days ago and installed CentOS 5 (64-bit), but I have a problem. My server's FTP shows these folders:
maildir
admin_backups
domains
imap
public_html
user_backups
I cannot access the /usr/local directory. How can I access /usr/local? It does not appear in FTP. And I get an error: libmysqlclient.so.15
I'm trying to make a local package server for my offline development network. Can anyone recommend a mirror containing every single package for CentOS 5.4 x86 as well as 64 bit? I've looked around but I haven't had too much luck yet.
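CentOS moves old point releases off the main mirrors, so 5.4 is the sort of thing found on vault.centos.org; a hedged rsync sketch against a mirror that permits rsync (the URL and paths are assumptions):
Code:
rsync -avSHP rsync://mirror.example.org/centos/5.4/ /var/www/html/centos/5.4/
Mirroring both the i386 and x86_64 trees plus updates runs to tens of GB, so check disk space first.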