Fedora :: Updating From Local Files
Jul 10, 2011

I need to update a system, but the network connection is far too slow to run yum over the network. I already have the update directory downloaded. Can I have yum install the updates from those local files?
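If the downloaded directory contains the update RPMs, one approach (a sketch; the path and repo id are assumptions) is to build repository metadata over it with createrepo and point a file:// repository at it:

Code:
# index the downloaded update directory (path is a placeholder)
createrepo /var/local/updates

# /etc/yum.repos.d/local-updates.repo
[local-updates]
name=Local updates
baseurl=file:///var/local/updates
enabled=1
gpgcheck=0

# then update using only the local repository
yum --disablerepo='*' --enablerepo=local-updates update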
View 1 Replies
One of Konqueror's unique features is that I can name a local process as the action in a form. When I submit that form, the local process is executed. Very helpful for certain offline tasks. What would make it even better is if I could find a way to pass some data to that local process from the HTML page. This could be the content of a hidden input item, etc. Alternatively, if there is a way for Konqueror to create or update a local file with data from the HTML page, that would achieve the same end.
View 1 Replies View Related

I have installed Apache using the appropriate 'yum' commands. I didn't experience any problems during the installation. My problem is that I am not able to see any PHP file I created myself (I have installed PHP too). The only PHP file I can see is the one I created using this:
Code:
su -c 'echo "<?php phpinfo(); ?>" > /var/www/html/index.php'
Then I am able to see the information from the above file.
[code]....
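If the files created by hand are not being served, one common cause on Fedora is ownership or SELinux labeling under /var/www/html; a hedged check (the file name is a placeholder):

Code:
# make sure the file is readable by the Apache user
ls -l /var/www/html/test.php
chmod 644 /var/www/html/test.php

# on Fedora, also check and restore the SELinux context for the web root
ls -Z /var/www/html/test.php
restorecon -v /var/www/html/test.php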
I want to update all the machines in the network from a central repository which is on my master server and whose archive directory is shared through Samba. I searched the man page of sources.list and found that there is an option for this, but I haven't been able to implement it. Can anybody tell me how to do this?
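One way this is often set up (a sketch; the server, share and mount point are assumptions) is to mount the Samba share and point a deb file:// line at it, generating a Packages index if the share is just a plain directory of .deb files:

Code:
# mount the shared archive directory from the master server
sudo mount -t cifs //master/archive /mnt/archive -o ro,guest

# if the share is a plain pool of .deb files, build an index once on the server
dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz

# /etc/apt/sources.list entry on each client
deb file:///mnt/archive ./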
View 1 Replies View Related

I have just created a backup script that backs up my system into two (root and home) tar.bz2 files. That was all good and well. The thing is, though, I also created one that I thought would update the tar.bz2 files using the "u" (update) switch. It seems that it won't work, though. I did a search on the matter and found a thread on the Linux Forums saying that "gzip" archives couldn't be updated.
Is this the same with bz2? Is there a way to get it working? It would be handier than creating a new bz2 file every time, as it takes so friggen long to run the full backup (and that's even after excluding what I don't need).
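For what it's worth, tar's -u (update) switch only works on an uncompressed archive, so a .tar.bz2 (like a .tar.gz) cannot be updated in place. A sketch of one workaround (paths are placeholders):

Code:
# decompress, append files that are newer than the copies already in the
# archive, then recompress
bunzip2 backup-home.tar.bz2
tar -uf backup-home.tar -C / home
bzip2 backup-home.tar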
I got a directory with files in it like: 2006-07-01.foo, 2007-08-04.foo. I need to update the timestamps on these files using "touch -t 200607010000 2006-07-01.foo" on each file in the directory, so I came up with the following one-liner:
for i in `ls -1`; do touch -t `ls -1 | sed -n 's%([0-9]{4})-([0-9]{2})-([0-9]{2})(.*)%1230000%p'` $i; done
My goal was to use sed to get the timestamp for touch and then loop through each file and touch it with that timestamp. However, the script is not giving me the results I intended. Can anyone chime in on what I am doing wrong? I have been banging away at this for a couple of hours now and am clueless about what it could be. I also tried another variant such as:
for z in $(ls -1 *.foo); do echo $z $(for i in `ls -1 *.foo | sed 's%([0-9]{4})-([0-9]{2})-([0-9]{2})(.*)%1230000%p'`; do echo "$i"; done); done
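For comparison, a sketch that derives each file's timestamp from its own name, rather than piping the whole directory listing through sed inside the command substitution (which is what makes the loops above hand touch every timestamp at once):

Code:
for f in *.foo; do
    # 2006-07-01.foo -> 200607010000 (the YYYYMMDDhhmm format touch -t expects)
    ts=$(echo "$f" | sed -E 's/^([0-9]{4})-([0-9]{2})-([0-9]{2}).*/\1\2\30000/')
    touch -t "$ts" "$f"
done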
I continue to work on automating the update and deployment of a vendor's WAR files, and have bumped into my next challenge... The vendor-provided web.xml files have entries that look like this:
Code:
<context-param>
<param-name>siteminder.enabled</param-name>
<param-value>false</param-value>
</context-param>
I need to search the file for a param-name and replace the param-value below it with the correct value. I expect sed or awk is the trick on this, but I am not sure how to have it search for one line, and have it update the line below it.
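One way to do this with GNU sed (a sketch; the param name and the replacement value are just examples) is to match the param-name line and then use the n command to operate on the line that follows it:

Code:
# when the matching param-name line is found, read the next line and rewrite
# its param-value
sed -i '/<param-name>siteminder\.enabled<\/param-name>/{n;s|<param-value>.*</param-value>|<param-value>true</param-value>|;}' web.xml

This assumes the <param-value> sits on the line immediately below its <param-name>, as in the snippet shown.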
I was trying to update my Debian Lenny and aptitude gives a whole lot of errors about files it can't fetch due to a 404 error. Apparently the files I need are no longer available in the places they were supposed to be on the Debian servers. It concerns 35 packages that can't be updated. If necessary I can post all the package names and versions (old and updated). I'll give the first two as examples:
[Code]....
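Two things worth checking (sketches, not a certain diagnosis): a stale package index often causes exactly this kind of 404, and if the release has been moved off the main mirrors, sources.list has to point at the archive host instead:

Code:
# refresh the index first; mirrors remove old package versions when new ones land
apt-get update && apt-get upgrade

# if lenny has been moved to the archive, /etc/apt/sources.list can use
# (adjust components as needed)
deb http://archive.debian.org/debian/ lenny main contrib non-free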
Updating Hardy using Update Manager. It complains it can't find certain deb files: with good reason, the whole directory level is missing on the archive. For instance: archive.ubuntu.com/ubuntu/pool/main/l/linux/linux-image-2.6.24-28-generic_2.6.24-28.73_i386.deb
There is no directory /ubuntu/pool/main/l/linux on archive.ubuntu.com. The ls-lR.gz file, however, claims that there is such a directory and that it contains the deb file I'm looking for. Am I not seeing straight or is something really whacked on the archive?
I'm trying to download a large (~18 GB) file using rsync from a server to a client, but the server, for an unknown reason, kills the connection after a time, when I've downloaded only about 8-10 GB. How can I resume the download of this file?
I've tried using the --partial option, but it just restarts the download from zero again. I've tried adding "--append" but I get "rsync: on remote machine: --append: unknown option" because the version is 2.6.3 on the server (and 3.0.6 on the client). I don't have control of the version on the server.
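One approach that may work even with the old rsync on the server side (a sketch; host and paths are placeholders) is to keep the partial file and reuse its bytes as the basis for the next attempt:

Code:
# --partial keeps whatever already arrived; --inplace updates the existing
# destination file instead of rebuilding it from scratch on each retry
rsync -avP --partial --inplace user@server:/path/to/file.iso /local/dir/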
I updated the package libcgic-devel to a newer release of the same version. The change in the distributed files includes renaming a file cgic.html to index.html. I have both files installed now and cgic.html is orphaned.
View 6 Replies View Related

Just during the last three days, when running Update Manager, clicking Check, downloading all available updates, then clicking Install Updates, the Downloading Files window pops up but it is blank, and it stays blank. The little rotating icon indicates something's going on, but no, nothing has happened, even after a couple of hours. Trying to close Downloading Files brings up a window to Force Quit. Then trying to close Update Manager has no effect. To shorten the story, finally a window pops up saying that the "AT SPI Registry" is not responding. I'm sure trying the same thing will not yield different results. I'm running 10.04.
View 9 Replies View Related

I have a 3dsp PCI wifi card, and the last kernel it supports is the Ubuntu 10.04 2.6.32-(21-24) series. I want to update but don't want to accidentally update the kernel.
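One way to upgrade everything else while pinning the kernel (a sketch; the exact package names should be checked with dpkg -l 'linux-*') is to put the kernel image and its meta packages on hold:

Code:
echo "linux-image-2.6.32-24-generic hold" | sudo dpkg --set-selections
echo "linux-image-generic hold" | sudo dpkg --set-selections
echo "linux-generic hold" | sudo dpkg --set-selections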
View 3 Replies View Related

I've used the Firefox add-on FireFTP for several years and like it a lot... simple and handy... but the version 1.99.4 loaded on my openSUSE 11.4 is unable to delete local files. In previous versions, a right click on a local file and selecting "Delete" would result in the file being deleted. In the current version, the same action results in a pop-up error "Unable to delete file." _and_ the creation, in the local directory, of a new sub-directory /home, itself including an empty sub-directory /.Trash. I'd wish for another user to confirm that undesired action and join my bug.
View 9 Replies View Related

I just read the Linux scp command issue question and it reminded me that I regularly forget to specify the colon in the host part of an scp command, and thus copy a file locally instead of copying it to a remote host, e.g. I do
scp foo host
instead of
scp foo host:
But I never use scp to copy a file locally. So I wonder if there is a way to make scp fail if both (the source and destination) arguments refer to local files.
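As far as I know scp itself has no such switch, but a shell wrapper (for example in ~/.bashrc) can refuse the copy when no argument contains a colon; a sketch:

Code:
scp() {
    local arg has_remote=0
    for arg in "$@"; do
        case $arg in
            -*)  ;;                 # ignore options
            *:*) has_remote=1 ;;    # looks like host:path
        esac
    done
    if [ "$has_remote" -eq 0 ]; then
        echo "scp: refusing local-to-local copy (no 'host:' argument given)" >&2
        return 1
    fi
    command scp "$@"
}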
I was trying to copy some files over my HDD using wget. This was the format of the command. The catch is that there is a local website installed into a directory hierarchy, and I would like to use wget to make the HTML files link to each other in one directory level. The command didn't work in spite of trying different forms, so what's the mistake in this command, or is there another way?
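Assuming the site is reachable over HTTP (for example served by a local web server; wget does not crawl plain filesystem paths), a sketch of the kind of invocation that flattens the pages into a single directory and rewrites the links:

Code:
# -r recurse, -np don't ascend above the start URL, -nd put everything into one
# directory, -k rewrite the links so the saved pages reference each other locally
wget -r -np -nd -k http://localhost/site/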
View 3 Replies View Related

I'm running Firefox within YLMF 3.0 (a Linux distro based on Ubuntu 10.04).
When I go into "Help,About", it says ...
version 3.6.3
Mozilla Firefox for Ubuntu
canonical - 1.0
While I'm in Firefox, when I try to open a local file (a "local file" meaning an HTM or image file stored on my computer as opposed to remotely through the internet) by going into the File menu and selecting Open (or by doing "Ctrl/O" from the keyboard), I get a dialog box prompting me to navigate to the file. When I navigate to the file and click on Open, the dialog box vanishes and nothing more happens. (The webpage that was already displayed is still there).
However, if I type the URL representing the local file into the address bar (for example, if I have a file called "homepage.htm" and it resides on the path "/home/user/html/", and I type "file:///home/user/html/homepage.htm"), it works and the file gets opened and displayed.
Also, if I'm not already in Firefox but instead go into the File Manager and navigate to the filename, then right-click on the filename, then choose "Open with", then choose "Firefox", Firefox will then be invoked with that file loaded in it.
But why can't I open the file the normal way from within Firefox by using "File,Open" (or "Ctrl/O" from the keyboard)?
I want to have a shared area for movies, music, etc. where files are available to all users. What is the best way to do this? I've tried a few different things (i.e. creating a folder and sharing it among a group), but for some reason it doesn't seem to work the way I want it to. I'm now thinking of maybe having a partition like /share and setting the permissions for everyone in fstab, but I'm not sure.
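One common way to do this (a sketch; the group name, user name and path are assumptions) is a group-owned directory with the setgid bit set, so everything created inside it inherits the group:

Code:
sudo groupadd share
sudo usermod -aG share alice          # repeat for each user (example name)
sudo mkdir -p /srv/share
sudo chgrp share /srv/share
sudo chmod 2775 /srv/share            # setgid: new files keep the 'share' group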
View 9 Replies View Related

How do I unzip zip files to a local directory in Linux?
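unzip takes a -d option naming the directory to extract into, e.g. (paths are placeholders):

Code:
unzip archive.zip -d /path/to/target/dir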
View 5 Replies View Related

Setup: Two Fedora machines in a large network, with a Win2003 server in the epicentre.
Goal: To share files through the Public folder.
Results so far: I put files in the public folder, and I enable anything I can find related to networks in the Firewall (samba, nfs, and even www (http)). In the public folder preferences, I set 'Share over a network' and never to ask for a password. I do all this on both machines. Then I open Nautilus and click on the network link. After a little while I get three links: Windoze network, public files on computer 1, public files on computer 2. Let's say I'm on computer 1; if I click on the public files on computer 2, I get the following error:
Code:
HTTP Error: Cannot connect to destination
Error resolving "_webdav._tcp" service "bentrein's public files on Athena.SISS" on domain
[code]....
I have a Ubuntu server hosted on Amazon EC2. I need to create an automated backup scheme so I created another Ubuntu instance on my local network which is hosted in a virtual environment. I managed to transfer the necessary files between 2 machines on the same network using the rsync command:
rsync -azvv -e ssh /home/path/folder1/ remoteuser@remotehost.remotedomain:/home/path/folder2
How can I do the same thing but transferring files from my Amazon server to my local server? Is there a way I can achieve this with port forwarding, or by VPN, or anything else? It doesn't have to be rsync. If you know about a better method, kindly let me know.
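Since the EC2 instance is reachable over SSH, the same rsync invocation works across the internet as well; a sketch pulling from the instance with its key pair (the user, key path and hostname are placeholders):

Code:
rsync -azvv -e "ssh -i ~/.ssh/ec2-key.pem" \
    ubuntu@ec2-host.compute-1.amazonaws.com:/home/path/folder1/ \
    /home/path/folder2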
I use a PHP IDE that has no built-in ability to upload a project to a site. So I'm looking for a common, easy-to-use tool for Linux that is able to upload modified documents to the server instead of uploading the whole site. I would also accept shell scripts that can do this.
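rsync is one common answer here, since it only transfers files that have changed since the last run; a sketch over SSH (the paths and host are placeholders):

Code:
rsync -avz /home/user/project/ deploy@example.com:/var/www/site/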
View 1 Replies View Related

What I am trying to do is reasonably simple. I have 1 Ubuntu desktop PC and 2 Ubuntu laptops that are all connected via a wireless network (remote desktop works perfectly, so there are no network issues). All I want to do is utilise the huge drive space on the desktop machine as a kind of fileshare for the two laptops, so backups and music/photo sync can be done.
I have managed to do this using Samba, in that from the laptops I can write files in Nautilus to the fileshare, BUT the files have no owner, no group, and no permissions; this is the case whether I view them from the client or server side.
I.e. I can navigate on the laptop to the shared area by the URL... in Nautilus, right click and add a blank document and call it, say, "test". If I now have a look at the permissions it says "The permissions of test could not be determined".
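One way to give the shared files a real owner, group and sane permissions is to force them in the share definition in smb.conf on the desktop machine (a sketch; the account name, path and masks are assumptions):

Code:
[share]
   path = /srv/share
   read only = no
   guest ok = yes
   force user = fileshare
   force group = fileshare
   create mask = 0664
   directory mask = 0775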
I work on several computers throughout the day or week and make extensive use of Dropbox to sync my files between them.
One of the things that bugged me was that bash/ssh/screen and other settings weren't synchronised between those computers, and that I had to go through the hassle of recreating all those files when using a new computer. So I decided to start using Dropbox for this as well.
Code:
#!/bin/bash
# This folder contains my personal preferences that I want to be available everywhere
# Basically, I keep my personal configuration for a few files stored on my Dropbox
# and create symlinks to the required files
[Code]...
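A minimal sketch of the symlink approach described above (the Dropbox path and the file list are assumptions):

Code:
#!/bin/bash
DOTFILES="$HOME/Dropbox/dotfiles"

# point the usual dotfile locations at the copies kept in Dropbox
for f in bashrc screenrc vimrc; do
    ln -sf "$DOTFILES/$f" "$HOME/.$f"
done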
I have a problem while copying files from a remote computer to my local one using the scp command. I am sure that I am using it correctly, please check it below:
---
blah@blah.com:~/g4work> scp blah2@blah2.com:IndirectMethod_Spher...s/H_1.mac.root .
---
What I get in return (instead of the statement saying 100% of file copied) is:
---
On this machine the G4SYSTEM=Linux-g++
---
The interesting point is that the above returned statement is one of the Environment variables set on both the machines that are necessary to work with a toolkit called Geant4. Here is what I get when I type 'printenv | grep G4' just to show you (note the statement in bold):
---
G4LEVELGAMMADATA=/home/blah/geant4/geant4.9.3.p02/data/PhotonEvaporation2.0
G4INSTALL=/home/blah/geant4/geant4.9.3.p02
G4LEDATA=/home/blah/geant4/geant4.9.3.p02/data/G4EMLOW6.9
G4NEUTRONHPDATA=/home/blah/geant4/geant4.9.3.p02/data/G4NDL3.13
G4VIS_BUILD_OPENGLX_DRIVER=1
G4RADIOACTIVEDATA=/home/blah/geant4/geant4.9.3.p02/data/RadioactiveDecay3.2
G4ABLADATA=/home/blah/geant4/geant4.9.3.p02/data/G4ABLA3.0
G4LIB=/home/blah/geant4/geant4.9.3.p02/lib
G4VIS_BUILD_RAYTRACERX_DRIVER=1
G4LIB_BUILD_SHARED=1
G4VIS_USE_OPENGLX=1
G4UI_USE_TCSH=1
G4VIS_USE_RAYTRACERX=1
G4REALSURFACEDATA=/home/blah/geant4/geant4.9.3.p02/data/RealSurface1.0
G4SYSTEM=Linux-g++
G4WORKDIR=/home/blah/g4work
---
The other thing that I would like to mention is that these Geant4 Env. Variables are loaded each time a new (bash) shell is started as a result of the bash login script.
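That last point is probably the key: scp starts a non-interactive shell on the remote machine, and anything the login scripts print there gets mixed into the scp protocol stream and breaks the transfer. A sketch of guarding the output in the remote shell startup files (the echo line is only a guess at what the Geant4 setup script does):

Code:
# only produce output for interactive shells; scp/sftp then get a silent channel
if [[ $- == *i* ]]; then
    echo "On this machine the G4SYSTEM=$G4SYSTEM"
fi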
I have installed the Quanta Plus software, as I found on the net that Quanta Plus is the nearest thing to Dreamweaver. But now I am facing a problem with it, as it is not able to open any PHP files which are located on another PC. Although my PC is connected to that PC on the network, I still have to copy those files to my local system and only then will it open them. But I want it to open them directly. Is there any solution?
View 4 Replies View Related

I have a big ISO image which is currently being downloaded by a torrent client with space-reservation turned on: that means the file size is not changing, while some chunks in it (4 MiB each) are constantly changing because of the download.
At 90% download I do the initial rsync to save time later:
$ rsync -Ph DVD.iso /media/another-hdd/
sending incremental file list
DVD.iso
[Code]....
Then, when the file's fully downloaded, I rsync again:
total size is 2.60G speedup is 1.00
Speedup=1 says delta-transfer was not used, although 90% of the file has not changed; the target dir is on another FS and copying takes several minutes. Why doesn't it try to speed up the transfer?! How can I force rsync to use delta-transfer?
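When both the source and the destination are local paths, rsync defaults to --whole-file and skips the delta algorithm entirely; turning that back off (and updating the existing copy in place) is a sketch of how to get the expected speedup:

Code:
rsync -Ph --no-whole-file --inplace DVD.iso /media/another-hdd/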
I've recently been using Amarok and liked it a lot. I unchecked the show system tray option in the configuration, restarted Amarok, and then checked it again. Now Amarok crashes whenever I go into the configuration or browse through local files. I have tried reinstalling and deleting the directory ~/.kde/share/apps, but to no avail.
View 1 Replies View Related

I would like to configure my system to do the following. Previous versions of Ubuntu did some of this, but now I can't seem to get it working.
1. Browse other Ubuntu systems on the local network by default. Avahi is installed on my systems but they can't see each other.
2. From the GDM screen, log into another PC. I had this set up in the past so that from my laptop I could log into my server, almost like terminal services. How do I set this up again?
3. Share files on the local network easily. How do I share files with other PCs on the network if I don't want to use Samba? What happened to NFS in Ubuntu? (A sketch of an NFS export follows below.)
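On the NFS point, a minimal sketch of exporting a directory from one Ubuntu machine to the local subnet (the package name is standard, but the path and subnet are assumptions):

Code:
sudo apt-get install nfs-kernel-server

# /etc/exports on the machine that shares the files
/srv/share 192.168.1.0/24(rw,sync,no_subtree_check)

# reload the export table
sudo exportfs -ra

# on the other machines
sudo mount -t nfs server-hostname:/srv/share /mnt/share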
How can mail attachments be downloaded from a POP3-supporting mail server and saved as local files? I've looked into a variety of mail clients and studied mbox, maildir and MIX but no solutions are jumping out and saying "It's me you're looking for!" (story of my life ) The requirement is for an organisation with no document management system; as a first step I want to get all their files in one place, including files that may exist only as email attachments.
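One possible command-line pipeline (a sketch only; the tools, flags and paths are assumptions worth checking against the local man pages) is to pull the mail with fetchmail and hand each message to munpack, which writes the MIME attachments out as local files:

Code:
# credentials normally live in ~/.fetchmailrc; --keep leaves the mail on the server
fetchmail --protocol POP3 --username someone --keep \
    --mda "munpack -q -C /home/user/attachments" pop.example.com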
View 8 Replies