Software :: Copy Files In Local System And Then Only It Is Opening
Jan 8, 2010
I have installed the Quanta Plus software, as I found on the net that Quanta Plus is the closest thing to Dreamweaver. But now I am facing a problem: it is not able to open any PHP files that are located on another PC. My PC is connected to that PC over the network, but I still have to copy those files to my local system before it will open them. I want it to open them directly. Is there any solution?
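One possible approach, assuming the other PC runs an SSH server (the hostname, user, and paths below are only examples): mount the remote directory locally with sshfs, and Quanta Plus can then open the files as if they were local.
Code:
# Install sshfs first (e.g. "yum install fuse-sshfs" or "apt-get install sshfs").
mkdir -p ~/remote-php

# Mount the remote project directory over SSH; "devbox" and the path are placeholders.
sshfs user@devbox:/var/www/project ~/remote-php

# Quanta Plus can now open files under ~/remote-php directly.
# When finished, unmount with:
fusermount -u ~/remote-php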
I was trying to copy some files onto my HDD using wget; this was the format of the command. The catch is that there is a local website installed in a directory hierarchy, and I would like wget to make the HTML files link to each other within one directory level. The command didn't work in spite of trying different forms, so what is the mistake in this command, or is there another way?
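For reference, a hedged example of the kind of wget invocation that usually handles this (the URL is a placeholder; adjust the depth to taste):
Code:
# Mirror the site one level deep and rewrite links so the saved pages
# point at each other instead of at the original server.
wget --recursive --level=1 --convert-links --page-requisites --no-parent http://localhost/mysite/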
There is this bug in the latest version of Ubuntu, and also in Debian Jessie, which is:
Can't copy a file from SMB share to the local file system: Software caused connection abort
The problem, apparently, is that newer versions of Samba hit servers with multiple requests at the same time, and for some reason the Zyxel and Iomega boxes can't handle this. The best solution they've come up with is to modify the smb.conf file on your server to include this setting: "max mux = 1".
Here is the reference material on this bug: [URL] ....
The Samba developers have fixed it in the latest version, but neither Ubuntu nor Debian has released the fixed version of Nautilus as of yet. Here is the reference: [URL] ....
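For anyone hitting the same thing, the workaround from the bug report looks like this in the server's smb.conf, followed by a restart of the Samba service (how you restart it depends on the NAS/distro):
Code:
[global]
    # Limit Samba to one outstanding multiplexed request per connection,
    # which older Zyxel/Iomega firmware can cope with.
    max mux = 1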
I am using Fedora 14. The system hung once while opening a video file, so I had to restart it by pressing the reset button. After restarting, a few problems appeared: System Monitor does not open, and Thunderbird opens but does not show any folders, including the inbox.
I want to copy files from one CentOS system to another. The files are generated automatically on one server, and I want to copy them immediately to the other server.
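One common approach, assuming SSH access between the two boxes and the inotify-tools package installed (the paths and hostname below are examples): watch the output directory and push each new file with rsync as soon as it is finished being written.
Code:
#!/bin/bash
# Watch /data/outgoing and copy every newly created file to the second server.
WATCH_DIR=/data/outgoing                 # example path on the generating server
DEST=backup@server2:/data/incoming/      # example destination

inotifywait -m -e close_write --format '%w%f' "$WATCH_DIR" |
while read -r file; do
    # -a preserves permissions/times, -z compresses over the wire.
    rsync -az "$file" "$DEST"
done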
I am running some pcap files through editcap and then tshark, on Fedora 11. This creates a couple of thousand text files, all numbered sequentially 1-x. How can I copy these files across the network (I connect using PuTTY), or onto an external HD, so that when I view them on the Windows machine they have the right formatting, i.e. Windows knows to open them with WordPad/Notepad and recognises them as text?
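A sketch of one way to do it, assuming the unix2dos tool (from the dos2unix package) is available; the paths are examples. Giving the files a .txt extension and CRLF line endings before the transfer is what makes Windows treat them as ordinary text:
Code:
# Give each numbered output file a .txt extension so Windows opens it with Notepad/WordPad.
for f in [0-9]*; do mv "$f" "$f.txt"; done

# Convert LF line endings to CRLF so the line breaks display correctly in Notepad.
unix2dos *.txt

# From the Windows machine, pull the whole directory with pscp (ships with PuTTY):
#   pscp -r user@fedorabox:/path/to/output C:\captures\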
I need a command-line method of copying files from a Linux box to a Windows machine that is in a domain and requires authentication. I cannot install additional software or services on the Windows XP machine, but I can install any software on the Linux machine. I've tried scp, but the connection failed; if my understanding is correct, that is because scp requires the target (the Windows machine) to be running an SSH service. Is there a command-line Linux utility that can pass a Windows domain user and password and then copy a file from the Linux machine to a share on the Windows machine?
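smbclient (from the samba-client package) can do this without installing anything on the XP side; the server, share, domain, and file names here are placeholders:
Code:
# Push a file to the Windows share, authenticating as a domain user.
smbclient //WINBOX/SharedDocs -U 'MYDOMAIN\jsmith' -c 'put report.log report.log'

# smbclient prompts for the password; it can also be given inline as
#   -U 'MYDOMAIN\jsmith%secret'   (beware of it ending up in shell history).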
Using C++, I want to process the sub-folders in my home folder sequentially, each with a special naming format and containing some binary files:
Code: 1/ 2/ 3/ 4/ 5/ 6/ ...
Given the above folders, I will process the files in 1/ first, 2/ second, 3/ third, and so on.
For some folder n/, if I find that n/ does not actually exist in the local file system, I do not want to wait for it; I will keep processing folder (n+1)/, and so on.
However, while processing some folder (n+m)/, the previously unprocessed folder n/ may have been created in the local file system in the meantime. In that case I do not want to miss it; I want to somehow detect its creation and process it. After processing n/, I want to continue from (n+m+1)/.
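The question is about C++, but the control flow can be sketched in shell to show the idea: keep a list of folders that were missing when their turn came, and re-check that list on every later pass before handling the current folder. The same logic ports to C++ with opendir()/stat() or a filesystem library; the upper bound and folder names below are examples.
Code:
#!/bin/bash
# Sketch only: process numbered folders in order, remember the ones that were
# missing, and retry them on each later iteration.
missing=()          # folders that did not exist when their turn came
n=1
max=100             # example upper bound

while [ "$n" -le "$max" ]; do
    # First, retry any folder that was skipped earlier and has since appeared.
    still_missing=()
    for d in "${missing[@]}"; do
        if [ -d "$d" ]; then
            echo "late processing of $d"
        else
            still_missing+=("$d")
        fi
    done
    missing=("${still_missing[@]}")

    # Then handle the current folder, or queue it for a later retry.
    if [ -d "$n" ]; then
        echo "processing $n"
    else
        missing+=("$n")
    fi
    n=$((n + 1))
done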
I am using CentOS 5.3. Output of "uname -a": Linux localhost.localdomain 2.6.18-92.el5 #1 SMP Tue Jun 10 18:49:47 EDT 2008 i686 i686 i386 GNU/Linux
My kernel version is kernel-2.6.18-92.el5.
Whenever I try to copy files from my CentOS system to a pen drive (2 GB Kingston DataTraveler), the system hangs, leaving no option but to reboot. I tried from the terminal and as a different user as well, with the same result. Sometimes I can copy small files, but when I go above about 5 MB the system hangs.
I have a fairly clean install of Debian 5.04 on a G5 tower and am having some local network sharing problems. The machine linuxG5 has an address of 192.168.1.4, and when I am logged into that machine I get the following output:
silver@linuxG5:~$ nmap localhost
Starting Nmap 4.62 ( http://nmap.org ) at 2010-04-24 10:19 EDT
Interesting ports on localhost (127.0.0.1):
Not shown: 1706 closed ports
PORT   STATE SERVICE
21/tcp open  ftp
22/tcp open  ssh
25/tcp open  smtp
[Code]...
I would like to connect via SSH or similar to my servers located in a remote DC, from a laptop running CentOS 5. I normally do this on a computer running Windows, using SecureCRT. Just wondering if CentOS has something built in for this, or if there is some preferably free software I can get.
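CentOS ships OpenSSH, so nothing extra is needed; a ~/.ssh/config entry can stand in for SecureCRT's saved sessions. The host names and user below are examples.
Code:
# Connect directly from a terminal:
ssh admin@dc1.example.com

# Or save the details once in ~/.ssh/config:
#   Host dc1
#       HostName dc1.example.com
#       User admin
#       Port 22
# ...and then just run:  ssh dc1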
One of Konqueror's unique features is that I can name a local process as the action in a form. When I submit that form, the local process is executed. Very helpful for certain offline tasks. What would make it even better is if I could find a way to pass some data to that local process from the HTML page. This could be the content of a hidden input item, etc. Alternatively, if there is a way for Konqueror to create or update a local file with data from the HTML page, that would achieve the same end.
I have a 160 GB hard drive on which I installed F12. I would like to upgrade to a bigger drive, but I hate to have to re-install everything.
Can anyone recommend a good disk copy utility? It should be able to copy not only the files but the boot sector and everything else, so that I just need to make the copy, change my BIOS to boot from the new drive, and run everything as before.
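One low-level way to do it, assuming the new drive is at least as large as the old one, and that /dev/sda is the old disk and /dev/sdb the new one (double-check the device names, since getting them backwards destroys the source):
Code:
# Copy the whole disk, boot sector and partition table included.
dd if=/dev/sda of=/dev/sdb bs=4M conv=noerror,sync

# Afterwards, grow the last partition and filesystem on /dev/sdb to use the
# extra space (e.g. with gparted), or clone with a tool such as Clonezilla,
# which can handle the resize for you.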
I installed Ubuntu a couple of days back on my netbook. I am still a beginner, enjoying my adventure exploring Ubuntu. I have another desktop which runs XP. I am able to access the XP shared folders from my netbook (Linux). However, I want to copy files, in fact whole folders, from XP using the TERMINAL on my netbook, not copy and paste with my mouse. Are there any commands for this?
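One way, assuming the XP share is visible on the network (the share name, user, and paths are examples): mount it with CIFS and then use ordinary cp.
Code:
# Create a mount point and mount the XP share (needs the smbfs/cifs-utils package).
sudo mkdir -p /mnt/xp
sudo mount -t cifs //XP-DESKTOP/SharedDocs /mnt/xp -o username=youruser

# Now plain cp works; -r copies whole folders.
cp -r /mnt/xp/Music ~/Music

# Unmount when done.
sudo umount /mnt/xp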
I'm using the latest Ubuntu 8.04.4 server x64 edition on my PC, with a quad-core CPU (2.4 GHz) and 2 GB of RAM. I have 3 HDDs: sda and sdb are in RAID1 with all partitions on ext3, and the third, sdc, is a single drive with ext3. While copying files from one disk to another (I tried all cases, e.g. RAID1 to RAID1, sdc to sdc, sdc to RAID1, RAID1 to sdc, and the results are the same), the system sometimes hangs with the message "Oops 0000 [1] SMP" or "general protection fault: 0000 [1] SMP" when I copy large files.
I read a lot of forums, and some said that I must disable ACPI in GRUB. I tried the acpi=off, noapic, and nolapic parameters in all variants, but nothing helps. When I copy small files the system works fine, but if I copy large files (400-500 MB) the system randomly hangs; sometimes the ext3 filesystem crashes too, the disk I copied the files to reverts to its previous state, and the files I copied disappear.
Does anyone know of an application for making copies of web sites that can be read offline? I've tried using wget, but with very mixed results. Something a bit more reliable would be useful.
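httrack is the usual alternative to wget for this; a minimal run looks like the following (the URL and output directory are placeholders):
Code:
# Mirror the site into ./mirror for offline browsing.
httrack "http://www.example.com/" -O ./mirror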
I am looking for an SVN repository browser which does not have the restriction of creating a local copy. My problem is as follows: I use projectlocker.com for my SVN repos. Once a repository is created, I want to create a trunk, i.e. a folder named trunk, then check out the trunk into my folder and keep updating it as normal. Earlier I used to achieve all this simply by using the Tortoise SVN repository browser and creating the folder. Can anyone suggest a replacement for Tortoise SVN's repository browser feature? Note: I know of replacements for Tortoise SVN itself; I am currently using PAGEVCS. I want a replacement for the repository browser.
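For what it's worth, the stock svn command-line client can create the trunk folder directly in the repository without any working copy; the repository URL below is made up.
Code:
# Create trunk straight in the remote repository (no checkout needed).
svn mkdir https://free1.projectlocker.com/myorg/myproject/svn/trunk -m "Create trunk"

# Then check out just the trunk and keep it updated as usual.
svn checkout https://free1.projectlocker.com/myorg/myproject/svn/trunk myproject
svn update myproject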
In my network I only have one machine that is configured to send email outside the network. How do I instruct my local copy of sendmail to use that server as a relay?
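The usual way is to define that machine as the smart host in sendmail.mc and rebuild sendmail.cf; the relay hostname below is a placeholder.
Code:
dnl In /etc/mail/sendmail.mc, point all outbound mail at the relay machine:
define(`SMART_HOST', `mailgw.example.lan')dnl

dnl Then rebuild the config and restart sendmail:
dnl   m4 /etc/mail/sendmail.mc > /etc/mail/sendmail.cf
dnl   service sendmail restart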
My mail server [URL] is hosted externally; abc@[URL] is a POP ID. Below are the aliases:
user1@[URL] user2@[URL] user3@[URL]
fetchmail is configured to download all mail for the alias email IDs onto the local Linux server and distribute it to the local users.
fetchmailrc config: [Code]....
Everything works fine except that the BCC copy is not getting delivered to the email alias (on the local Linux server); it is delivered to the local postmaster account abc@[URL] (on the Linux server) instead. I have tried all the envelope options in the fetchmailrc file, but it did not work.
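This is the classic multidrop/BCC problem: a BCC recipient never appears in the To/Cc headers, so fetchmail has nothing to match the aliases against unless the POP server records the envelope recipient in a header. A hedged sketch, assuming the provider's server adds something like Delivered-To or X-Envelope-To (the header name, host, and users below are examples):
Code:
# ~/.fetchmailrc sketch - multidrop with envelope-based routing
poll pop.example.com proto pop3 envelope "Delivered-To"
    user "abc" with pass "secret" to user1 user2 user3 here
If the provider's server does not add any such header, fetchmail cannot recover the BCC recipient and the mail falls through to the postmaster, which matches the behaviour described.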
I am new both here and to Linux. As the subject says, I would like to learn how to copy a directory (not a file) from the terminal with a progress bar showing. The copy is local, i.e., not to another computer. My distro is CentOS 5.5. I know that if I do it with Nautilus I can see the progress, but I want to learn how to do it from the terminal. I know that the pv command can show a progress bar, but from what I saw it works well for files, not for directories (recursive).
Is it possible to use pv for directories? If yes, could you please show me the syntax? I also saw some people mention that rsync can show a progress bar; I tried it, but it didn't work out, perhaps because I got the syntax wrong. If rsync can really be used to copy directories with a progress bar, can you show me the syntax? Any other ideas on how to do it? I would prefer ideas that do not involve a script, i.e., just something I can do with regular commands.
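Two approaches that fit, assuming pv is installed (it is in the EPEL repository; the source and destination paths are examples). Piping tar through pv is the one that gives a single overall progress bar for a whole directory:
Code:
# Option 1: tar the directory through pv, telling pv the total size so it can
# draw a percentage bar.
SRC=/home/user/mydir
DST=/backup
tar -cf - -C "$(dirname "$SRC")" "$(basename "$SRC")" \
    | pv -s "$(du -sb "$SRC" | awk '{print $1}')" \
    | tar -xf - -C "$DST"

# Option 2: rsync with per-file progress output (not one overall bar).
rsync -a --progress "$SRC" "$DST"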
When I installed 13.37, I created a local copy of the entire stable tree (source/ and all the rest), just to have all that stuff around to browse offline.
Now, to instruct myself, I'm trying to use rsync to keep this stuff up to date. But I seem to have either misread the rsync man page or ... well, I don't know. I am issuing the following command and getting the results seen below:
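For comparison, a local mirror of a release tree is normally kept current with something along these lines (the mirror URL is a placeholder; pick one from the Slackware mirror list that offers rsync, and note that the trailing slashes matter):
Code:
# Update the local copy in place; --delete removes files that vanished upstream.
# Add -n (dry run) on the first attempt to see what would change.
rsync -av --delete rsync://rsync.example-mirror.org/slackware/slackware-13.37/ \
      /home/user/slackware-13.37/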
I have two servers. One of them runs an SVN server, and the other hosts projects.
I have a daily cronjob updating the projects, i.e. running svn update, rebuild, etc.
Now, the cronjob on the remote server works. However, a similar cronjob running on the local server for local projects (i.e. the same server as SVN) instead reports "svn: not a working copy".
I double-checked the paths, permissions, and user info, and if the script is launched manually it works fine. Deploying the same thing remotely works.
I even tried using file:/// (suggested here http://www.hightekhosting.com.au/myaccount/knowledgebase/90/Using-SubversionorSVN-on-cPanel-Servers.html) but still nothing.
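Since it works by hand, the usual culprits are cron's different working directory, PATH, or user; making everything absolute in the crontab and logging the output tends to expose the real error. The paths below are examples.
Code:
# crontab entry: cd into the working copy explicitly and call svn by full path,
# logging output so the real error is visible.
30 2 * * * cd /var/www/myproject && /usr/bin/svn update >> /var/log/svn-update.log 2>&1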
I would like to know if I can grab a copy of the Lucid packages that my laptop is downloading and dump them into a directory on the desktop computer, then upgrade the desktop in a way that makes use of the packages it wants and that I already have to hand.
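This generally works, because apt keeps every downloaded package in its cache; a rough outline, with the hostname and paths as examples:
Code:
# On the desktop: copy the .deb files the laptop already downloaded into the
# local apt cache, then upgrade as normal - apt will use the cached copies
# instead of downloading them again.
sudo scp 'user@laptop:/var/cache/apt/archives/*.deb' /var/cache/apt/archives/
sudo apt-get update
sudo apt-get dist-upgrade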
For the life of me I cannot figure out what I am doing wrong with scp to copy a directory and its contents from a remote machine to my local host. I have no issues getting a single file, but I would like to save time and get the whole folder in one command.
Here is what I have tried:
scp user AT remoteMachine:/home/username/folderIwant user AT localMachine:/folderIwant - this gives me a permission denied error; on trying again I received "Disconnect from localhost: too many authentication failures".
scp user AT remoteMachine:home/username/folderIwant . - says it cannot find the file or folder.
I am sure this is something easy that I can't remember. Searches give me local-to-remote rather than remote-to-local, and trying to adapt the local-to-remote suggestions I read for remote-to-local has not worked.
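For the record, the remote-to-local form just needs -r and a local destination such as "."; the user and host are placeholders:
Code:
# Copy the whole folder from the remote machine into the current local directory.
scp -r user@remoteMachine:/home/username/folderIwant .

# "Too many authentication failures" usually means ssh offered several keys
# before the password; forcing password authentication avoids it:
scp -r -o PubkeyAuthentication=no user@remoteMachine:/home/username/folderIwant .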
I am using Dolphin 1.5 in KDE 4.5.2. Whenever I try to access a movie file on a remote Samba server, Dolphin copies the movie file to somewhere on the local hard disk, so I have to wait until a big file transfer completes. I know this happens when I open an .avi using MPlayer. If I open the same remote file with KMPlayer, it plays immediately instead of making a local copy first; however, KMPlayer is very slow and the sound and video stream keep breaking up. I suppose this is not related to the MPlayer configuration; it seems to be a Dolphin problem. Can I make Dolphin stop copying the Samba share to local disk and play it instantly? There is a video in Videos comparing how Dolphin and Nautilus behave differently when playing movies from a remote Samba share.
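One workaround that avoids the local copy entirely, assuming MPlayer was built with Samba (libsmbclient) support; the server, share, and credentials are examples:
Code:
# Stream the file straight from the share instead of copying it first.
mplayer smb://user:password@server/share/movies/film.avi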
The code listed below is an excerpt from a script that I am writing. The goal is to verify that a directory on a remote server is available to the local system. If that is not the case, a log file is written, and all filesystems that were previously unmounted are remounted on the local system.
Code:
#
# Unmount all NFS mounts prior to the archive process.
umount -a -t nfs

# Mount the remote directory (NFS) prior to running the make_net_recovery script.
# Make sure there is a <remote server> folder located in the /mnt directory. If it is
# not already there, create one.
mount <remote server>:/<local system> /mnt/<remote server>

# Verify the remote directory (NFS) is available. This directory is needed
# as it is the destination for the iso images. If it is not available, stop
# here, and write the results to a log file.
df | grep <remote server> > /dev/null
RC=$?
echo $RC

if [ ${RC} -eq 0 ]
then
    echo successful
else
    echo not successful >> /tmp/make_net_backup.log && mount -a
    exit
fi

Is the syntax shown above correct?
Is there a method at the command line to copy files from one location to another and retain the source files' group and user? I'm migrating some MySQL files from one machine to another, and I want to back up the original files in the directory first. They have owner:group of mysql:mysql, some have root:mysql, and so on. When I copy them under the CLI or with Nautilus, everything changes to root, because I execute sudo cp or gksudo nautilus and copy via the GUI.
Since it is MySQL data I could simply do a dump of the database and restore it on the other machine, but there are about 20 DBs and I want to do this via a copy because it will be faster, at least that is what I think.
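Running cp as root with -a (or -p) preserves owner, group, permissions, and timestamps; the paths below are examples.
Code:
# -a = archive: recursive copy that keeps ownership, permissions, and times.
sudo cp -a /var/lib/mysql /backup/mysql-$(date +%F)

# rsync can do the same and is handy between machines:
#   sudo rsync -a /var/lib/mysql/ root@otherbox:/var/lib/mysql/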