However - is there such a thing as a decent HTML editor like Dreamweaver? KompoZer is buggy as hell - useless! BlueGriffon, well, umm - the screen fonts are bizarre, especially when viewing source code - they break down into multiple colours, obviously a bug - no .deb either, and the installer looks like a Windows program setup (?). It does look really good, but it's unusable since I can't read the source code view without getting a headache! It also ignores CSS on links.
SeaMonkey - you have to open the browser, then the editor, then open your file. It ignores CSS totally. Amaya ignores the fonts in use unless you re-edit, and ignores CSS on links too. It has a weird way of selecting things as well, such as images. There must be at least one decent editor?
Let's say there's a URL. This location has directory listing enabled, therefore I can do this: wget -r -np [URL] to download all of its contents, with all the files and subfolders and their files. Now, what should I do if I want to repeat this process a month later, without downloading everything again, only adding new/changed files?
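A hedged sketch of the incremental re-run: wget's -N (--timestamping) option compares the remote timestamp and size against the local copy and skips files that haven't changed, so a second pass only fetches new or modified files.
Code:
# same command on every run; -N skips files whose remote
# timestamp and size still match the local copy
wget -r -np -N [URL]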
Is it possible to configure yum so that it downloads packages from repos using wget? Sometimes in some repos yum will give up and terminate with "no more mirrors to retry", but when I use "wget -c" to download the same file, it succeeds.
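As far as I know yum cannot be pointed at wget directly, but the "no more mirrors to retry" failures can often be worked around by raising yum's own retry and timeout limits. A minimal sketch for /etc/yum.conf (the option names are real yum options; the values are illustrative):
Code:
[main]
retries=20
timeout=120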
I had set two 700 MB links downloading in Firefox 3.6.3 using the browser itself. Both of them hung at 84%. I trust wget much more. Here the problem is: when I click the download button in Firefox it asks to save the file, and only once the download has begun can I right-click in the Downloads window, select "Copy Download Link", and find that the link was Kum.DvDRip.avi. If I had known that earlier - as with the Hotfile server, where there is no script behind the download button and it points straight to the .avi URL - I could have copied it easily. I have read about 'wget --load-cookies cookies_file -i URL -o log'. I have a free (NOT premium) account on the sharing server, so all I get is an HTML page.
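A hedged sketch of the wget side, assuming the cookies have been exported from Firefox in Netscape format (the URL here is a placeholder, since the real one was never shown): -c resumes a partial file and -o writes the transcript to a log.
Code:
wget --load-cookies cookies.txt -c -o download.log \
     'http://example.com/path/Kum.DvDRip.avi'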
I need a small shell script that downloads HDF data from ftp://e4ftl01u.ecs.nasa.gov/MOLT/MOD13A2.005/. Each subfolder contains files with names like MOD13A2.A2000049.h26v03.005.2006270052117.hdf; I want to copy all the files matching h26v03 to my local machine.
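A minimal sketch of such a script: -r recurses through the date subfolders, -np stays below MOD13A2.005, -nd flattens everything into the current directory, and -A keeps only the h26v03 tiles.
Code:
#!/bin/sh
wget -r -np -nd -A '*h26v03*.hdf' \
    ftp://e4ftl01u.ecs.nasa.gov/MOLT/MOD13A2.005/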
My Apache ignores index files (index.php, index.html, index.htm, ...): even though these files exist in the directory, Apache lists the directory contents! I mean http://localhost/test/ lists the directory contents instead of showing index.php!
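A hedged sketch of the usual fix, assuming a stock Apache layout (the path is a guess): make sure mod_dir is loaded, declare the index names with DirectoryIndex, and optionally disable listings outright.
Code:
<Directory "/var/www/html/test">
    DirectoryIndex index.php index.html index.htm
    Options -Indexes
</Directory>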
I'm trying to download a set of files with wget, and I only want the files and paths "downwards" from a URL, that is, no other files or paths. Here is the command I have been using:
Code: wget -r -np --directory-prefix=Publisher http://xuups.googlecode.com/svn/trunk/modules/publisher There is a local path called 'Publisher'. The wget works okay and downloads all the files I need into the /Publisher path, but then it starts loading files from other paths. Given [URL]..svn/trunk/modules/publisher, I only want those files, plus the paths and files beneath that URL.
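One likely cause is the missing trailing slash: without it, wget treats "publisher" as a file, so --no-parent anchors one directory higher and the crawl can wander into sibling paths. A hedged sketch of the corrected command:
Code:
wget -r -np --directory-prefix=Publisher \
    http://xuups.googlecode.com/svn/trunk/modules/publisher/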
I was trying to copy some files on my HDD using wget; this was the format of the command. The catch is that there is a local website installed into a directory hierarchy, and I would like to use wget to make the HTML files link to each other within one directory level. The command didn't work in spite of trying different forms, so what's the mistake in this command, or is there another way?
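Since the original command was not shown, here is a hedged sketch of what such an invocation usually looks like (the URL is a placeholder): -l 1 limits recursion to one directory level, -k rewrites the links in the saved HTML so the pages point at each other locally, and -p pulls in page requisites.
Code:
wget -r -l 1 -k -p http://localhost/site/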
If a wget download is interrupted (like if I have to shut down prematurely), I get a wget.log with the partial download log. How can I later resume the download using the data in wget.log? I have searched high and low (including the wget manual) and cannot find how to do this. Is it so obvious that I missed it? The wget -c option with wget.log as its argument does not work. What I do instead is open wget.log, copy the URL, paste it into the command line, and run wget again. This works, but the download starts from the beginning, meaning nothing in wget.log is used.
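The key point is that wget.log only holds the transcript; the partial data lives in the half-downloaded file itself. Re-running wget with -c in the same directory resumes from that partial file. A hedged sketch (the URL is a placeholder):
Code:
# -c continues from the partially downloaded file on disk;
# -a appends to the existing log instead of truncating it
wget -c -a wget.log http://example.com/big.iso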
Recently I did a wget for DreamLinux 4beta6.3, which I now cannot burn to a disc. Another issue with Linux in general being just a hobby. Why couldn't a dev set it up so you could just wget the upgrades for the main distro files themselves? We already do this when upgrading the apps, so why not the distro? Why couldn't you start at alpha, and just do a rolling upgrade into rc and beyond?
I understand the potential stability issues, but why can't we ghost the drive, and thus have a backup on hand, then do this as outlined - a single install from alpha to the distro's grave? I am wasting four months on "setting up" each install, only to wind up reinstalling again. This happens in every variant I have come across. Even in stable versions, the rolling upgrade just wrecks the install, and the major do-overs hijack all my data.
I am Vijaya, glad to meet you all via this forum. My question: I set a crontab entry for automatically downloading files from the internet using wget, but when I left it running, several processes for the same download were running in the background. My concern is to get only one copy, not many copies of the same file, and I am not able to find out where it is actually downloading to.
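A hedged sketch of a safer crontab entry (paths and URL are placeholders): flock ensures a new run does not start while the previous one is still going, -N skips the download when the remote file is unchanged, and -P pins the download directory so you always know where the file lands.
Code:
0 2 * * * flock -n /tmp/mydownload.lock wget -N -P /home/vijaya/downloads http://example.com/file.dat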
In order to download files from a particular website, I have to include a header containing the text of a cookie, to indicate who I am and that I am properly logged in. So the wget command ends up looking something like: Code: wget --header "Cookie: user=stringofgibbrish" http://url.domain.com/content/porn.zip Now, this does work in the sense that the command downloads a file of the right size with the expected name. But the file does not contain what it should: the .zip files cannot be unzipped, the movies cannot be played, etc. Do I need some additional option, like the "binary" mode in the old FTP protocol? I tried installing gwget; it is easier to use, but has no way to include the --header stuff, so those downloads never happen in the first place.
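HTTP transfers are always binary, so there is no FTP-style "binary" switch to flip; the usual culprit is that the server answered with an HTML login or error page instead of the real file. Two hedged checks/workarounds:
Code:
# see what actually came down; a failed login usually yields HTML
file porn.zip
# alternative: export the full browser cookie jar (Netscape format)
# and let wget send it, instead of hand-building the header
wget --load-cookies cookies.txt http://url.domain.com/content/porn.zip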
I am using openSUSE 10.3. I play my video files using MPlayer, which I installed from a tarball along with the necessary codecs. I can play my video files from the command line nicely. There is only one problem: there are no thumbnails for any video file. Does anybody know which software I should install so that thumbnails appear for video files in my Nautilus file browser? The default player for the GNOME desktop environment is Totem, which requires an internet connection to play files; my computer has no internet connection, which is why I don't do anything with Totem, as it always says a particular codec is needed to play any audio or video file.
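On a GNOME 2-era desktop, Nautilus looks up thumbnailers in GConf, so a hedged sketch (assuming ffmpegthumbnailer is installed; repeat for each video MIME type you care about):
Code:
gconftool-2 -s -t string /desktop/gnome/thumbnailers/video@x-msvideo/command \
    "/usr/bin/ffmpegthumbnailer -i %i -o %o -s %s"
gconftool-2 -s -t bool /desktop/gnome/thumbnailers/video@x-msvideo/enable true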
Is anyone aware of an app that previews video files, i.e. by showing thumbnails, with or without an embedded player?
Running KDE4, the preview setting in Dolphin is very, very slow to render usable icons (even on my reasonably fast / beefy system) and I'm wondering if there isn't something better than the preview modes of file managers in general.
I STFW already and couldn't find anything - there are plenty of picture viewers and plenty of video players, but I couldn't find any video previewers. The closest thing seems to be mplayerthumbs, but if I'm not mistaken, that's just the preview mode built into Dolphin.
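A rough workaround sketch, assuming ffmpegthumbnailer is available: pre-generate one PNG per video and browse them in any image viewer, sidestepping the file manager's preview mode entirely.
Code:
for f in *.avi *.mkv *.mp4; do
  [ -e "$f" ] || continue
  ffmpegthumbnailer -i "$f" -o "${f%.*}.png" -s 256
done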
I've created some time-lapse videos from photos, using this command: ffmpeg -i IMG_%03d.JPG -s 1440x1080 -sameq video.MP4
And it worked great. Now I want to join several of these time-lapse videos to make a single, longer video (all the input videos have exactly the same format). I already tried using: cat video1.MP4 video2.MP4 > stitch.MP4
but the output ends up being identical to video1.MP4. I don't want to transcode or change any parameter of the video; I just want an end-to-end stitch, as if those videos were on a playlist.
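cat only works for stream formats like MPEG-TS; an MP4 carries a single global index, so the player stops at the first file's end. A hedged sketch using ffmpeg's concat demuxer (assuming a reasonably recent build; -c copy stitches end-to-end without re-encoding, which works here because the inputs share the same parameters):
Code:
printf "file '%s'\n" video1.MP4 video2.MP4 > list.txt
ffmpeg -f concat -i list.txt -c copy stitch.MP4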
I am using Debian 6.0.0. Video files are played by default in Totem; I like gmplayer. I also want to make a script that guesses the subtitle file intelligently, as follows: get all the file names in <folder-containing-video>; see which of them (among *.srt, *.sub, *.ssa) has the maximum number of characters matching the video name; that file becomes the parameter for the -sub option. I saw a desktop entry for gvim which is like gvim -f %F (-f means foreground). I can try gmplayer -vo xv -sub what-should-I-write-here %F
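Here is a rough, untested sketch of such a wrapper (save it as e.g. smartplay.sh and point the desktop entry at "smartplay.sh %F"; the matching rule is a simple longest common prefix, which approximates "maximum matching characters"):
Code:
#!/bin/bash
video="$1"
dir=$(dirname "$video")
base=$(basename "$video")
best=""
bestlen=0
for sub in "$dir"/*.srt "$dir"/*.sub "$dir"/*.ssa; do
  [ -e "$sub" ] || continue
  s=$(basename "$sub")
  # count how many leading characters the two names share
  n=0
  while [ $n -lt ${#base} ] && [ "${s:0:n+1}" = "${base:0:n+1}" ]; do
    n=$((n+1))
  done
  if [ "$n" -gt "$bestlen" ]; then
    bestlen=$n
    best="$sub"
  fi
done
if [ -n "$best" ]; then
  exec gmplayer -vo xv -sub "$best" "$video"
else
  exec gmplayer -vo xv "$video"
fi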
How can I install the win32 codecs (free) in openSUSE 11? I have got the GStreamer codec pack, but on compiling it says packages like gcc, glib, etc. are missing. How can I get them, and from where?
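A hedged sketch: the build tools come from the standard repos, and prebuilt win32 codecs from the Packman repository (the URL is a guess for an 11.x release; adjust it to your exact version).
Code:
zypper ar http://packman.inode.at/suse/11.0/ packman
zypper refresh
zypper install gcc glib2-devel w32codec-all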
Today's encoders are getting smarter. They can compress a Blu-ray to similar quality in 700 MB. It seems the header of a video file contains info about the frame rate, the audio/video encoder, etc., which can't be guessed. In MPEG audio, every part of the file is independently playable. If a movie is binary-split into 6 parts and I don't have the first part, then it is unplayable.
Code:
$ ls -lh
-rwxrwxrwx 1 root root 280M 2010-12-07 20:23 irn2-cd1.mkv
-rwxrwxrwx 1 root root  50M 2011-05-26 13:09 last-50M-cd2
-rwxrwxrwx 1 root root  50M 2011-05-26 13:44 first-50M-cd1
$ file *
first-50M-cd1: Matroska data
last-50M-cd2:  data
irn2-cd1.mkv:  Matroska data
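Pieces like these can be produced with dd, which also shows why the tail piece is unidentifiable: the head piece keeps the Matroska header, the tail piece has none (offsets here are chosen to match the 280M file above).
Code:
dd if=irn2-cd1.mkv of=first-50M-cd1 bs=1M count=50
dd if=irn2-cd1.mkv of=last-50M-cd2 bs=1M skip=230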
I need to mirror a website. However, each of the links on the site's webpage is actually a 'submit' to a CGI script that produces the resulting page. AFAIK wget should fail on this, since it needs static links.
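wget can replay a single form submission with --post-data, but it has no way to discover and crawl POST targets recursively, so per-page loops or a crawler such as HTTrack are the usual workaround. A hedged single-page sketch (URL and parameters are placeholders):
Code:
wget --post-data='page=about' http://example.com/cgi-bin/show.cgi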
What would be a nice, simple command to go through all files in a directory (no sub-directories), and change all the MP4 Video files I have to MP3 audio files (keeping the original filenames except for changing the "mp4" extension to "mp3")?
The files in question were videos taken with one of those Flip cameras, but I only need the audio off of them.
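A minimal sketch with ffmpeg: -vn drops the video stream and libmp3lame encodes the audio, keeping the original name with an .mp3 extension (flag spellings vary slightly across older ffmpeg builds).
Code:
for f in *.mp4 *.MP4; do
  [ -e "$f" ] || continue
  ffmpeg -i "$f" -vn -acodec libmp3lame -q:a 2 "${f%.*}.mp3"
done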