General :: Download Helper Creating Zero Size Files
Mar 30, 2011

I've got a small problem with dwhelper. When I try to download anything from ..... DH doesn't download and creates only empty files.
View 1 Replies

The DownloadHelper add-on for Firefox now has a new screen-capture feature, for cases where it can't download the video from a particular site. You click the red button and a purple boundary appears around the video frame. I assume this is the basic function of the thing: it supplies the geometry of the purple frame to the program you've told it to use to capture the video.
By default the program it uses is 'recordmydesktop'. I've tried this but get sound from the laptop's internal microphone, not from the video. I tried 'pavucontrol' to remedy this, but it says (wrongly) that no program is recording, so I can't put things right.
So I tried using ffmpeg: [URL]
Assume the stuff in curly brackets is what DownloadHelper supplies. It sometimes works perfectly, but with BBC iPlayer I only get the top left-hand corner of the video, and it is stretched somewhat horizontally.
In a shell script, I am creating a .tar.gz using this command:
Code:
tar -zcvf dst/lib/library.tar.gz dst/lib
There are two problems:
[code]...
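For reference, one common refinement of a command like the one above: use `-C` so the archive stores paths relative to `dst` rather than embedding the `dst/` prefix, and write the archive outside the tree being archived so tar never tries to include its own output. The paths below are made up for illustration:

```shell
# Hypothetical demo layout under /tmp
mkdir -p /tmp/tar_demo/dst/lib
echo data > /tmp/tar_demo/dst/lib/library.txt

# -C changes into dst first, so members are stored as lib/... not dst/lib/...
tar -zcvf /tmp/library.tar.gz -C /tmp/tar_demo/dst lib

# Verify what got stored
tar -tzf /tmp/library.tar.gz
```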
I bought a new SD card which I intend to put some MP3s on - except that I can't write to it because it tells me the destination is read-only. No probs, thinks I: I'll just reformat it.
"Error creating file system: helper exited with exit code 1: cannot open /dev/mmcblk0p1: Read-only file system"
Various chmod commands all result in "Read-only file system". I tried umount then mount, but once I'd unmounted it, mount couldn't find it at the same /media/ path (I assume that's the only one).
Our LAN is connected to the internet via a leased line, and it is getting slow at peak times. We suspect someone might be downloading huge files. Instead of SSH-ing into each system and checking the download volume (ethstatus), is it possible to see this from network-bandwidth monitoring software?
View 2 Replies

How can you create a script to move or copy files from a main directory into multiple directories below the main directory?
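A minimal sketch of one way to do it, assuming the goal is to copy every regular file in the main directory into each immediate subdirectory (the /tmp/main_demo layout is made up for illustration):

```shell
#!/bin/bash
# Hypothetical layout: a main directory with files plus subdirectories sub1, sub2
main=/tmp/main_demo
mkdir -p "$main"/sub1 "$main"/sub2
echo hello > "$main"/a.txt
echo world > "$main"/b.txt

# Copy each regular file in the main directory into every subdirectory below it;
# the [ -f ] test skips the subdirectories themselves.
for dir in "$main"/*/; do
    for file in "$main"/*; do
        [ -f "$file" ] && cp "$file" "$dir"
    done
done
```

Swap `cp` for `mv` to move instead, but then the inner loop empties the main directory after the first subdirectory, so moving usually wants a different distribution rule.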
View 7 Replies

Creating files in Red Hat Linux?
View 1 Replies

I was prompted to download some updates after a fresh install, but it was only partially successful. When I rebooted, the screen froze with the message: "Cannot open ConsoleKit session - permission of the setuid is not correct."
View 1 Replies

I have a few situations for which I don't see anything in the du man pages. Quote: 1) I want to see only the files in a subdirectory which are larger than a particular size. 2) I use du -sh > du_output.txt and see the output as described for options -s and -h; however, what I am more interested in is output in a format like, say, for example:
Code:
dir0--->dir1-->dir3-->dir4
| |
[code]....
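For the first part, find (not du) is the tool that filters by file size; for the tree-style summary, GNU du can limit how deep it reports. A sketch with a made-up demo directory and an assumed 10 MB threshold:

```shell
# Demo layout (hypothetical): one big file deep down, one small file at the top
mkdir -p /tmp/du_demo/sub
dd if=/dev/zero of=/tmp/du_demo/sub/big.bin bs=1M count=11 2>/dev/null
dd if=/dev/zero of=/tmp/du_demo/small.bin bs=1K count=1 2>/dev/null

# 1) only files larger than a particular size (here 10 MB):
find /tmp/du_demo -type f -size +10M

# 2) per-directory totals down to a chosen depth (GNU du):
du -h --max-depth=2 /tmp/du_demo
```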
A full snapshot of the Arch Linux repo for x86_64. I want to use this as my restore backup should I need to reinstall Arch without network support. How do I build several tar volumes of my /mount/my_repo to fit onto 4.5 GB DVDs? The thing is 18 GB in size. And how do I extract all the tar volumes to a folder later on? Is it the same as extracting a single tar - will tar find all volumes at the same directory level and continue extracting, or do I need to merge them in some way?
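tar's native multi-volume mode (-M) can't be combined with -z compression, so a commonly suggested alternative is to pipe the archive through split and simply cat the pieces back together in order at restore time. A small-scale sketch (for real DVDs the piece size would be `-b 4500m`; the tiny sizes and /tmp paths here are just for the demo):

```shell
# Hypothetical mini-repo standing in for /mount/my_repo
mkdir -p /tmp/repo_demo /tmp/restore_demo
rm -f /tmp/repo.tar.part_*
dd if=/dev/urandom of=/tmp/repo_demo/pkg.bin bs=1K count=5 2>/dev/null

# Archive and cut into fixed-size pieces (1 KB here; 4500m for DVDs)
tar -cf - -C /tmp repo_demo | split -b 1k - /tmp/repo.tar.part_

# Restore: concatenate the pieces in their natural (suffix) order, untar once
cat /tmp/repo.tar.part_* | tar -xf - -C /tmp/restore_demo
```

No merging step beyond `cat` is needed; the pieces are just a single archive stream sliced at byte boundaries.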
View 1 Replies

How can I randomly write/create a 1 GB file in bash to test disk/network I/O? I was told I could use the 'dd' command, but I don't know if there are better ways, or what the 'dd' command looks like.
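The usual dd incantation looks like this; for the real 1 GB case use `count=1024`, while the demo below writes only 8 MB so it runs quickly (the /tmp path is arbitrary):

```shell
# Zeros are fast to generate but compress trivially:
dd if=/dev/zero of=/tmp/io_testfile bs=1M count=8 2>/dev/null
# For a network test, incompressible random data is usually better:
#   dd if=/dev/urandom of=/tmp/io_testfile bs=1M count=1024

# Confirm the size in bytes
wc -c < /tmp/io_testfile
```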
View 7 Replies

Concatenating two files without creating a newline between them - how is that possible? I've tried the following:
Code:
echo 123 > file1
echo 456 > file2
cat file1 file2 > file3
[Code]....
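Worth noting: cat never inserts newlines itself - the newline between "123" and "456" was written by echo when the files were created. Creating the files with `printf '%s'` (which emits no trailing newline) makes the plain cat do exactly what's wanted:

```shell
# printf '%s' writes the text with no trailing newline, unlike echo
printf '%s' 123 > /tmp/file1
printf '%s' 456 > /tmp/file2

cat /tmp/file1 /tmp/file2 > /tmp/file3
cat /tmp/file3   # prints 123456
```

If the input files already end in newlines, stripping them (e.g. with `tr -d '\n'`) before concatenating is another route.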
I want to restrict a particular user from creating any single file beyond a particular size, i.e. he should not be able to create a file over, say, 10 MB, but he can still use up to 10 GB in total (not quota space, I mean).
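This is what `ulimit -f` does: it caps the size of any single file a process may create, independently of quotas. A sketch (run in a subshell so the limit doesn't stick to the session; the 10 MB value is the assumed threshold, and in bash the units are 1 KB blocks). For a persistent per-user limit, a hard `fsize` line in /etc/security/limits.conf is the usual route:

```shell
(
  ulimit -f 10240   # 10240 KB = 10 MB max file size for this subshell
  # dd tries to write 20 MB; past the limit it receives SIGXFSZ and stops
  dd if=/dev/zero of=/tmp/limited.bin bs=1M count=20 2>/dev/null
  true
)
# The file was cut off at the limit
wc -c < /tmp/limited.bin
```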
View 6 Replies

To find the space occupied by files modified more than 4 years ago, I tried the following. I am wondering if it is right?
Code:
find /temp -type d ! -name ".*" -mtime +1460 | wc -l |du -sh
I tried this, but it sits there for a long time (of course the path I tried has a lot of files), so I am not sure if this is right.
P.S.:
SHELL=bash
OS=RHEL5
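Two issues with the pipeline above: `-type d` selects directories although files are wanted, and piping through `wc -l` hands du a line count rather than file names, so du never measures anything. A sketch of one working form, assuming GNU find/du (the /tmp/old_demo files are made up; `touch -d` backdates one of them past the 1460-day mark):

```shell
# Demo files: one old, one fresh
mkdir -p /tmp/old_demo
touch -d "2015-01-01" /tmp/old_demo/ancient.txt
touch /tmp/old_demo/recent.txt

# Hand the matching files to du directly; -c appends a grand total line
find /tmp/old_demo -type f -mtime +1460 -exec du -ch {} + | tail -n 1
```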
I am directly creating "qf" and "df" files in the sendmail queue folder and then processing the queue from the command line. This is the only way to export data and email from the old application I am stuck with. It works quite well in my test environment, but I am really new to linux/sendmail and am just looking for feedback on this process. Is this direct creation of queue files safe, and are there any pitfalls I should be aware of?
View 2 Replies

I need help with this issue: how to find files with unusual sizes and unusual names, e.g. just dots, names ending with space(s), names containing shell wildcard characters, names containing non-ASCII (control) characters.
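find's -name patterns cover most of these cases; a few sketches (the /tmp/odd_demo files and the 100 MB threshold are made up for illustration):

```shell
# Demo files: one suspicious name, one normal
mkdir -p /tmp/odd_demo
touch "/tmp/odd_demo/bad name.txt" "/tmp/odd_demo/normal.txt"

# Names containing spaces (incl. trailing spaces):
find /tmp/odd_demo -name "* *"
# Names containing control / non-printable characters:
find /tmp/odd_demo -name "*[[:cntrl:]]*"
# Hidden / dot-only names:
find /tmp/odd_demo -name ".*" -type f
# Unusually large files (threshold is an assumption):
find /tmp/odd_demo -type f -size +100M
```

Quoting the patterns is essential so the shell doesn't expand the wildcards before find sees them.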
View 3 Replies

I made an account under freeshell.org and it has been very satisfactory so far; I recommend everyone get an account under freeshell.org. But anyway, how do I find files over, for example, 500 KB in my entire shell account?
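find's -size test does this directly; in a real shell account the search root would be $HOME rather than the demo directory used below:

```shell
# Demo files: one over 500 KB, one tiny
mkdir -p /tmp/size_demo
dd if=/dev/zero of=/tmp/size_demo/big.bin bs=1K count=600 2>/dev/null
touch /tmp/size_demo/small.txt

# All regular files larger than 500 KB under the chosen root
find /tmp/size_demo -type f -size +500k
```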
View 1 Replies

I'm trying to write a script that deletes the files and directories more than 10 weeks old in my /tmp directory. This is what I'm trying to do:
Quote:
#!/bin/bash
#Script to delete files and PDFs from /tmp
### Directories/PDFs to delete
[code]....
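The core of such a script can be two find commands: one for old files, one to sweep up directories left empty. A sketch assuming GNU find, with 10 weeks taken as 70 days and a made-up demo directory standing in for /tmp:

```shell
# Demo tree: one stale file in a subdirectory, one fresh file
mkdir -p /tmp/clean_demo/olddir
touch -d "2020-01-01" /tmp/clean_demo/olddir/stale.pdf
touch /tmp/clean_demo/fresh.txt

# Delete files older than 70 days; -mindepth 1 protects the top directory itself
find /tmp/clean_demo -mindepth 1 -type f -mtime +70 -delete
# Then remove any directories the first pass left empty
find /tmp/clean_demo -mindepth 1 -type d -empty -delete
```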
Actually I am watching one directory into which files are continuously arriving. Is there any command which can give a listing of all files that have arrived in the last 24 hours?
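find's -mtime -1 lists files modified within the last 24 hours; if "arrived" means copied in with an older mtime preserved, -ctime/-cmin (change time) is usually the better test. A sketch with made-up demo files:

```shell
# Demo: one backdated file, one just-created file
mkdir -p /tmp/watch_demo
touch -d "2020-01-01" /tmp/watch_demo/old.log
touch /tmp/watch_demo/new.log

# Files modified in the last 24 hours
find /tmp/watch_demo -type f -mtime -1
```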
View 8 Replies

Why is my scanning always creating huge 50 MB to 100 MB PDF files? Each A4 pnm file is about 6.5 MB at a resolution of 150. If I decrease the resolution below 100, my text starts to become unreadable.
View 11 Replies

I managed to delete some files from the system. Now I need to recover them. I know the inode # (through ext3undel) and also the size. Quote: "Unfortunately, we cannot automatically obtain the name of a deleted file from Unix file systems - since the connection between the iNode (which holds the MetaData, including the file name) and the real data is dropped on deletion. However, we can obtain a list of names from the deleted files." How can I use this information to recover the files? Also, can I search for text on a partition (where the file no longer exists)? I need the figures.
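On the second question: yes, grep can scan a raw partition device for text, since the deleted data blocks may still be on disk. A sketch on a small disk image standing in for the real device (on an actual system the target would be e.g. /dev/sda1, run as root, ideally with the filesystem mounted read-only; the "INVOICE-2011" string is a made-up search term):

```shell
# Build a 64 KB image and plant a known string at byte offset 4096
dd if=/dev/zero of=/tmp/part.img bs=1K count=64 2>/dev/null
printf 'INVOICE-2011 figures' | dd of=/tmp/part.img bs=1 seek=4096 conv=notrunc 2>/dev/null

# -a treats binary as text, -b prints the byte offset, -o only the match
grep -a -b -o 'INVOICE-2011' /tmp/part.img
```

Given an offset, `dd skip=...` can then carve the surrounding bytes out for inspection.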
View 3 Replies

I don't want the 'Downloads' dir in my home. I don't have it on any other computer and there is no problem. But on my desktop, every time I reboot there is a 'Downloads' dir in my home, and I remove it every time (rmdir Downloads). But when I reboot, it is there again! So what is creating my Downloads dir? I repeat, I have another laptop with the same Ubuntu version (10.04) and I don't have this problem.
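The likely culprit is the xdg-user-dirs mechanism, which recreates the standard folders at login according to ~/.config/user-dirs.dirs; pointing XDG_DOWNLOAD_DIR at $HOME itself is the conventional way to stop the folder coming back. A sketch using a scratch directory so nothing real is touched (edit the real ~/.config/user-dirs.dirs the same way):

```shell
# Scratch stand-in for the real home directory
demo_home=/tmp/xdg_demo
mkdir -p "$demo_home/.config"

# In the real file this lives at ~/.config/user-dirs.dirs;
# "$HOME/" (kept literal here) disables the separate Downloads folder
cat > "$demo_home/.config/user-dirs.dirs" <<'EOF'
XDG_DOWNLOAD_DIR="$HOME/"
EOF

grep DOWNLOAD "$demo_home/.config/user-dirs.dirs"
```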
View 9 Replies

I have a two-hour-long home video which I edited in a video editor. I'd like to burn it to DVD, but first I need to export it to an mpeg file. Cinelerra doesn't let me render to mpeg; instead it offers .avi or .dv. The problem is that the resulting file size is enormous (1 minute of .avi = 1.2 GB!, or around 500 MB when I output to .dv). What file format would be best to render to without getting an insanely big file? I'd like to keep it under 1 GB if possible.
View 2 Replies

We are currently running Red Hat 5.4 64-bit (build 2.6.18-164.el5) and are having issues with tar files greater than 2 TB in size.
We have been told that upgrading the kernel to support the ext4 file system (supported in version 5.5?) and mounting the current 10 TB NAS share as ext4 may solve our over-2TB file size issues. Are there limitations within the ext3 file system that cause issues with files greater than 2 TB? If so, do we have to update the kernel (complete rebuild), or can we load a package on the Red Hat box to support ext4? And where do we get the upgrade/package? I have logged onto the Red Hat site and can find the full installation packages but not kernel updates. We are not connected directly to the internet and do not have access to the update repository.
I am using Red Hat Linux 2.4. I have 3 folders: dir1 dir2 dir3. I have tarred them like this:
1. tar cvfz tarball_1.tgz dir1 dir2 dir3
2. tar cvfz tarball_2.tgz dir1 dir2 dir3 2> /dev/null (so that it does not display any error messages or operation details to the user)
[usr@machine]$ ls -lrt
-rw-r--r-- 1 usr grp 199843988 May 17 13:39 tarball_1.tgz
-rw-r--r-- 1 usr grp 199837488 May 17 13:53 tarball_2.tgz
But can anyone explain the size difference seen in the ls output?
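The stderr redirect only affects screen messages, never the archive bytes, so a ~6 KB difference most likely means something under the directories changed between the two runs. One way to check whether two tarballs really hold the same data is to compare the member lists and the checksums of the uncompressed streams (the compressed bytes can legitimately differ, e.g. gzip stores a timestamp in its header). Demo with two made-up archives of the same tree:

```shell
# Build two archives of an identical tree
mkdir -p /tmp/tarcmp/dir1
echo x > /tmp/tarcmp/dir1/f
tar -czf /tmp/t1.tgz -C /tmp/tarcmp dir1
tar -czf /tmp/t2.tgz -C /tmp/tarcmp dir1 2> /dev/null

# Compare member lists
tar -tzf /tmp/t1.tgz | sort > /tmp/list1
tar -tzf /tmp/t2.tgz | sort > /tmp/list2
diff /tmp/list1 /tmp/list2

# Compare the uncompressed tar streams
zcat /tmp/t1.tgz | md5sum
zcat /tmp/t2.tgz | md5sum
```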
How to find the total size of all files whose names start with "a"?
OS: SunOS
du -h a* gives individual file sizes only.
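GNU du has -c for a grand total; where du lacks it (as on some SunOS versions), summing the block counts with awk works everywhere. A sketch with made-up demo files:

```shell
# Demo files: two "a" files of 4 KB each, one non-matching file
mkdir -p /tmp/sum_demo
dd if=/dev/zero of=/tmp/sum_demo/apple.bin bs=1K count=4 2>/dev/null
dd if=/dev/zero of=/tmp/sum_demo/ant.bin bs=1K count=4 2>/dev/null
touch /tmp/sum_demo/banana.bin

# GNU du: -c appends a "total" line
du -ck /tmp/sum_demo/a* | tail -n 1

# Portable fallback: sum the per-file KB figures yourself
du -k /tmp/sum_demo/a* | awk '{s += $1} END {print s " KB total"}'
```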
I need to configure a printer in my office, so where can I download PPD files for printers? I searched in [URL]
View 6 Replies

I have tried 3 times to download DVD-7 from http://cdimage.debian.org/debian-cd/...md64/jigdo-dvd, and every time it has failed with just 5 files left to download.
It says:
I cannot begin to describe it. All those hours of downloading for nothing! What is happening here? When I try to just continue on, I get error code 3 aborts and have to start all over.
I have a dedicated Ubuntu server and would like it to download files from eMule, but it has no GUI. I am looking for a command-line tool like rtorrent, but one that handles ed2k links.
View 1 Replies

I want to generate a temporary random list from a directory of files, then determine the size of an arbitrary block of files from this list (say 1-25 or 26-50) and add their names to a file along with some other info for each name. I can generate a random list with file sizes like this: ls -l | sort -R | cut -d " " -f 6, but I'm not sure how to add up the sizes of just a certain block of these files and at the same time save the file names.
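One sketch of the whole pipeline, using GNU sort -R as in the post. Parsing ls -l with cut is fragile (the field positions shift with spacing), so the sizes below come from wc -c instead; the demo directory, the block size of 5, and the output file name are all made up for illustration:

```shell
# Demo directory with files of varying sizes
mkdir -p /tmp/rand_demo
for i in 1 2 3 4 5 6 7 8; do
    dd if=/dev/zero of=/tmp/rand_demo/f$i bs=1K count=$i 2>/dev/null
done

# Shuffle the names, take one block (here entries 1-5), and record
# "name size" lines for that block
ls /tmp/rand_demo | sort -R | head -n 5 | while read -r name; do
    printf '%s %s\n' "$name" "$(wc -c < "/tmp/rand_demo/$name")"
done > /tmp/block_list.txt

# Total size of just that block
awk '{total += $2} END {print total " bytes in block"}' /tmp/block_list.txt
```

For a block like 26-50 instead of 1-25, replace `head -n 5` with e.g. `sed -n '26,50p'`.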
View 2 Replies