At the moment I regularly use the ImageMagick convert command to convert PDFs to images, which is great, but I would like to convert only some of the pages in the PDF instead of all of them. I'm sure the feature must be available; I just haven't been able to find it, even after bashing Google for a few days.
Code:
convert *full.pdf test%d.jpg
This is the command I use at the moment.
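For reference, ImageMagick's convert accepts a zero-based page/frame index in square brackets after a PDF filename, so something along these lines should limit the conversion to selected pages (the ranges below are only examples, and the brackets need quoting so the shell does not expand them):
Code:
# first page only (page indices start at 0)
convert 'full.pdf[0]' test.jpg
# pages 1-3 of the PDF (indices 0-2), keeping the %d numbering
convert 'full.pdf[0-2]' test%d.jpg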
I'm new to Linux. I'm pursuing a bachelor's in computer science engineering, and my final-year project is to develop an "HTML to PDF converter using Linux". I have some basic knowledge of the commands used in Unix.
1) Where can I get a basic idea of the existing converters, their pros and cons, and how they are programmed?
2) Is there any material or source where I can get a basic idea of the programming used to convert a set of linked HTML pages into a single PDF?
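As a starting point for comparing existing converters, command-line tools such as wkhtmltopdf and htmldoc are commonly cited; a minimal invocation looks roughly like this (file names are placeholders, and options vary by version):
Code:
# wkhtmltopdf renders pages with a WebKit engine; several inputs end up in one PDF
wkhtmltopdf index.html chapter1.html chapter2.html output.pdf
# htmldoc bundles plain HTML files into a single PDF as well
htmldoc --webpage -f output.pdf index.html chapter1.html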
I have CentOS 5.4 32-bit (final) installed on my dedicated server.
Until now I ran lighttpd with PHP on this server and all was fine. But yesterday I switched to a website that needs Apache to run, so I installed Apache using the yum install httpd command.
Then I added the virtual host name of my domain in the Webmin panel, but when I try to run my PHP script in the browser, the PHP pages don't open.
Instead, the browser downloads the PHP files (like index.php) when I open them, so I guess Apache is not able to interpret and run PHP pages. Only HTML pages are opening right now.
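That symptom usually means Apache has no PHP module loaded, so it serves .php files as plain downloads. On CentOS a sketch of the usual fix looks like this (package and service names assumed for CentOS 5):
Code:
# install mod_php for Apache; yum also drops a handler config into /etc/httpd/conf.d/php.conf
yum install php
# restart Apache so the module and handler are picked up
service httpd restart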
I'm very new to Linux. I have run a Windows XP server before, with the Apache2 server software for Windows, and didn't have any problems with it, but I really want to set this server up with Linux.
The server is running, but the only thing that comes up in the browser is the HTML pages. The pages call all the images from a folder /images/, but for some reason I can't get it to pull that folder up; none of the graphics show on any of the pages.
I also have a VirtualHost set up, with another website in its own DocumentRoot for the 2nd site, but it won't come up either, even though it is listed as the 2nd VirtualHost and has that folder set as its DocumentRoot.
I'm just using IP addresses for now until I get this all sorted out. If I enter the IP, the main page comes up, just with no graphics. If I enter IP/2ndSiteFolder/, I get a directory listing with the 2nd site folder listed; if I then click on the 2ndSiteFolder name, it comes up as IP/2ndSiteFolder/2ndSiteFolder/, and now the 2nd site comes up, along with all the graphics for that site, even if I go to some of its other subfolders. So it's just weird. I don't know what I need to do to get this to work right.
I know it has to do with how the folders are set up for the VirtualHosts; I just don't know what it is. I have been working on this for a week, and I'm not getting anywhere.
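For comparison, a minimal pair of virtual hosts usually looks like the sketch below; the paths and ServerName values are placeholders. Note that with only IP addresses in use, name-based vhosts cannot be told apart, so the first block always answers, which would explain why the 2nd site only appears via its subfolder path:
Code:
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName site1.example.com
    DocumentRoot /var/www/site1
    # images referenced as /images/ are looked up under /var/www/site1/images
</VirtualHost>

<VirtualHost *:80>
    ServerName site2.example.com
    # the 2nd site gets its own DocumentRoot outside site1's folder,
    # so its pages and /images/ paths resolve from its own root
    DocumentRoot /var/www/site2
</VirtualHost>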
I have Ubuntu 9.04 and just installed Sound Converter. I am trying to convert a bunch of .ogg files to MP3 to play on my iPod, and it's not working so well. In the Sound Converter options I have it set to convert to high-quality MP3. I choose the folder that the files are in and, after a moment (slow laptop), Sound Converter populates; I hit 'Convert' and it shows that the conversion completes in two seconds. All that it did was create the new folder structure of artist/album, but there is nothing in there. Not sure what I am missing. I used Sound Converter before and it worked fine.
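A near-instant "conversion" that only creates empty folders is often a sign that the MP3 (LAME) encoder for GStreamer is missing, since Sound Converter relies on GStreamer plugins. Package names vary by release; on 9.04 the encoder should be in one of the "ugly" plugin packages, so something like this may be worth trying (names assumed):
Code:
# install the GStreamer plugins that provide the LAME MP3 encoder
sudo apt-get install gstreamer0.10-plugins-ugly gstreamer0.10-plugins-ugly-multiverse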
I'm trying to use convert; I have installed ImageMagick. I use this line: convert *.jpg test.pdf, but I'm only able to convert a single JPG file to PDF, not multiple files at once. When there's more than one file, I get the following error: Segmentation fault.
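One workaround that sometimes sidesteps this kind of crash (which often comes down to memory use when many large JPEGs are processed in one go) is to convert the images one at a time and then merge the resulting single-page PDFs with Ghostscript; file names below are only examples:
Code:
# convert each JPEG to its own single-page PDF
for f in *.jpg; do convert "$f" "${f%.jpg}.pdf"; done
# merge the single-page PDFs into one file with Ghostscript
gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -sOutputFile=test.pdf *.pdf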
I have a lot of .flac files downloaded from several sites. Most of them come with a .cue file, the .jpg of the cover, etc. It seems to be the intention of the uploaders that one rebuilds the original CDDA. However, if I had a stand-alone CD/DVD player with FLAC support, I would hardly see the point of converting the FLAC to CDDA. Furthermore, I could even play the FLACs with a software player, although in that case the audio quality would not be as good, due to the noise picked up by the signal from the PC's digital circuits.
Do I have to convert the int to a string using a stringstream and then convert the string to a char, or is there a more direct way? Also, is there a way to tell the length of an int?
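For what it's worth, a small sketch of the stringstream route might look like the following in C++, taking "length" to mean the number of characters in the printed value (that reading is an assumption):
Code:
#include <iostream>
#include <sstream>
#include <string>

int main() {
    int value = -3041;

    // int -> string via a stringstream
    std::ostringstream oss;
    oss << value;
    std::string text = oss.str();

    // the string's characters are already chars; c_str() exposes a C-style char array
    const char *chars = text.c_str();

    // "length" of the int, i.e. how many characters its textual form has
    std::cout << text << " has " << text.length() << " characters\n";
    std::cout << "first char: " << chars[0] << "\n";
    return 0;
}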
I tried to make that as short and descriptive as I could. There are certain web pages that I try to visit on my Linux computer that either respond after more than a minute of waiting or, more often than not, simply do not respond at all, leaving the browser "waiting for xxxxx". Yet when I try those same pages on my GF's Windows computer, connected to the same router/DSL connection, the pages load quickly and without problems. Both systems are running Firefox. One of the pages, for reference: [URL].
I've just installed Ubuntu 9.10 amd64 and I have a problem with access to some web pages (e.g. amd.com, adobe.com). It looks like the browser starts loading a page but then stalls indefinitely. I have Ubuntu 9.10 x64 with an Atheros wireless adapter connected to a ZyXEL access point. A Windows machine connected to the same router has no problem accessing those sites. I have tried different browsers, different DNS settings, and even different versions of 9.10 (x64, x86).
After wrestling with several problems over the past few days of switching to a portable version of Ubuntu 9.10 (I'm running a live USB created using LiLi USB Creator), and deciding between the UNR and the Desktop i386 releases (I eventually went with UNR because the i386 image will not load on my netbook and ends up corrupting itself), I now find myself with the interesting problem that Firefox will no longer load web pages. I'm using a wireless connection, and it worked about 4 hours ago, but doesn't any longer. It registers that it is connected (the network icon displays the connected information), but the web pages themselves will not load (and I don't know whether any other web-based application works; I only bothered to check Firefox).
And I know the connection itself is not to blame, because I'm using it right now to post this message (from Windows 7). Any help that can be given would be appreciated. Except for these problems that come up, I'm loving Ubuntu and intend to use it in between classes and such. EDIT: No internet applications work, and the router does assign me an IP address while in Ubuntu. Also, the speed of the "Server not found" message indicates that it's not even trying to connect.
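Since the "Server not found" page appears almost instantly, it reads as though name resolution, rather than the connection itself, is failing. A quick way to check that from a terminal (the domain is only an example):
Code:
# does DNS resolution work at all?
nslookup google.com
# which nameservers is the system actually using?
cat /etc/resolv.conf
# can a page be fetched over plain HTTP once the name resolves?
wget -O /dev/null http://www.google.com/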
If I surf the net and read web pages, everything works great. But if I, for example, try to write a comment on Facebook, I have to minimize Firefox and then maximize it again; after that, writing works fine.
I am a web developer and fonts are crucial to my job. All I want is for Firefox to render fonts just like it would on a Windows machine; is this possible? For example, I installed Tahoma and the Microsoft core fonts, but in Firefox they don't appear right. This is a struggle for my work and, believe me, I've searched the entire internet for a solution!
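In case it helps, the recipe usually suggested is to install the Microsoft core fonts package, refresh the font cache, and then confirm that fontconfig actually resolves the family name. Package names vary by release (msttcorefonts on older Debian/Ubuntu, ttf-mscorefonts-installer later), and Tahoma is not part of that package, so it normally has to be copied in separately:
Code:
# install the MS core fonts (Arial, Verdana, Times New Roman, etc.)
sudo apt-get install msttcorefonts
# rebuild the font cache and check what 'tahoma' actually resolves to
fc-cache -f -v
fc-match tahoma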
I know this is off the Ubuntu topic, but, IMO, Ubuntu is the best forum for answers on the internet. Is anyone using one of these online editors? I use it as integrated into a CMS, but I would like to build it into some web pages, with the sole purpose of EDITING MY WEBPAGES ONLINE. Kompozer is great, but it has a Mack-truck-sized hole that may never get filled, and I find the (usage) simplicity of CKEditor attractive. The problem is that I am not able to grasp how to do it. The examples are gobbledygook to me. I mean, I try them and they work, but they do not address how to integrate (maybe it's just over my head). Installation and usage are trivial; I just cannot grasp how (if possible) to integrate the editor INTO a web page, or better yet, direct it to a web page.
Pages do not load on certain sites (such as LinkedIn and Which?, for example). The sites themselves load, but when I click on links, the pages load forever. With Which? I can close Firefox and open it again, and then the link loads, but after a few clicks they stop loading again. It happens in all browsers (at least, I tried Firefox and Chromium), but everything is OK on Windows.
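When specific sites stall in every browser on Linux but work on Windows over the same line, one thing often worth checking is the path MTU. A hedged sketch of that test (interface name and sizes are examples only):
Code:
# find the largest packet that passes without fragmentation (1472 payload + 28 header bytes = 1500)
ping -M do -s 1472 -c 3 www.linkedin.com
# if that fails, try smaller sizes, then lower the interface MTU accordingly
sudo ip link set dev eth0 mtu 1400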
Out of nowhere, Firefox has just decided not to load web pages anymore. I've restarted my router, restarted my wifi card, restarted my computer, *completely* uninstalled Firefox and then reinstalled it, and still no luck at all. Does anyone have any idea how to fix this? I'm stuck using Google Chrome now; I like it, but I need the add-ons Firefox has (Adblock Plus, Xmarks, DownThemAll, etc.).
Here are my specs: Ubuntu 10.04 64-bit, Acer Aspire 3680, Intel Celeron M 1.6 GHz (single core), 1 GiB RAM, 40 GiB hard drive.
Basically, all it does is sit there saying "Connecting to www.google.com..." for about 15 seconds, then gives me one of two pages: "The connection has timed out - The server at www.google.com is taking too long to respond." or "Unable to connect - Firefox can't establish a connection to the server at www.google.com." Google Chrome, Skype, Ubuntu's built-in messenger, System Update, and so on connect to the internet just fine.
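Since everything except Firefox can reach the internet, the problem may be confined to the Firefox profile or its proxy settings. Testing with a brand-new profile is a cheap way to rule that out (the profile name below is arbitrary):
Code:
# start the profile manager and create a throwaway profile to test with
firefox -ProfileManager
# or launch straight into a named fresh profile alongside the existing one
firefox -P testprofile -no-remote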
Recently I noticed that man pages are no longer available to a normal user:
Code:
$ man grep
No manual entry for grep
See 'man 7 undocumented' for help when manual pages are not available.
With root privileges everything works fine.
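If it works for root but not for a normal user, it is often a permissions problem somewhere under the man hierarchy. These commands show where man is looking and whether an ordinary user can read it (grep is just the example page, and the paths assume a typical Debian-style layout):
Code:
# which file would be displayed, and what is the search path?
man -w grep
manpath
# can a normal user actually read the directories and the page file?
ls -ld /usr/share/man /usr/share/man/man1
ls -l /usr/share/man/man1/grep.1.gz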
I set my terminal's colour scheme to a black background with white letters. Everything works great except man pages. When I read a man page, the titles look black, so they are unreadable on the black background. The paragraph text looks fine; it's the headers (and anything that is *bold*, I think) that look wrong.
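Man pages are displayed through less, and the colour it uses for bold text can be overridden with the LESS_TERMCAP_* variables. One workaround is therefore to remap "bold" to something visible on a black background (the colour code here is just an example):
Code:
# show bold text (headings, keywords) in bright yellow instead of the terminal's default bold colour
export LESS_TERMCAP_md=$'\e[1;33m'
# reset attributes at the end of bold text
export LESS_TERMCAP_me=$'\e[0m'
man grep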
Yesterday I upgraded from 9.10 to 10.04 using the built-in upgrader. After upgrading, whenever I open Firefox, it just says "The connection has timed out" for every page, even though I am connected to the internet. w3m will not load any web pages either. But I am definitely connected to the internet, because Pidgin works, apt-get works, and ping and telnet work.
I can ping google.com successfully each time, then attempt to load google.com in Firefox, and it says that it cannot connect to the host. I have tried both "Connect directly to the internet" and "Connect using system proxy settings" in Firefox (the system settings are to connect directly to the internet).
I then thought that maybe something was blocking port 80, so I used telnet (telnet google.com 80), and with that I got the HTML response of [URL]. What can I do to make my computer able to browse the web again? I am running Ubuntu 10.04 with kernel 2.6.32-22 on a Compaq Presario CQ60-417DX notebook.
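Given that telnet to port 80 works while both Firefox and w3m fail, the remaining suspects are usually a stale proxy setting carried over from the upgrade. These checks show which proxy variables and desktop proxy settings are in effect (the gconf keys assume the GNOME 2 desktop shipped with 10.04):
Code:
# any proxy environment variables left over from before the upgrade?
env | grep -i proxy
# what does the GNOME desktop think the proxy mode is?
gconftool-2 --get /system/proxy/mode
gconftool-2 --get /system/http_proxy/use_http_proxy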
When I reload the index page of the General Help forum, the screen view is not the same as it was. When I read the index page, the first thread is listed at the top of the screen, and I scroll down to get that. After reloading, I see the top of the page at the top of the screen, and I have to scroll down again. This used to be different: after reloading the page, I would get the same view. See the attached pictures to see what I mean. The first one is the view I like to see, the second is what I get after reloading the page. Is this something caused by software on my computer, or is this done on the website side?
I use Mint 9 (based on Lucid 10.04) in the 64-bit version, with Google Chrome as my web browser. To reload pages automatically, I use the Auto Reload plug-in and have it set to 10 seconds. I just saw that it works fine in Firefox, so it must be Chrome. I also posted this thread in the "Forum Feedback and Help" forum, but since that is a less popular forum, I wanted to post it here too.
Whenever I go to Pogo.com to play some games, it says I need a Java plugin. How do I do this? I click download, and another page comes up, but I don't know which one to choose! (Look at the screenshots: Screenshot-1.png, Screenshot-2.png)
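Rather than picking a download from the website, the browser plugin is normally installed from the Ubuntu repositories. On releases of that era the packages below provided it (names assumed; the Sun one requires the partner repository to be enabled in Software Sources):
Code:
# free OpenJDK-based browser plugin
sudo apt-get install icedtea6-plugin
# alternatively, Sun Java's browser plugin from the partner repository
sudo apt-get install sun-java6-plugin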
When I print a document and select only certain pages, the printout still prints all the pages. This is very annoying, since I have to print everything from another computer. I am using Ubuntu 10.04 with the latest updates.
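As a workaround while the print dialog's page selection is being ignored, CUPS can be given the page range directly on the command line; a sketch (the file name and range are examples):
Code:
# print only pages 2-4 of the document through CUPS
lp -o page-ranges=2-4 document.pdf
# the same option works with lpr
lpr -o page-ranges=2-4 document.pdf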
Whilst I like Ubuntu as an operating system and am generally fine with Firefox, could anyone explain why I sometimes get errors trying to get through to various web pages? I don't get the same errors in IE8. As an example, my telephone billing provider: I can log in on both IE8 and Firefox, and I can get into all the screens in IE8 but not always in Firefox, as happened to me 30 minutes ago (I think the version installed on Ubuntu 9.10 is 3.5). There is a message at the bottom of the screen showing that Firefox is trying to connect to a sub-page, but it just can't display it. Is there anything I can do to resolve this, and is it an encryption issue? This isn't the first time I've had issues with Firefox; in Ubuntu 9.04 the screen sometimes used to go fuzzy when I used the Firefox version there.
I can't seem to bring up any web pages, but I can do everything else on my Ubuntu box. This seems to have started within the past day or so. My /etc/resolv.conf file looks solid. Again, I can ssh, VNC and ping to the outside, but I just can't load any web pages.
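Since ssh, VNC and ping all work, it helps to see whether plain HTTP works outside a browser; if a command-line fetch succeeds, the problem is likely in the browser or its proxy settings rather than the network (the URL is only an example):
Code:
# confirm that names resolve to addresses
host google.com
# fetch just the headers over plain HTTP, bypassing the browser entirely
curl -I http://www.google.com/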