This is about accessing DOS-era CD-ROMs under Linux. The characters in directory and file names appear as "Chinese". Since I know I've loaded and installed programs from these CD-ROMs onto a Windows 2000 machine, I'm wondering why I can't read the file names now; they are definitely in English. I've researched and found mount -o "characterset", but I shouldn't need that, as these are not foreign-language CD-ROMs. The only other thing I can think of is that they are both degraded, but I would not have expected that of commercial CD-ROMs.
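Garbled names on an ISO 9660 disc mastered under DOS/Windows are usually a character-set translation issue rather than disc damage. A hedged sketch of things to try; the device name (/dev/sr0) and mount point are assumptions, and codepage 437 is only a guess at the original DOS code page:

```shell
# assumed device /dev/sr0 and mount point /mnt/cdrom
mkdir -p /mnt/cdrom
# first try letting the kernel translate names to UTF-8
mount -t iso9660 -o ro,iocharset=utf8 /dev/sr0 /mnt/cdrom
# if names are still wrong, try ignoring the Joliet/Rock Ridge
# extensions and forcing the old DOS code page instead
# mount -t iso9660 -o ro,norock,nojoliet,codepage=437 /dev/sr0 /mnt/cdrom
```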
What command could I use in terminal to delete all ASCII characters? That is, delete a-z, A-Z, 0-9, and all punctuation? I have a file containing Chinese characters, and I want to remove everything else and leave just the Chinese.
I can use grep to leave only the lines that have Chinese in them, but this still leaves a lot of non-Chinese stuff on those lines. Does anyone know how I could actually remove everything that isn't Chinese?
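One way to do this, sketched with sed. It is safe even when sed runs in a bytewise (C) locale, because UTF-8 multibyte characters never contain ASCII bytes, so the ASCII ranges cannot match inside a Chinese character:

```shell
# delete ASCII letters, digits, punctuation and spaces, keeping CJK text
printf 'abc中文123, def 字!\n' | sed 's/[a-zA-Z0-9[:punct:] ]//g'
# prints: 中文字
```

To drop the lines that become empty as a result, pipe the output through `grep .` as well.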
I updated my system today. Just before that, everything was normal. Once it finished, I found that KDE can't display some characters. I am a Chinese user. English words are normal; it's just some Chinese characters that can't be displayed. When I refresh the browser they appear again, but some are in the default font and some are in the YaHei font, mixed together, which is not pretty. And it's not only the browser: anywhere Chinese characters are displayed looks the same. How can I change them back to a single default font, instead of two fonts mixed together?
My installation of Slackware Linux 12.0 seemed OK: I can see the beautiful KDE window, and I can read Chinese homepages in the Firefox browser. But I don't know how to input Chinese characters. I in fact installed everything from the DVD package. It appears that SCIM was installed, but I don't know if I have CLE.
I am a beginner. Though there is software installed for almost everything, I found that it is not easy for a Chinese learner to use. Almost every PC user around me prefers sip to Linux; it's likely there is still a long way to go, but I have every confidence in myself! One of my questions is: why does my PDF reader not support Chinese characters? What should I do to improve this situation?
When I tried to deploy SquirrelMail and configured the display language to "Chinese Simp" or "Chinese Trad", it could not display the mail folders and reported "Reason Given: GB2312 character set is not supported." I checked locale/zh_CN[zh_TW]/LC_MESSAGES/squirrelmail.po: the file's charset is utf-8, and locale/zh_CN[zh_TW]/setup.php also sets the charset to utf-8, but functions/i18n.php maps the charset for zh_CN/zh_TW to gb2312/big5. After changing that to utf-8 it works fine. However, all received messages show corrupted Chinese characters: either the subject is corrupted, or the body is.
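The fix described above can be applied as a one-liner; the path is the one given in the post, relative to the SquirrelMail install directory. Note this only changes the charset SquirrelMail advertises for the interface, not the encoding of mail that was already received, which may explain why incoming messages still look corrupted:

```shell
# back up, then change the charset functions/i18n.php assigns
# to zh_CN/zh_TW from gb2312/big5 to utf-8
cp functions/i18n.php functions/i18n.php.bak
sed -i 's/gb2312/utf-8/g; s/big5/utf-8/g' functions/i18n.php
```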
I just installed CentOS 5.2. I have both fonts-japanese and fonts-chinese installed, but characters are not displayed correctly. All Chinese and Japanese characters are displayed as boxes of hexadecimal digits, except Japanese kana. How can I make them display correctly?
*** Appendix 1: /etc/X11/xorg.conf ***
# Xorg configuration created by system-config-display
Section "ServerLayout"
    Identifier  "single head configuration"
    Screen 0    "Screen0" 0 0
    InputDevice "Keyboard0" "CoreKeyboard"
    InputDevice "Synaptics" "CorePointer"
EndSection
.....
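A quick way to check whether fontconfig actually sees any CJK glyph coverage, independent of the X configuration above (assumes fontconfig is installed, which it normally is on CentOS 5):

```shell
# list font families that claim Chinese and Japanese coverage
fc-list :lang=zh family
fc-list :lang=ja family
# rebuild the font cache after installing or moving font packages
fc-cache -fv
```

If the fc-list commands print nothing, the installed font packages are not being picked up, which would explain the hex boxes.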
Is there a way to get Document Viewer to display Chinese characters in a PDF? Adobe Reader does, but I would prefer to avoid proprietary software. I cannot get either Document Viewer or Okular to properly show Chinese characters in PDF documents downloaded from my college class homepage.
I have all the Chinese language support files, bells and whistles (both traditional and simplified) loaded and operational. When I create a Chinese document in OO Writer and save it as a PDF, both Document Viewer and Okular display the Chinese characters properly. I just cannot get either of them to display Chinese in PDFs downloaded from the website of my course's online textbook/workbook.
Running 9.10 full boat version on an EEE 1000HD netbook.
Using Fluxbox, have tried this in XFCE and KDE. Chinese characters display properly in whatever browser I use online. I do need to see some in the file manager and this is not working.
I have installed the following Chinese display files from Slack -
On Linux Mint, FBReader (both the latest version and the one in the 10.04 repositories) displays Chinese characters as boxes (see screenshot) for some reason, but on Windows it works fine. Is there any way to fix it?
I heard that after installing Debian it should display Chinese text normally without any further work, but I have trouble seeing any of the Chinese characters.
So I googled, installed Chinese fonts, and used dpkg-reconfigure locales to add Chinese support, but none of that worked.
After installing Debian Squeeze, which packages should one install to enable a Chinese interface alongside the usual English interface? Is this applicable to both GNOME and KDE?
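A sketch of the usual steps on Squeeze; the package names below are my recollection of that release and may differ slightly. Because fonts and locales are system-wide, this part applies equally to GNOME and KDE:

```shell
# Wen Quan Yi fonts cover both simplified and traditional Chinese
apt-get install ttf-wqy-zenhei ttf-wqy-microhei
# generate zh_CN.UTF-8 and/or zh_TW.UTF-8 (select them in the dialog)
dpkg-reconfigure locales
# an input method is needed for typing Chinese, e.g. ibus with pinyin
apt-get install ibus ibus-pinyin
```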
+++-=====================-=====================-========================================================== ii k3b 1.90.0~rc1-1 A sophisticated CD/DVD burning application
CD-ROM burning doesn't exist anymore... they removed a lot of capabilities.
There are lots of ISOs, and the kernel is the most important thing for the install, since hardware support depends on it directly. Unfortunately kernels keep changing, and it would be useful to have the kernel version listed for each ISO CD-ROM. I guess it might be documented somewhere, but it's not obvious or easy to find (I never found where). Debian is cool, nicest distro ever!
I can't figure out the syntax on the apt-cdrom command.
My CD-ROM drive is flaky, but I have a portable CD-ROM drive attached via USB.
Here's the output of lsusb:

root@home:/etc/apt/apt.conf.d# lsusb
Bus 005 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 009 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 008 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 004 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 007 Device 002: ID 04b3:310b IBM Corp. Red Wheel Mouse
[Code] ....
Here's the output of df:

root@home:/etc/apt/apt.conf.d# df
Filesystem     1K-blocks    Used Available Use% Mounted on
/dev/sda1      464279312 4089292 436582932   1% /
udev               10240       0     10240   0% /dev
tmpfs            1619088    9132   1609956   1% /run
tmpfs            4047716     164   4047552   1% /dev/shm
[Code] ....
Here is my modified fstab:

root@home:/etc# cat fstab
#
# Use 'blkid' to print the universally unique identifier for a
# device; this may be used with UUID= as a more robust way to name devices
# that works even if disks are added and removed. See fstab(5).
[Code] ....
Here is my configuration file:

root@home:/etc/apt/apt.conf.d# cat config.usb.cdrom
Acquire::/dev/sr1::/media/bill;
Here is my command and output:

root@home:/etc/apt/apt.conf.d# apt-cdrom -d=/media/bill -c=config.usb.cdrom add
Using CD-ROM mount point /media/bill/
Unmounting CD-ROM...
Waiting for disc...
Please insert a Disc in the drive and press enter
[Code] ....
There must be something simple that I have missed.
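One likely culprit: apt-cdrom reads its mount point from the Acquire::cdrom::mount option, and a line of the form Acquire::/dev/sr1::/media/bill; is not an option apt understands, so apt-cdrom falls back to its default mount point. A sketch under that assumption, reusing the /media/bill mount point from the post:

```shell
# point apt-cdrom at the USB drive's mount point
cat > /etc/apt/apt.conf.d/00usbcdrom <<'EOF'
Acquire::cdrom::mount "/media/bill";
EOF
mount /dev/sr1 /media/bill   # mount the disc yourself
apt-cdrom -m add             # -m: don't let apt-cdrom (un)mount
```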
Well, I bought this book, "Debian GNU/Linux 3.1", and it came with a Debian installer. I thought this was too good to be true! During the install I get a message that says "cdrom not detecting install components". I ran the diagnostic utility included on the disc, and it did not report any issues.
I'm trying to create a new VM with Debian 7.6. I downloaded the 3 install iso discs and tried to install. When I begin the installation it seems to take control of the CDROM and I am no longer able to eject it by either pressing the eject button or trying to eject through Windows. The installation goes fine until it asks if I have any other discs to install (which I do ... I have 2 more). At this point I say "Yes" but I cannot get the CDROM to open up at all since it doesn't automatically eject for another disc and as I mentioned before ... I can't do it manually or in Windows.
- I cannot do an installation over the internet, because this is a government computer which is required to be hooked to a government network. Certain certificates are required to access the network, and they are not available to my newly made (or in-the-process-of-being-made) Linux system, only to my host Windows system. - I cannot use a newer version of VirtualBox, because that opens up a whole new set of issues with WinVerifyTrust returning CERT_E_REVOCATION_FAILURE, most likely due to our Symantec security software.
So my Storage Settings in VirtualBox are as follows for my CDROM drive: - Controller: IDE Controller - Type: PIIX4 - "Use Host I/O Cache" is checked - Host Drive D: - CD/DVD Drive: IDE Secondary Master - "Passthrough" is checked
I cannot install the Debian 7.6 OS because I cannot eject each disc and put in the next disc. The only way I can eject the disc is if I reboot my entire system (host windows system) which obviously leaves me at an unfinished point in my install and therefore I have to restart the install and end up right where I was 30 minutes earlier.
The CDROM/DVD drive in the new computer was defective. Everything is working now.
My ancient Dell desktop finally kicked the bucket, but the hard drive was relatively new (less than a year old). So I replaced that old desktop with a new desktop (not Dell this time) and decided to just plug in the old hard drive to see what would happen. Well, I was pleasantly surprised when Debian booted up and ran as normal (OK then, ALMOST normal)!
The problem is that the CD-ROM/DVD-RW drive is not recognized. For example, I have an audio CD in it, but neither file managers nor multimedia players can see it (where 'it' is either the CD itself or the CD-ROM device).
I've tried GUI applications like MPlayer, KPlayer, VLC, Dolphin, and PCManFM, and I've tried the command line as well (bash). Short of doing a complete reinstall of Debian, is there a way to get it to recognize this 'new' hardware?
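Before reinstalling, it's worth checking whether the kernel itself registered the drive; a sketch (device names are the usual defaults and may differ on this machine):

```shell
# did the kernel register an optical drive at boot?
dmesg | grep -iE 'cdrom|cd-rom|sr[0-9]|dvd'
# is there a device node and the usual symlink?
ls -l /dev/sr0 /dev/cdrom
# can a disc be mounted by hand? (audio CDs have no filesystem,
# so use a data disc for this test)
mount -t iso9660 /dev/sr0 /mnt
```

If dmesg shows nothing, the problem is at the hardware/driver level rather than in the desktop applications.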
Just installed Squeeze today. All went well, but when I put a CD-ROM or DVD disc in, it will not auto-mount; I have to mount it manually from a terminal. I've looked at /etc/fstab and don't see any problems there. Has anyone else run into this problem on a Squeeze install and fixed it?
While modifying the definition of my PS1, I saw that "\[" and "\]" markers should be added to help bash compute the right display length. Many examples on the web do not use them or even mention them. I searched for a way to add them automatically, for instance with sed, but didn't find any example. Are they still needed, and is there a recommendation against using sed to define PS1?
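The markers are still needed whenever the prompt contains non-printing sequences: without them, bash counts the escape bytes as visible characters, so line editing and wrapping go wrong on long commands. A minimal example (the colors are just illustrative):

```shell
# green user@host, default-color working directory;
# each \[ ... \] pair marks an escape sequence as zero-width
PS1='\[\e[0;32m\]\u@\h\[\e[0m\]:\w\$ '
```

Writing the markers by hand in the PS1 definition is simpler and less error-prone than trying to inject them afterwards with sed, since sed can't reliably tell which parts of an arbitrary prompt string are non-printing.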
I am attempting to install Debian for the first time on my PC, which has no CD-ROM drive. I downloaded the Jessie CD image and wrote it to a 4 GB stick, but it didn't work. Then I tried the netinstall image and hit the same issue.
To write the USB stick I used UNetbootin first, then tried Win32DiskImager, and finally tried dd while the stick was not mounted.
The issue is still the same: I boot from the USB stick, and after selecting language and keyboard it fails to detect a CD-ROM drive (there is no drive at all in my PC). Same behavior in normal and expert mode.
I'm also unable to manually specify the drive (it looks at /cdrom, and I wanted to change it to the USB stick itself, or mount the USB stick on /cdrom, but I can't find my stick in /dev).
I tried to install Skype with apt-get install skype, but got the following error: Failed to fetch cdrom:[Debian GNU/Linux 6.0.1a _Squeeze_ - Official i386 DVD Binary-1 20110322-15:11]/pool/main/q/qt4-x11/libqtgui4_4.6.3-4_i386.deb Hash Sum mismatch
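A Hash Sum mismatch on a cdrom: source usually means the disc (or drive) can no longer be read reliably. One hedged workaround is to stop using the DVD as a package source and let apt pull the dependency from a network mirror instead; note that skype itself is not in the Debian archive, so it would still need its own .deb or third-party repository:

```shell
# comment out the DVD entry so apt uses network mirrors only
sed -i 's/^deb cdrom:/# deb cdrom:/' /etc/apt/sources.list
apt-get update
```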
I have been trying the Plop boot floppy to boot a bootable CD-ROM from a portable USB CD-ROM reader, but the USB CD-ROM is not recognized. I was thinking that with GRUB, GRUB2, or syslinux that would be possible, no?
I have my openSUSE 11.1 box set up with UTF-8; however, every time I try to open a file containing UTF-8 characters with vi, it can't handle those characters properly.
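If the terminal itself is UTF-8 but vim mis-detects file encodings, forcing them in ~/.vimrc usually helps; a sketch:

```shell
# append encoding settings to ~/.vimrc:
# encoding      - vim's internal/display encoding
# fileencodings - detection order when reading a file
cat >> ~/.vimrc <<'EOF'
set encoding=utf-8
set fileencodings=ucs-bom,utf-8,latin1
EOF
```

If the LANG/LC_CTYPE environment variables are already a .UTF-8 locale, vim normally picks this up on its own, so it's also worth checking the output of `locale` first.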
I'm having a problem getting the console to display special characters. I can type special characters on the command line, but they aren't output properly when using something like aptitude or man. What I find strange is that in X the same programs work fine.
Here are the locale settings:
LANG=en_GB.UTF-8
LC_CTYPE="en_GB.UTF-8"
LC_NUMERIC="en_GB.UTF-8"
[Code].....
Unfortunately I don't know how to find the font settings.
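On Debian-family systems, the text console (outside X) gets its font and character map from console-setup; a hedged sketch of where to look, assuming that package is in use:

```shell
# is the virtual terminal in UTF-8 mode? (1 = yes, 0 = no)
cat /sys/module/vt/parameters/default_utf8
# reconfigure the console charmap and font; choose UTF-8 and a
# font with good coverage, such as Terminus
dpkg-reconfigure console-setup
```

A mismatch here would explain why the same programs render correctly under X (which uses its own fonts) but not on the raw console.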
Debian won't display Japanese characters properly, it shows them as symbols. Is there a language pack or a particular browser plugin I need to install? It's sort of a noobish question, but I looked for something related to this issue in my Package Manager, and didn't find anything that seemed suitable/related.
I would like to be able to display and type all the characters/letters in my browser, in the character map, and anywhere else you could think of. Right now most of the languages in my character map are displayed as hex codes.
When I run dpkg-reconfigure locales and the GUI comes up, it's mostly strange characters, like in the picture. URL... It's like that in every GUI I open. It's a VPS I'm SSHing into, so my first guess was an SSH client error, but I can't change anything in the client; I'm using Tunnelier.