I need to know more about OpenSSL. In particular, I'm having problems with some basic command-line stuff to do with signing and base64 encoding. You'll have to excuse me, but I'm a security n00b. What is the command for signing a text file with a given private key and then base64-encoding the result? Can this be done with a single command? What's wrong with:
Code:
openssl rsautl -sign -in textfile -inkey privatekey.pem enc -base64 -in textfile
or should that be:
Code:
openssl rsautl -sign -in textfile -inkey privatekey.pem | openssl enc -base64 -
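For what it's worth, a sketch of how this is usually split (file names are the same placeholders as above; note that rsautl signs raw input, so it only handles data smaller than the key, and `openssl dgst -sign` is the usual route for arbitrary files):
Code:
# sign, writing the binary signature to a file, then base64-encode that signature
openssl rsautl -sign -in textfile -inkey privatekey.pem -out textfile.sig
openssl enc -base64 -in textfile.sig -out textfile.sig.b64
# or as a single pipeline
openssl rsautl -sign -in textfile -inkey privatekey.pem | openssl enc -base64 > textfile.sig.b64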
I tried to compile a C program that uses the OpenSSL libraries from the shell, but got this error. I guess the libraries are not linked properly: undefined reference to SSL_library_init()
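That undefined reference is normally a linking problem rather than a code problem: the OpenSSL libraries have to be named on the compile line. A minimal sketch, assuming the dev headers are installed and the source file is called myprog.c (a placeholder name):
Code:
# link against libssl and libcrypto; the -l flags go after the source file
gcc myprog.c -o myprog -lssl -lcrypto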
But when I run "msfconsole", I get the following error messages telling me that ruby-openssl is not installed. I installed it with "apt-get install libopenssl-ruby", but the same message still comes up. I'm running Ubuntu 9.10.
Code:
root@qa-ud910-32-1:/opt/metasploit3/msf3/external/ruby-lorcon2# msfconsole
*** The ruby-openssl library is not installed, many features will be disabled! *** Examples: Meterpreter, SSL Sockets, SMB/NTLM Authentication, and more [-] ***
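One hedged thing to check (an assumption about the cause, not a confirmed fix) is whether the Ruby interpreter that msfconsole actually runs is the same one the package installed the OpenSSL bindings for:
Code:
# confirm which ruby is on the PATH and whether it can load the openssl extension
which ruby
ruby -v
ruby -ropenssl -e 'puts OpenSSL::OPENSSL_VERSION'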
I have a problem: my Ubuntu is the latest distro, but the Shift keys are not working. For example, when I want an @ sign I can't make that sign. The keyboard layout has been changed, and I have even tried most of the solutions found on the site, with no luck.
Whenever I try to sign in to my messaging system it gives me this message and won't let me sign in: <Received unexpected response from [URL] useTLS=1 is not allowed for non secure requests.>
I'm using Ubuntu 10.04 64-bit. I created a PGP key pair using Applications|Accessories|Passwords and Encryption Keys. I used DSA ElGamal as the encryption type and a key strength of 2048 bits. However, when I right-click on a file or folder I don't see the Encrypt... and Sign options.
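If I remember 10.04 correctly, those context-menu entries come from a separate Nautilus extension package rather than from Seahorse itself; a hedged sketch of restoring them (the package name is an assumption for that release):
Code:
# install the nautilus extension that provides the Encrypt.../Sign menu items
sudo apt-get install seahorse-plugins
# restart nautilus so the extension is loaded
nautilus -q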
When I do a "openssl x509 -in server1.pem -issuer -noout" after I've supposedly signed it with the CA, the issuer is, for some reason, the DN string of server1. If server1 generated the CSR, and it is coming up as issued by server1, doesn't that indicate a self signed cert? How could the CA be producing a cert that has an issuer of another server? Am I just completely off base? Sorry, I'm a bit of a newb with the SSL pieces.
I hope this is the right place for this, but I'm having some difficulty using the java keytool and OpenSSL tool on a Solaris system.
I have a server (CA server) with OpenSSL installed that I would like to use as a Certificate Authority. The second server (server1) is a WebLogic server with JDK 1.6.0_21. I'm trying to configure it to use a certificate that has been signed by the CA server.
For some reason it keeps giving me this error when I try to import the signed SSL certificate: keytool error: java.lang.Exception: Public keys in reply and keystore don't match
Am I doing something wrong in this whole process?
1) Generate the private key for the CA server: openssl genrsa -out CA.key -des 2048
2) Generate the CSR on the CA: openssl req -new -key CA.key -out CA.csr
3) Sign the new CSR so that it can be used as the root certificate: openssl x509 -extensions v3_ca -trustout -signkey CA.key -days 730 -req -in CA.csr -out CA.pem -extfile /usr/local/ssl/openssl.cnf
4) On server1, create the server private key keystore: keytool -genkey -alias server1 -keysize 2048 -keyalg RSA -keystore server1.jks -dname "CN=server1.domain.com,OU=Organization,O=Company,L=City,ST=State,C=US"
5) On server1, create a CSR from the recently created private key: keytool -certreq -alias server1 -sigalg SHA1WithRSA -keystore server1.jks -file server1.csr
6) Transfer the CSR over to the CA server so that it can be signed: openssl x509 -extensions v3_ca -trustout -signkey CA.key -days 365 -req -in server1.csr -out server1.pem -extfile /usr/local/ssl/openssl.cnf
7) Transfer the CA public cert to server1 and import it with keytool: keytool -import -trustcacerts -alias CA_Public -file CA.pem -keystore server1.jks
8) Import the recently signed certificate into the app server keystore (this is where I receive the error): keytool -import -trustcacerts -alias server1 -file server1.pem -keystore server1.jks
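From what I can tell, `-signkey` makes `openssl x509` emit a self-signed certificate built from the CSR, which would explain both the server1 issuer and keytool's public-key mismatch. A hedged, untested sketch of how step 6 is usually written with the CA's own cert and key instead (the serial handling here is an assumption):
Code:
# sign server1's CSR with the CA certificate and key rather than with -signkey
openssl x509 -req -in server1.csr -CA CA.pem -CAkey CA.key -CAcreateserial \
    -days 365 -out server1.pem -extfile /usr/local/ssl/openssl.cnf
# on server1: import the CA cert first, then the signed cert under the ORIGINAL alias
keytool -import -trustcacerts -alias CA_Public -file CA.pem -keystore server1.jks
keytool -import -alias server1 -file server1.pem -keystore server1.jks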
Quote: Security expert Georgi Guninski has pointed out a security issue in the 1.0 branch of OpenSSL that potentially allows SSL servers to compromise clients. Apparently the hole can be exploited simply by sending a specially crafted certificate to the client, causing deallocated memory to be accessed in the ssl3_get_key_exchange function (in ssl/s3_clnt.c). While this usually only causes an application to crash, it can potentially also be exploited to execute injected code.
In order to mitigate risks linked to the use of the classic syslog protocol (spoofing, replay, tampering, lost messages...), I am looking for a product implementing the syslog-sign capability: [URL], which is still an IETF draft for the moment. On NetBSD, the syslog daemon is able to run this feature: [URL]. Has anybody tried this feature on a Linux system?
I'm an academic (university networks and security lecturer) studying/teaching network and operating system security, and inspired by the work of Hovav Shacham I set about testing ASLR on Linux. Principally I did this by performing a brute-force buffer overflow attack on Fedora 10 and Ubuntu 9. I did this by writing a little concurrent server daemon which accidentally-on-purpose didn't do bounds checking.
I then wrote a client to send it a malicious string, brute-forcing guessed addresses, which caused a return-to-libc to the function usleep with a parameter of 16m, causing a delay of 16 seconds, as laid out in [URL]. Once I hit the delay I knew I had found the function and could calculate delta_mmap, allowing me to create a standard chained ret-to-libc attack. All of that works fine. However... to complete my understanding I am trying to establish where I can find the standard base address for Ubuntu 9 (and other distros) for the following, taken from Shacham:
Quote:
[code]....
/proc/pid/maps gives me some information but not the base address; ldd also gives me the randomised starting address for sections in the user address space, but neither gives me the base address. Interestingly, when I ran ldd with ASLR on about 100 times and checked the start point of libc, I determined that the last 3 (least significant) hex digits were always 0s and the first 4 (most significant) were between 0xB7D7 and 0xB7F9. To me this indicated that bits 22-31 were fixed and bits 12-21 were randomized, with bits 11-0 fixed. Although even that doesn't define the boundaries observed correctly.
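For anyone repeating the measurement, a rough sketch of the sampling loop that produces those numbers (the binary fed to ldd and the sample count are arbitrary choices):
Code:
#!/bin/sh
# run ldd repeatedly and collect the randomised load address reported for libc
for i in $(seq 1 100); do
    ldd /bin/ls | awk '/libc\.so/ { print $NF }' | tr -d '()'
done | sort | uniq -c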
Note: I am replicating the attack to provide signatures to detect it using IDS, and for teaching purposes. I am NOT a hacker and, if needed, I could reply from my .ac.uk email address as verification.
I have a router that is 1000BASE-T and two computers, each with Ethernet cards that support 1000BASE-T. All are connected with Cat 5e cable. Before, I had a router that only went up to 100BASE-T, and I would set up one box with Linux running proftpd. On the other box I would run Windows XP Pro and use Firefox to FTP into the Linux box and download a file. Download speeds went up to 11.2 MB/sec. Now that I have switched routers, I expected something like 120 MB/sec, but I'm only getting 5.3 MB/sec. What do I need to change?
There's so little documentation for GLC that it's nearly impossible to get it working. I just tried encoding a file I made using the script provided, and got this output:
Code:
joseph@joseph:~/Desktop$ sh encode.sh lugaru-17880-0.glc
[: 103: lugaru-17880-0.glc: unexpected operator
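That "[: ... unexpected operator" message is typically dash (which /bin/sh points to on Ubuntu) rejecting bash-only test syntax; what encode.sh actually contains is a guess here, but a hedged illustration of the usual cause and workarounds:
Code:
[ "$ext" == "glc" ] && echo "GLC file"   # bash-only comparison; dash prints "unexpected operator"
[ "$ext" = "glc" ] && echo "GLC file"    # POSIX form that sh/dash accepts
bash encode.sh lugaru-17880-0.glc        # or simply run the script with bash instead of sh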
After several Google searches and forum searches, and after installing restricted repos, encoders, and codecs, ffmpeg still REFUSES to encode H.264. Here is the output of an FLV to be converted:
This is my first post in this forum. After trying out every video compositing program I found for Linux, I am about to settle on KDEnlive. It looks like the best all-round professional program in this category to me.
I only have one problem: I can't encode files with MPEG-4. All the options here are grayed out, and if I hover over one of them for a moment it tells me:
"Unsupported audio codec: libmp3lame". What's funny is that it worked before I reinstalled my system. I also checked, and I do have libmp3lame installed. So why does KDEnlive have problems accessing this library?
I recently installed Ubuntu 10.10 after using 10.04. While using 10.04 I used ffmpeg quite a bit, and just a few days ago I was using ffmpeg to encode video with the h263p codec. Anyway, for some reason ffmpeg no longer seems to want to decode h263p. If I type "ffmpeg -codecs" to get a codec list, I get " EV h263p H.263+ / H.263-1998 / H.263 version 2", when it used to be "DEV" for decode/encode video; now it's just " EV". Even the ffmpeg website here http://ffmpeg.org/general.html#SEC3 says ffmpeg should still be able to decode h263p. Yet it can't! I've tried installing all sorts of codec libraries and I even followed a guide to download and compile ffmpeg myself. Nothing seems to change.
I can't for the life of me find the option to encode my video in H264 in Pitivi. There's just about every other codec imaginable in the list. I'm running Ubuntu 11.04 32-bit and everything is up to date.
We have a PostgreSQL database on our server. When I tried to install our application it threw an error like "org.postgresql.util.PSQLException: ERROR: encoding UTF8 does not match locale en_US Detail: The chosen LC_CTYPE setting requires encoding LATIN1". The locale which is set on our server is
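A hedged sketch of the usual way around that mismatch, assuming PostgreSQL 8.4 or later where createdb accepts --locale (the database name is a placeholder; whether to move to a UTF-8 locale or a LATIN1 database depends on what the application expects):
Code:
# create the database with an encoding that matches a UTF-8 locale
# (template0 is required so the settings are not inherited from template1)
createdb --encoding=UTF8 --locale=en_US.UTF-8 --template=template0 appdb
# or keep the plain en_US locale and use the encoding it implies
createdb --encoding=LATIN1 --locale=en_US --template=template0 appdb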
I have a lot of old video footage around that, I am ashamed to say, I encoded with heavily proprietary codecs like DivX and such during the dark ages (aka. the Windows Times).
Now, I would like to redeem myself from the mistakes of my past by re-encoding everything into open formats. Since those videos are often not of the best quality (poor camera, poor codecs, poor knowledge), I do not want to lose more quality in the process.
So, to avoid any more mistakes in the future, I would be glad if someone could answer some of the following questions: 1. Is it even possible to re-encode those movies into something like x264/VP8/Theora without losing any more quality? 2. What tools should I use for that? Command line is actually preferred. 3. What would be the most desirable format to have? I'm thinking about x264 in Matroska with Ogg audio. Is there anything better suited?
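On question 2, a hedged sketch of the sort of ffmpeg command line people use for this (file names, quality settings and codec choice are assumptions, and strictly speaking any lossy-to-lossy re-encode costs a little quality, so a high-quality CRF is used):
Code:
# re-encode an old DivX AVI to H.264 video and Vorbis audio in a Matroska container
ffmpeg -i old_divx.avi -vcodec libx264 -crf 18 -preset slow -acodec libvorbis -aq 5 new.mkv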
I used K3B to rip some of my DVDs (using the Xvid codec) onto my laptop to take away on holiday with me. This worked fine for 2 of the DVDs, but the 3rd one ended up with the audio out of sync with the video. Fortunately VLC comes to the rescue with the ability to offset the video and audio tracks. Is there a way to re-encode the ripped files to permanently fix the offset?
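One approach I've seen for baking the correction in (a sketch, untested; the half-second offset and file names are placeholders, and it assumes a reasonably recent ffmpeg for the -map 0:v/1:a syntax) is to open the same file twice, delay one input with -itsoffset, and remux video from one and audio from the other without re-encoding:
Code:
# delay the audio by 0.5 s relative to the video and copy both streams unchanged
ffmpeg -i ripped.avi -itsoffset 0.5 -i ripped.avi \
    -map 0:v -map 1:a -vcodec copy -acodec copy fixed.avi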
I was trying to find a shell script that will encode video to an ISO that I can burn to a DVD. It seems I have lost a few of my DVDs, and I had backed them up into MKV, so I need to convert them to PAL so they will work until I find them. My searches kept leading to other formats, but I wanted to use PAL because it is smaller. My search led me here, but this shell script is not in PAL. I need a solution that would work with Slackware.
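A hedged sketch of the usual three-step route (file and directory names are placeholders; it assumes ffmpeg, dvdauthor and mkisofs are installed):
Code:
# 1. transcode the MKV to a PAL DVD-compliant MPEG-2 stream
ffmpeg -i movie.mkv -target pal-dvd movie.mpg
# 2. author a minimal DVD filesystem from it
dvdauthor -o dvd/ -t movie.mpg
dvdauthor -o dvd/ -T
# 3. wrap the authored structure in an ISO image ready for burning
mkisofs -dvd-video -o movie.iso dvd/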
I'm trying to create a script that will trawl through a directory structure looking for VOB files, convert them to AVI and place them in a ./converted/ sub-directory of the original source folder. I've knocked up the script below, which appears to work fine when I place an "echo" in front of the actual ffmpeg encode line: I get the output on the screen with ffmpeg finding all the VOB files in the directories and looking to convert them to the correct sub-directory. Yet when I remove the echo and run the script for real, ffmpeg converts the first file and then stops. No errors. Is this something to do with ffmpeg, or a problem with my script?
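Since the script itself isn't shown here, this is only a guess, but the classic cause of "converts the first file and then stops" is ffmpeg reading from the same stdin that a `while read` loop is consuming, so it swallows the rest of the file list; a sketch of the usual fix:
Code:
# point ffmpeg's stdin at /dev/null (or use -nostdin on newer builds)
find . -name '*.VOB' | while read -r vob; do
    outdir="$(dirname "$vob")/converted"
    mkdir -p "$outdir"
    ffmpeg -i "$vob" "$outdir/$(basename "$vob" .VOB).avi" < /dev/null
done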
I updated from Fedora 9 to Fedora 13 several months ago and noticed that the AAC audio encoder was no longer available in Avidemux (I use version 2.5.3.3 for Fedora 13 from the RPM Fusion repo). Unfortunately AAC is mandatory on the iPod (and probably other Apple devices) to have the sound working. I had the faad2-libs (decoder) and faac (encoder) packages installed, but Avidemux was not using them. After some searching on the web, I found that faac support was removed because of legal issues. Some people recommend rebuilding Avidemux from source, but that was a little bit complicated, and I found an easier solution.
In fact the only thing needed to add AAC support to Avidemux is to add the required libADM_ae_faac.so library to /usr/lib/ADM_plugins/audioEncoders, as it is no longer supplied in the avidemux-plugins package. Searching for libADM_ae_faac.so on the web, I found some packages containing this file, such as ones from other distros, and the package for the same Avidemux version as the one I use but for Fedora 11: avidemux-plugins-2.5.3-3.fc11.i586.rpm.
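For anyone repeating this, a hedged sketch of pulling just that library out of the Fedora 11 rpm without installing the whole package (the path inside the archive is an assumption based on where the plugin is expected to live):
Code:
# unpack the rpm's payload into the current directory
rpm2cpio avidemux-plugins-2.5.3-3.fc11.i586.rpm | cpio -idmv
# copy only the faac audio-encoder plugin into place
cp ./usr/lib/ADM_plugins/audioEncoders/libADM_ae_faac.so /usr/lib/ADM_plugins/audioEncoders/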