Program to Convert Unicode to ASCII
Given a Unicode character, the task is to convert it into its ASCII (American Standard Code for Information Interchange) value.
ASCII number
ASCII is a character encoding standard used in communication systems and computers. It uses 7-bit encoding to represent 128 different characters (0-127). These values include numbers, letters, punctuation, and some control characters. ASCII primarily focuses on representing English text.
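As a quick, minimal sketch (the helper name is_ascii below is ours, not a standard function), a character belongs to the ASCII range exactly when its code point is below 128:
Python3
# A character is ASCII if its code point is below 128
def is_ascii(ch):
    return ord(ch) < 128

print(is_ascii('A'))   # True  (code point 65)
print(is_ascii('ñ'))   # False (code point 241, outside 0-127)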
Unicode number
Unicode is a character encoding standard widely used in communication systems and computers. Unlike ASCII, which uses 7-bit encoding and covers 128 characters (0-127), Unicode uses variable-length encodings to represent characters from numerous scripts and languages. Unicode can represent over a million distinct characters, making it a comprehensive character encoding standard.
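For instance, the first 128 Unicode code points coincide exactly with ASCII, while characters from other scripts map to far larger values. A minimal illustration using Python's built-in ord():
Python3
# The first 128 Unicode code points are identical to ASCII
print(ord('A'))    # 65 (same as the ASCII value)

# Characters outside the ASCII range have larger code points
print(ord('€'))    # 8364 (Euro sign, U+20AC)
print(ord('अ'))    # 2309 (Devanagari letter A, U+0905)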
Examples:
Input: Unicode = ‘A’
Output: ASCII = 65
Input: Unicode = ‘Z’
Output: ASCII = 90
Approach: Follow the steps below to convert a Unicode character to its ASCII value:
- Declare a string variable unicodeInput and initialize it with the Unicode character “A”.
- Call the unicodeToAscii function with unicodeInput as an argument.
- Define the unicodeToAscii function that takes a string unicodeNum as a parameter.
- In the try block:
  - Extract the first character from the unicodeNum string.
  - Convert the Unicode character to ASCII by casting it to an integer.
  - Return the ASCII value.
- In the catch block:
  - Handle StringIndexOutOfBoundsException by printing an error message with the exception’s message.
  - Return -1 to indicate an error.
- In the main function:
  - Check if the result from unicodeToAscii is not equal to -1.
  - If true, print the Unicode input and the corresponding ASCII output.
- The program outputs the Unicode character “A” and its corresponding ASCII value.
Following is the code to get the ASCII value of a given character.
C++
#include <iostream>
#include <stdexcept>
#include <string>

class UnicodeToAsciiCpp {
public:
    static int UnicodeToAscii(const std::string& unicodeNum) {
        try {
            // at(0) throws std::out_of_range for an empty string,
            // unlike operator[], which has undefined behavior here
            char unicodeChar = unicodeNum.at(0);
            int asciiNum = static_cast<int>(unicodeChar);
            return asciiNum;
        } catch (const std::out_of_range& e) {
            std::cerr << "Error: " << e.what() << std::endl;
            return -1;
        }
    }
};

int main() {
    std::string unicodeInput = "A";
    int asciiOutputCpp = UnicodeToAsciiCpp::UnicodeToAscii(unicodeInput);
    if (asciiOutputCpp != -1) {
        std::cout << "Unicode: " << unicodeInput << std::endl;
        std::cout << "ASCII: " << asciiOutputCpp << std::endl;
    }
    return 0;
}
Java
public class UnicodeToAsciiJava {
    public static int unicodeToAscii(String unicodeNum)
    {
        try {
            char unicodeChar = unicodeNum.charAt(0);
            int asciiNum = (int) unicodeChar;
            return asciiNum;
        }
        catch (StringIndexOutOfBoundsException e) {
            System.out.println("Error: " + e.getMessage());
            return -1;
        }
    }

    public static void main(String[] args)
    {
        String unicodeInput = "A";
        int asciiOutputJava = unicodeToAscii(unicodeInput);
        if (asciiOutputJava != -1) {
            System.out.println("Unicode: " + unicodeInput);
            System.out.println("ASCII: " + asciiOutputJava);
        }
    }
}
Python3
def unicode_to_ascii_py(unicode_num):
    try:
        # ord() raises TypeError unless given exactly one character
        ascii_num = ord(unicode_num)
        return ascii_num
    except TypeError as e:
        print(f"Error: {e}")
        return None


unicode_input = 'A'
ascii_output_py = unicode_to_ascii_py(unicode_input)
if ascii_output_py is not None:
    print(f"Unicode: {unicode_input}")
    print(f"ASCII: {ascii_output_py}")
C#
using System;

public class UnicodeToAsciiCSharp
{
    public static int UnicodeToAscii(string unicodeNum)
    {
        try
        {
            // The string indexer throws IndexOutOfRangeException for an empty string
            char unicodeChar = unicodeNum[0];
            int asciiNum = (int)unicodeChar;
            return asciiNum;
        }
        catch (IndexOutOfRangeException e)
        {
            Console.WriteLine("Error: " + e.Message);
            return -1;
        }
    }

    public static void Main()
    {
        string unicodeInput = "A";
        int asciiOutputCSharp = UnicodeToAscii(unicodeInput);
        if (asciiOutputCSharp != -1)
        {
            Console.WriteLine("Unicode: " + unicodeInput);
            Console.WriteLine("ASCII: " + asciiOutputCSharp);
        }
    }
}
Javascript
function unicodeToAsciiJs(unicodeNum) {
    try {
        const asciiNum = unicodeNum.charCodeAt(0);
        // charCodeAt(0) returns NaN (rather than throwing) for an empty string
        if (Number.isNaN(asciiNum)) {
            console.error("Error: input string is empty");
            return null;
        }
        return asciiNum;
    } catch (error) {
        console.error("Error:", error.message);
        return null;
    }
}

const unicodeInput = 'A';
const asciiOutputJs = unicodeToAsciiJs(unicodeInput);
if (asciiOutputJs !== null) {
    console.log(`Unicode: ${unicodeInput}`);
    console.log(`ASCII: ${asciiOutputJs}`);
}
Output
Unicode: A
ASCII: 65
Time Complexity: O(1)
Auxiliary Space: O(1)