Swift – Literals

Literals are the actual values of integers, floating-point numbers, strings, and so on. In other words, a literal is the value assigned to a variable or constant, and it can be used directly in a program without any computation. By default, a literal in Swift has no type of its own. Primitive type variables can be assigned with literals. For example, 12 is an integer literal, 3.123 is a floating-point literal, “GeeksforGeeks” is a string literal, and true is a Boolean literal. When we specify a type annotation for a literal, the annotated type must be one that can be inferred from the literal value, that is, a type conforming to one of these Swift standard library protocols:

  • ExpressibleByIntegerLiteral: This is used for integer literals.
  • ExpressibleByFloatLiteral: This is used for floating-point literals.
  • ExpressibleByStringLiteral: This is used for string literals.
  • ExpressibleByBooleanLiteral: This is used for Boolean literals.

For example, Int32 conforms to the ExpressibleByIntegerLiteral protocol, hence we can use it as a type annotation for the literal value 25:

 let x: Int32 = 25
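Our own types can adopt these protocols too. The `Celsius` struct below is a hypothetical example, not a standard library type; conforming to ExpressibleByIntegerLiteral lets an integer literal initialize it directly:

```swift
// A hypothetical wrapper type (not part of the standard library)
// that can be initialized directly from an integer literal.
struct Celsius: ExpressibleByIntegerLiteral {
    let degrees: Int

    // Required by ExpressibleByIntegerLiteral
    init(integerLiteral value: Int) {
        self.degrees = value
    }
}

// Because Celsius conforms to the protocol, an integer
// literal can appear wherever a Celsius value is expected.
let temperature: Celsius = 25
print(temperature.degrees) // 25
```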

Swift mainly supports four types of literals, as explained below.

Integer literal

Integer literals represent integer values. If no alternate base is specified, the default base is 10, that is, decimal. We can also specify a negative integer literal by using the minus (-) operator. For example:

-25 // literal

These literals can also contain underscores (_) between their digits for better readability, but the underscores are ignored and have no effect on the overall value of the literal.

For example:

// Both integer literals are the same

2_5 // literal 1
25  // literal 2

We can even use leading zeros with an integer literal, but again they have no effect on the overall value of the literal. For example,

// Both integer literals are the same

025 // literal 1
25  // literal 2
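A small runnable sketch of both points, showing that underscores and leading zeros are ignored (the constant names are illustrative):

```swift
// Underscores improve readability and are ignored by the compiler
let million = 1_000_000
print(million)   // 1000000

// Leading zeros are likewise ignored
let padded = 025
print(padded)    // 25
```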

Integer literals are numeric values and can be written in one of these forms:

  • Binary constant: A binary constant can contain only the digits 0 and 1.
  • Decimal constant: A decimal constant can contain digits in the range 0 to 9.
  • Octal constant: An octal constant can contain digits in the range 0 to 7.
  • Hexadecimal constant: A hexadecimal constant can contain digits 0 to 9 and letters A to F (or a to f).

By default, an integer literal has no type of its own. For example,

let myVariable: Int32 = 10

Here Swift uses the explicit type annotation to infer that the literal 10 is of type Int32. When no explicit annotation is used, the Swift compiler infers one of the default literal types pre-defined in the Swift standard library; the default type for integer literals is Int. Except for decimal literals, every integer literal starts with a prefix, as specified below:

  • binary: 0b
  • decimal: no prefix
  • octal: 0o
  • hexadecimal: 0x

These prefixes are recognized by the compiler and have a special meaning. Below is a Swift program using these literals.

Example:

In the program below, the number 10 is represented as 10 in decimal, 0b1010 in binary, 0o12 in octal, and 0xA in hexadecimal.

Swift
// Swift program to demonstrate the working of literals
  
// Initializing variables
// 10 in decimal notation
let decimalNumber = 10    
  
// 10 in binary notation
let binaryNumber = 0b1010   
  
// 10 in octal notation
let octalNumber = 0o12 
  
// 10 in hexadecimal notation
let hexadecimalNumber = 0xA   
  
// Print these numbers
print("Decimal Number:", decimalNumber)
print("Binary Number:", binaryNumber)
print("Octal Number:", octalNumber)
print("Hexadecimal Number:", hexadecimalNumber)


Output:

Decimal Number: 10
Binary Number: 10
Octal Number: 10
Hexadecimal Number: 10

Floating-point Literals

Floating-point literals contain a decimal point or an exponent in their representation. A decimal floating-point literal is a sequence of decimal digits followed by a decimal fraction, a decimal exponent, or both. Floating-point literals have no type of their own. If no alternate base is specified, the default base is 10, that is, decimal. We can also specify negative floating-point literals by using the minus (-) operator. For example,

-1.123 // literal

Like an integer literal, we can use underscores between the digits of a floating-point literal. These underscores are ignored and have no effect on the overall value of the floating-point literal. For example,

// Both floating-point literals are the same

0.2_5 // literal 1
0.25  // literal 2

We can provide an explicit type annotation to infer that a floating-point literal is of type Float. For example,

let myVariable: Float = 10.123 // Explicitly specifying that myVariable is of type Float

When we do not use an explicit type annotation, the Swift compiler infers that the type of the literal is one of the default literal types pre-defined in the Swift standard library. For example,

let myVariable = 10.123 // Swift compiler internally infers that myVariable is of type Double
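A small sketch using `type(of:)` makes the difference visible (the constant names are illustrative):

```swift
// With no annotation, a floating-point literal defaults to Double
let inferred = 10.123
print(type(of: inferred))   // Double

// With an explicit annotation, the same literal becomes a Float
let explicit: Float = 10.123
print(type(of: explicit))   // Float
```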

So the default type for floating-point literals is Double. Floating-point literals can be classified in two ways:

1. Decimal floating-point literal: It represents a sequence of decimal digits followed by a decimal fraction, a decimal exponent, or both. It can include an optional exponent introduced by an uppercase (E) or lowercase (e). If a decimal floating-point literal has x as an exponent, the base value is multiplied by 10^x. For example,

// Both are the same value
0.25e1 
2.5 

Example: In the program below, decimalFloatingNumber2 = 0.123e2 = 0.123 × 10^2 = 12.3.

Swift
// Swift program to demonstrate the working of
// decimal floating-point literal
  
// Initializing variables
let decimalFloatingNumber1 = 0.12  
let decimalFloatingNumber2 = 0.123e2  
  
// Display the result
print("decimalFloatingNumber1: \(decimalFloatingNumber1)")  
print("decimalFloatingNumber2: \(decimalFloatingNumber2)")


Output:

decimalFloatingNumber1: 0.12
decimalFloatingNumber2: 12.3

2. Hexadecimal floating-point literal: A hexadecimal floating-point literal starts with the 0x prefix and includes an exponent, introduced by an uppercase (P) or lowercase (p). If x is the exponent, the significand is multiplied by 2^x. For example,

// Both are the same value
0x1p-1 
0.5

Example: In the program below, hexadecimalFloatingNumber1 = 0xFp1 = 15 × 2^1 = 30, and hexadecimalFloatingNumber2 = 0xFp-1 = 15 × 2^-1 = 7.5, since F in hexadecimal equals 15.

Swift
// Swift program to demonstrate the working of 
// Hexadecimal floating-point literal
  
// Initializing variables
let hexadecimalFloatingNumber1 = 0xFp1 
let hexadecimalFloatingNumber2 = 0xFp-1  
    
// Print variables
print("hexadecimalFloatingNumber1: \(hexadecimalFloatingNumber1)")  
print("hexadecimalFloatingNumber2: \(hexadecimalFloatingNumber2)")


Output:

hexadecimalFloatingNumber1: 30.0
hexadecimalFloatingNumber2: 7.5

String Literals

A single-line string literal is a sequence of characters surrounded by double quotation marks. For example,

 "GeeksforGeeks" // A string literal

Example:

Swift
// Swift program to demonstrate the 
// working of string literals
  
// Initializing strings
let myString1 = "GeeksforGeeks"
let myString2 = "Geeks" 
  
// Print strings
print(myString1)
print(myString2)


Output:

GeeksforGeeks
Geeks

String literals have no type of their own; if no type is specified, the default type is String. Swift uses an explicit type annotation to infer that the literal “GeeksforGeeks” is of type String. When no explicit annotation is used, the Swift compiler infers one of the default literal types pre-defined in the Swift standard library. For example,

let myVariable = "GeeksforGeeks"           // Default type is String
let myVariable: String = "GeeksforGeeks"   // Explicitly specifying that the 
                                           // "GeeksforGeeks" is of type String

Escape sequences are special sequences of characters. When used inside a string, instead of representing themselves they are translated into another character, or sequence of characters, that is difficult to represent directly. A string literal can’t contain an unescaped double quotation mark (“), an unescaped backslash (\), a carriage return, or a line feed.

Swift also supports multi-line string literals. A multi-line string literal is surrounded by three double quotation marks, with the string’s lines sitting between the opening and closing triple quotes. You can use a backslash (\) at the end of a line to join it with the next line inside a multi-line string literal. For example,

let myVariable: String = """
                            GeeksforGeeks \
                            GFG
                            
                         """

We can include special characters in string literals using the following escape sequences.

Escape sequence   Significance
\0                Null character
\\                Backslash
\t                Horizontal tab
\n                Newline (line feed)
\r                Carriage return
\'                Single quotation mark
\"                Double quotation mark
\u{n}             Unicode scalar, where n is a hexadecimal number with one to eight digits
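A short program exercising a few of these escape sequences (the constant names are illustrative):

```swift
// Escape sequences inside string literals
let quoted  = "She said \"Hello\""   // escaped double quotes
let tabbed  = "A\tB"                 // horizontal tab
let twoLine = "line1\nline2"         // newline
let heart   = "\u{2764}"             // Unicode scalar U+2764

print(quoted)   // She said "Hello"
print(tabbed)
print(twoLine)
print(heart)
```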

We can insert the actual value of an expression into a string by writing a backslash immediately before the expression, which must be surrounded by parentheses. The expression can contain another string literal, but it can’t contain an unescaped backslash, a carriage return, or a line feed.

For example:

// All the following string literals are the same

"5 6 7"
"5 6 \("7")"
"5 6 \(7)"
"5 6 \(3 + 4)"

let x = 7; 
"5 6 \(x)"

A string surrounded by double quotes and a balanced set of number signs (#) is known as a “string delimited by extended delimiters”. Such strings have the following form:

// Delimited string 1
#"GeeksforGeeks"#

// Delimited string 2
#"""
GeeksforGeeks
"""#

We can make special characters appear as normal characters in the final output by converting them into a delimited string.

Example:

Swift
// Swift program to illustrate the working of delimited strings
  
// Initializing a constant variable
let x = 10
  
// Initializing another variable using a delimited string
let delimitedString = #"\(x)"#
  
// Initializing another variable using a 
// non-delimited string literal
let normalString = "\\(x)"
  
print("delimitedString:", delimitedString)
print("normalString:", normalString)
  
print("Are delimitedString and normalString equal?",
      delimitedString == normalString)
// Prints "true"


Output:

delimitedString: \(x)
normalString: \(x)
Are delimitedString and normalString equal? true

When more than one extended delimiter is used, no whitespace may appear between the delimiters. That is,

let delimitedString = ##"\(x)"# #   //  Wrong
let delimitedString = ##"\(x)"##    //  Right

In Swift, we can concatenate two string literals, and the concatenation occurs at compile time.

Syntax:

string1 = "Bhuwanesh"  // String 1
string2 = "Nainwal"    // String 2

string1 + string2   // "Bhuwanesh Nainwal"

Note that the order of strings during concatenation is important.
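A minimal sketch showing that operand order matters:

```swift
let first = "Geeks"
let second = "forGeeks"

// The operand order determines the result
print(first + second) // GeeksforGeeks
print(second + first) // forGeeksGeeks
```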

Example:

Swift
// Swift program to concatenate two strings
  
// Initializing a string
let string1 = "GeekforGeeks is one of the "
  
// Initializing another string
// by directly concatenating two strings
let string2 = "best learning " + "platforms."
  
// Concatenate strings
let result = string1 + string2
  
// Print the resulting string
// after concatenation
print("result:", result)


Output:

result: GeeksforGeeks is one of the best learning platforms.

Boolean Literals

A Boolean literal can represent one of the following types of values,

  • true
  • false

A Boolean literal has no type of its own. For example,

let myVariable: Bool = true

We can use an explicit type annotation to infer that the literal true has the Bool type. When no explicit annotation is used, the Swift compiler infers one of the default literal types pre-defined in the Swift standard library. So the default type for Boolean literals is Bool.

Example:

Swift
// Swift program to demonstrate the
// working of boolean literals
  
// Initializing Boolean constants
let booleanNumber1 = true 
let booleanNumber2 = false  
  
// Print the value represented by variables
print("booleanNumber1: \(booleanNumber1)")  
print("booleanNumber2: \(booleanNumber2)")


Output:

booleanNumber1: true
booleanNumber2: false


Last Updated : 03 Jan, 2022