Convert Binary to Decimal in Swift

Last Updated : 24 Oct, 2023

In computers, we use two main number systems: binary (with 0s and 1s) and decimal (with numbers from 0 to 9). Converting binary numbers (like 1010) into decimal numbers (like 10) is a common task in programming. In this article, we’ll learn how to do this in Swift, a programming language developed by Apple.

Problem Statement

The problem is to convert a binary number represented as a string to its decimal equivalent in Swift.

Examples:

Example 1:
Binary Number: “1010”
Decimal Equivalent: 10

Example 2:
Binary Number: “11011”
Decimal Equivalent: 27
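
Each binary digit contributes its place value, a power of 2 read from right to left. For instance, "11011" expands to 1×16 + 1×8 + 0×4 + 1×2 + 1×1 = 27.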

Approach 1: Using Swift's Built-in Int(_:radix:) Initializer

In this approach, we let Swift's built-in Int(_:radix:) initializer do the conversion for us.

Swift

// Author: Nikunj Sonigara
 
// Int(_:radix:) parses the string as a base-2 number.
// It returns an optional, which we force-unwrap here for brevity.
let binaryString = "1010"
let decimalNumber = Int(binaryString, radix: 2)!
print("binaryString: ", binaryString)   // binaryString: 1010
print("decimalNumber: ", decimalNumber) // decimalNumber: 10


Output:

binaryString:  1010
decimalNumber:  10

Here, we define our binary number as a string called binaryString. We then use the Int(_:radix:) initializer with radix set to 2 to tell Swift to interpret the string as a binary number. The initializer returns an optional (nil if the string is not valid binary), which we force-unwrap and store in decimalNumber.
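
The force-unwrap (!) above will crash at runtime if the string contains anything other than 0s and 1s. A minimal sketch of safer handling, assuming we want to report invalid input instead of crashing (the variable names here are illustrative):

Swift

let input = "10210" // contains a '2', so it is not a valid binary string
if let value = Int(input, radix: 2) {
    print("decimalNumber: ", value)
} else {
    print("\(input) is not a valid binary number")
}
// Prints: 10210 is not a valid binary number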

Complexity Analysis:

Time Complexity: O(n), where n is the number of binary digits in the string.

Auxiliary Space: O(1)

Approach 2: Implementing the Conversion Algorithm Manually

In this method, we write our own code to convert binary to decimal.

Swift

// Author: Nikunj Sonigara
func binaryToDecimal(binary: String) -> Int {
    var decimal = 0
    var base = 1 // place value of the current digit (1, 2, 4, 8, ...)

    // Walk the digits from least significant to most significant.
    for digit in binary.reversed() {
        if digit == "1" {
            decimal += base
        }
        base *= 2
    }

    return decimal
}

let binaryString = "1010"
let decimalNumber = binaryToDecimal(binary: binaryString)
print("binaryString: ", binaryString)   // binaryString: 1010
print("decimalNumber: ", decimalNumber) // decimalNumber: 10


Output:

binaryString:  1010
decimalNumber:  10

Here, we create a function called binaryToDecimal that takes the binary string as input and computes its decimal value. We walk through the digits from right to left; whenever a digit is 1, we add the current place value (base) to the result, and we double base after each digit. Finally, we return the decimal value.

Complexity Analysis

Time Complexity: O(n), where n is the number of binary digits in the string.

Auxiliary Space: O(1)
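
For reference, the same algorithm can also be written more compactly with Swift's reduce. This is just an alternative sketch of Approach 2, not a separate method; the function name binaryToDecimalReduce is illustrative.

Swift

// Same algorithm as Approach 2, folded left to right:
// double the accumulator for each digit and add 1 when the digit is "1".
func binaryToDecimalReduce(binary: String) -> Int {
    return binary.reduce(0) { result, digit in
        result * 2 + (digit == "1" ? 1 : 0)
    }
}

print(binaryToDecimalReduce(binary: "11011")) // 27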


