Unset least significant K bits of a given number
Given an integer N and a zero-based bit position K, the task is to print the number obtained by unsetting all bits of N from position 0 through position K (counted from the right).
Examples:
Input: N = 200, K=5
Output: 192
Explanation:
(200)₁₀ = (11001000)₂
Clearing bits 0 through K (= 5) of the above binary representation, the new number obtained is (11000000)₂ = (192)₁₀.
Input: N = 730, K = 3
Output: 720
Explanation:
(730)₁₀ = (1011011010)₂
Clearing bits 0 through K (= 3) gives (1011010000)₂ = (720)₁₀.
Approach: Follow the steps below to solve the problem:
- The idea is to create a mask of the form 111…1000…0, i.e. a number whose K + 1 lowest bits are 0 and all higher bits are 1.
- To create the mask, start from a number with all bits set to 1. There are two equivalent ways to obtain it: either complement zero (~0) or use -1, which is all ones in two's complement representation. Then left-shift it by K + 1 bits:
mask = ((~0) << (K + 1)) or
mask = ((-1) << (K + 1))
- The shift amount is K + 1 rather than K because bit positions are zero-based from right to left, so clearing bits 0 through K clears K + 1 bits in total.
- Finally, print N & mask.
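The mask construction described above can be sanity-checked with a short standalone Python sketch (separate from the implementations below; the variable names are illustrative):

```python
K = 3

# Both constructions produce the same mask: all ones,
# then the K + 1 lowest bits replaced with zeros.
mask_a = (~0) << (K + 1)
mask_b = (-1) << (K + 1)
assert mask_a == mask_b

# bin() of a negative Python int hides the sign bits, so
# view the low 10 bits of the mask explicitly: 1111110000
print(format(mask_a & 0b1111111111, "010b"))

# ANDing with the mask reproduces both examples from above
assert 200 & ((~0) << (5 + 1)) == 192
assert 730 & ((~0) << (3 + 1)) == 720
```

Note that Python integers are arbitrary-precision, so `(~0) << (K + 1)` stays a (negative) number with infinitely many leading ones conceptually, and the `&` still behaves as expected.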
Below is the implementation of the above approach:
C++
#include <bits/stdc++.h>
using namespace std;

// Clears bits 0 through K of N
int clearLastBit(int N, int K)
{
    // Mask of the form 111...1000...0
    int mask = (-1 << (K + 1));
    return N & mask;
}

int main()
{
    int N = 730, K = 3;
    cout << clearLastBit(N, K);
    return 0;
}
Java
class GFG {

    // Clears bits 0 through K of N
    static int clearLastBit(int N, int K)
    {
        // Mask of the form 111...1000...0
        int mask = (-1 << (K + 1));
        return N & mask;
    }

    public static void main(String[] args)
    {
        int N = 730, K = 3;
        System.out.print(clearLastBit(N, K));
    }
}
Python3
# Clears bits 0 through K of N
def clearLastBit(N, K):

    # Mask of the form 111...1000...0
    mask = (-1 << (K + 1))
    return N & mask

N = 730
K = 3
print(clearLastBit(N, K))
C#
using System;

class GFG {

    // Clears bits 0 through K of N
    static int clearLastBit(int N, int K)
    {
        // Mask of the form 111...1000...0
        int mask = (-1 << (K + 1));
        return N & mask;
    }

    public static void Main(string[] args)
    {
        int N = 730, K = 3;
        Console.Write(clearLastBit(N, K));
    }
}
Javascript
<script>
// Clears bits 0 through K of N
function clearLastBit(N, K)
{
    // Mask of the form 111...1000...0
    var mask = (-1 << (K + 1));
    return N & mask;
}

var N = 730, K = 3;
document.write(clearLastBit(N, K));
</script>
Time Complexity: O(1)
Auxiliary Space: O(1)
Last Updated: 08 Apr, 2021