Given an integer N, the task is to find the maximum number of distinct positive integers whose sum is N.
Examples:
Input: N = 5
Output: 2
5 can be represented as 1 + 4, 2 + 3, or 5 itself.
So the maximum number of distinct integers that can be used in a representation is 2.
Input: N = 10
Output: 4
10 can be represented as 1 + 2 + 3 + 4, which uses 4 distinct integers.
Approach: To maximize the count, greedily choose the distinct integers to be as small as possible, i.e. 1, 2, 3, …. If the first x natural numbers are used, their sum is f(x) = x * (x + 1) / 2, so we need the maximum x such that f(x) <= N:
1 + 2 + 3 + … + x <= N
x * (x + 1) / 2 <= N
x^2 + x - 2N <= 0
By the quadratic formula, the positive root of x^2 + x - 2N = 0 is (-1 + sqrt(1 + 8N)) / 2, so the answer is x = floor((-1 + sqrt(1 + 8N)) / 2). Any leftover amount N - f(x) can be added to the largest chosen integer without breaking distinctness, so x integers are always achievable.
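For example, with N = 10: x = floor((-1 + sqrt(1 + 8 * 10)) / 2) = floor((-1 + 9) / 2) = 4, and indeed 1 + 2 + 3 + 4 = 10.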
Below is the implementation of the above approach:
C++
#include <bits/stdc++.h>
using namespace std;

// Returns the maximum number of distinct positive
// integers that can sum to n
int count(int n)
{
    // Largest x such that x * (x + 1) / 2 <= n
    return (int)((-1 + sqrt(1 + 8 * n)) / 2);
}

int main()
{
    int n = 10;
    cout << count(n);
    return 0;
}
Java
class GFG
{
    // Returns the maximum number of distinct positive
    // integers that can sum to n
    static int count(int n)
    {
        // Largest x such that x * (x + 1) / 2 <= n
        return (int)((-1 + Math.sqrt(1 + 8 * n)) / 2);
    }

    public static void main(String[] args)
    {
        int n = 10;
        System.out.println(count(n));
    }
}
Python3
from math import sqrt

# Returns the maximum number of distinct positive
# integers that can sum to n
def count(n):
    # Largest x such that x * (x + 1) / 2 <= n
    return int((-1 + sqrt(1 + 8 * n)) // 2)

if __name__ == "__main__":
    n = 10
    print(count(n))
C#
using System;

class GFG
{
    // Returns the maximum number of distinct positive
    // integers that can sum to n
    public static int count(int n)
    {
        // Largest x such that x * (x + 1) / 2 <= n
        return (int)((-1 + Math.Sqrt(1 + 8 * n)) / 2);
    }

    public static void Main()
    {
        int n = 10;
        Console.Write(count(n));
    }
}
Javascript
<script>
    // Returns the maximum number of distinct positive
    // integers that can sum to n
    function count(n)
    {
        // Largest x such that x * (x + 1) / 2 <= n
        return Math.floor((-1 + Math.sqrt(1 + 8 * n)) / 2);
    }

    var n = 10;
    document.write(count(n));
</script>
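Output:
4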
Time Complexity: O(1)
Auxiliary Space: O(1)
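The closed form relies on a floating-point square root, so as a cross-check the same greedy idea can also be written as a simple loop that keeps taking 1, 2, 3, … while the running total stays within N. A minimal C++ sketch of that approach (the function name countGreedy is our own, not part of the code above):

#include <iostream>
using namespace std;

// Keep taking 1, 2, 3, ... while the triangular sum
// of the next candidate still fits within n
int countGreedy(int n)
{
    int x = 0;
    while ((x + 1) * (x + 2) / 2 <= n)
        x++;
    return x;
}

int main()
{
    cout << countGreedy(10); // prints 4
    return 0;
}

This avoids any floating-point precision concerns, at the cost of O(sqrt(N)) loop iterations instead of O(1).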