Count of 3 length strings using given characters containing at least 2 different characters
Last Updated: 12 Nov, 2021
Given three integers a, b and c denoting the frequencies of three different characters ‘A‘, ‘B‘ and ‘C‘ respectively, which can be used to form strings of length 3. The task is to count the total number of strings of length 3 that can be formed such that each string contains at least 2 different characters.
Example:
Input: a = 2, b = 3, c = 3
Output: 2
Explanation: Possible strings which satisfy the given conditions are: {“ABC”, “ABC”}
Input: a = 5, b = 4, c = 3
Output: 4
Explanation: (5 + 4 + 3) / 3 = 4 strings can be formed, since the sum of the two smallest frequencies (4 + 3 = 7) is at least 4.
Approach: The total number of strings of length 3 that can be formed from the given frequencies is (a + b + c) / 3 (integer division), if any character may be placed in any position. But since only strings with at least 2 different characters are required, it must be checked whether that is possible. To check:
- Assume that all (a + b + c) / 3 strings are filled in their first two places with any characters, so every string has one remaining space left to be filled.
- Up to this point, every string can still be made valid, because:
- If the string has two different characters, it is already valid.
- If the string has two identical characters, it can be made valid by inserting a different character in the remaining space.
- So the total number of distinct characters needed is count = (a + b + c) / 3, assuming one extra character is required for each string.
- The character that differs can always be drawn from the two least frequent characters. So if the sum of the two smallest frequencies is at least count, then (a + b + c) / 3 strings can be formed. Otherwise, the answer is the sum of the two smallest frequencies.
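The check above boils down to one formula: the answer is the minimum of (a + b + c) / 3 and the sum of the two smallest frequencies. A minimal sketch applying it to both sample inputs (the function name count_strings is just for illustration):

```python
# Sketch of the formula: answer = min((a + b + c) // 3, sum of two smallest frequencies)
def count_strings(a, b, c):
    count = (a + b + c) // 3                  # upper bound on strings of length 3
    smallest_two = sum(sorted([a, b, c])[:2])  # supply of "different" characters
    return min(count, smallest_two)

print(count_strings(2, 3, 3))  # 2 (first example)
print(count_strings(5, 4, 3))  # 4 (second example)
```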
Below is the implementation of the above approach:
C++
// C++ program to count strings of length 3
// having at least 2 different characters
#include <bits/stdc++.h>
using namespace std;

int countStrings(int a, int b, int c)
{
    int arr[3];
    arr[0] = a;
    arr[1] = b;
    arr[2] = c;

    // Maximum number of strings of length 3
    int count = (arr[0] + arr[1] + arr[2]) / 3;
    sort(arr, arr + 3);

    // Answer is limited by the sum of
    // the two smallest frequencies
    if (arr[0] + arr[1] < count) {
        count = arr[0] + arr[1];
    }
    return count;
}

// Driver code
int main()
{
    int a = 5, b = 4, c = 3;
    cout << countStrings(a, b, c);
    return 0;
}
Java
// Java program to count strings of length 3
// having at least 2 different characters
import java.util.*;

class GFG {

    public static int countStrings(int a, int b, int c)
    {
        int[] arr = new int[3];
        arr[0] = a;
        arr[1] = b;
        arr[2] = c;

        // Maximum number of strings of length 3
        int count = (arr[0] + arr[1] + arr[2]) / 3;
        Arrays.sort(arr);

        // Answer is limited by the sum of
        // the two smallest frequencies
        if (arr[0] + arr[1] < count) {
            count = arr[0] + arr[1];
        }
        return count;
    }

    // Driver code
    public static void main(String[] args)
    {
        int a = 5, b = 4, c = 3;
        System.out.println(countStrings(a, b, c));
    }
}
Python3
# Python3 program to count strings of length 3
# having at least 2 different characters

def countStrings(a, b, c):
    arr = [a, b, c]

    # Maximum number of strings of length 3
    count = (arr[0] + arr[1] + arr[2]) // 3
    arr.sort()

    # Answer is limited by the sum of
    # the two smallest frequencies
    if arr[0] + arr[1] < count:
        count = arr[0] + arr[1]
    return count

# Driver code
if __name__ == "__main__":
    a, b, c = 5, 4, 3
    print(countStrings(a, b, c))
C#
// C# program to count strings of length 3
// having at least 2 different characters
using System;

public class GFG {

    public static int countStrings(int a, int b, int c)
    {
        int[] arr = new int[3];
        arr[0] = a;
        arr[1] = b;
        arr[2] = c;

        // Maximum number of strings of length 3
        int count = (arr[0] + arr[1] + arr[2]) / 3;
        Array.Sort(arr);

        // Answer is limited by the sum of
        // the two smallest frequencies
        if (arr[0] + arr[1] < count) {
            count = arr[0] + arr[1];
        }
        return count;
    }

    // Driver code
    static public void Main()
    {
        int a = 5, b = 4, c = 3;
        Console.Write(countStrings(a, b, c));
    }
}
Javascript
<script>
// Javascript program to count strings of length 3
// having at least 2 different characters
function countStrings(a, b, c)
{
    var arr = [a, b, c];

    // Maximum number of strings of length 3
    var count = Math.floor((arr[0] + arr[1] + arr[2]) / 3);
    arr.sort((x, y) => x - y);

    // Answer is limited by the sum of
    // the two smallest frequencies
    if (arr[0] + arr[1] < count) {
        count = arr[0] + arr[1];
    }
    return count;
}

// Driver code
var a = 5, b = 4, c = 3;
document.write(countStrings(a, b, c));
</script>
Time Complexity: O(1)
Auxiliary Space: O(1)