Counting Bits

Problem statement

Given an integer n, return an array ans of length n + 1 such that for each i (0 <= i <= n), ans[i] is the number of 1's in the binary representation of i.

Example 1:

Input: n = 2
Output: [0,1,1]
Explanation:
0 --> 0
1 --> 1
2 --> 10

Example 2:

Input: n = 5
Output: [0,1,1,2,1,2]
Explanation:
0 --> 0
1 --> 1
2 --> 10
3 --> 11
4 --> 100
5 --> 101

Constraints:

  • 0 <= n <= 10^5

Follow up:

  • It is very easy to come up with a solution with a runtime of O(n log n). Can you do it in linear time O(n) and possibly in a single pass?
  • Can you do it without using any built-in function (e.g., __builtin_popcount in C++)?
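For the second follow-up, one well-known way to count set bits without any built-in is Brian Kernighan's trick: x & (x - 1) clears the lowest set bit, so the loop runs once per 1-bit. A minimal sketch (the helper name popcount is just for illustration):

```javascript
// Count the 1-bits of a non-negative integer by repeatedly
// clearing the lowest set bit; loops once per set bit.
function popcount(x) {
  let count = 0;
  while (x !== 0) {
    x &= x - 1; // drop the lowest set bit
    count++;
  }
  return count;
}
```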

My solution

/**
 * @param {number} n
 * @return {number[]}
 */
var countBits = function(n) {
    // Build ans[0..n]: convert each index to its base-2 string
    // and count the "1" characters.
    const arr = Array.from({ length: n + 1 }, (_, index) => {
        const ones = index.toString(2).split("").filter(v => v === "1");
        return ones.length;
    });

    return arr;
};
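The solution above runs in O(n log n), since converting each number to a binary string takes O(log n). The linear-time follow-up can be sketched with a simple DP observation: i has the same bits as i >> 1, plus its own lowest bit (i & 1). A hedged sketch (countBitsDP is just an illustrative name):

```javascript
// O(n) DP sketch for the follow-up:
// ans[i] = ans[i >> 1] + (i & 1), since shifting right drops
// only the lowest bit, which (i & 1) adds back to the count.
function countBitsDP(n) {
  const ans = new Array(n + 1).fill(0);
  for (let i = 1; i <= n; i++) {
    ans[i] = ans[i >> 1] + (i & 1);
  }
  return ans;
}
```

This is a single pass over 0..n and uses no built-in bit-counting function, answering both follow-up questions at once.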