A couple of years ago, my friend wanted to learn programming, so I was giving her a hand with resources and reviewing her code. She got to the part on adding code comments, and wrote the now-infamous line,

i = i + 1 #this increments i

We’ve all written superfluous comments, especially as beginners. It’s not even really funny, but for whatever reason we both still remember this specific line years later and laugh about it together.

Years later (this week), to poke fun, I started writing sillier and sillier ways to increment i:

Beginner level:

# this increments i:
x = i 
x = x + int(True)
i = x

Beginner++ level:

# this increments i:
def increment(val):
   for i in range(val+1):
      output = i + 1
   return output

Intermediate level:

# this increments i:
class NumIncrementor:
	def __init__(self, initial_num):
		self.internal_num = initial_num

	def increment_number(self):
		incremented_number = 0
		# we add 1 each iteration for indexing reasons
		for i in list(range(self.internal_num)) + [len(range(self.internal_num))]: 
		incremented_number = i + 1 # fix off-by-one error by incrementing i. I won't use recursion, I won't use recursion, I won't use recursion

		self.internal_num = incremented_number

	def get_incremented_number(self):
		return self.internal_num

i = int(input("Enter a number: "))

incrementor = NumIncrementor(i)
incrementor.increment_number()
i = incrementor.get_incremented_number()

print(i)

Since I’m obviously very bored, I thought I’d hear your take on the “best” way to increment an int in your language of choice - I don’t think my code is quite expert-level enough. Consider it a sort of Advent of Code challenge? Any code which does not contain the comment “this increments i:” will produce a compile error and fail to run.

No AI code pls. That’s no fun.

  • charizardcharz

    Why not wait for a random bit flip to increment it?

    int i = 0;
    while (i != i + 1);
    //i is now incremented
    
    • AceOP

      But if i gets randomly bitflipped, wouldn’t i != i+1 still be true, so the loop just keeps spinning? It would have to get flipped at exactly the right time, assuming the CPU even requests it from memory twice to run that line? It’d probably be cached anyway.

      I was thinking you’d need to store the original values, like x=i and y=i+1 and while x != y etc… but then what if x or y gets bitflipped? Maybe we hash them and keep checking that the hash is correct (rough sketch at the end of this comment). But then the hash itself could get bitflipped…

      Thinking too many layers of redundancy deep makes my head hurt. I’m sure there’s some interesting data integrity computer science in there somewhere…
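
      Something like this is roughly what I had in mind, anyway. Completely untested, the xor “checksum” is just a stand-in for a real hash, and it only terminates if the hardware cooperates:

      #include <stdint.h>
      #include <stdio.h>

      int main(void) {
          /* this increments i: */
          volatile uint32_t i = 0;

          for (;;) {
              volatile uint32_t x = i;      /* copy of the original value  */
              volatile uint32_t y = i + 1;  /* the value we're hoping for  */
              uint32_t checksum = x ^ y;    /* stand-in "hash" of the pair */

              /* wait for a stray bit flip to make the two copies agree... */
              while (x != y) {
                  /* ...but if some *other* bit flips, the xor changes and
                     we throw this pair away and grab fresh copies of i */
                  if ((x ^ y) != checksum)
                      break;
              }

              if (x == y && y == i + 1) {   /* the right bit flipped in x */
                  i = y;
                  break;
              }
              /* wrong bit flipped (or y got flipped down to i): try again */
          }

          printf("%u\n", (unsigned)i);
          return 0;
      }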

      • @[email protected]

        You just wait for the right bit to be flipped, and for any wrong bits that get flipped to be flipped an even number of times

      • charizardcharz

        I didn’t really dig too deep into it. It might be interesting to see what it actually compiles to.

        From what I can remember, the result of i+1 would have to be stored before it can be compared, so it would be possible for i to experience a bit flip after the result of i+1 is stored.
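
        Spelled out with the two reads of i separated (just to illustrate the reasoning - assuming i is volatile so the reads don’t get folded away, not actual compiler output):

        #include <stdio.h>

        int main(void) {
            volatile int i = 0;

            /* this increments i: */
            for (;;) {
                int plus_one = i + 1;  /* first read of i, result stored away      */
                int fresh    = i;      /* second read of i                         */
                if (fresh == plus_one) /* a flip landing between the reads matches */
                    break;
            }

            printf("%d\n", i);  /* i is now incremented (after a very long wait) */
            return 0;
        }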