this post was submitted on 09 May 2026
842 points (99.6% liked)

Programmer Humor

top 47 comments
[–] apftwb@lemmy.world 188 points 3 days ago* (last edited 3 days ago) (5 children)

<>

These are angle brackets

ᐸᐳ

These are Canadian Aboriginal Syllabics

[–] d_k_bo@feddit.org 22 points 2 days ago

Yes, upper case angle brackets.

[–] boonhet@sopuli.xyz 75 points 3 days ago

Most compilers tell you what's up these days, but

;

Greek question mark

;

Semicolon

[–] kivihiili@lemmy.blahaj.zone 30 points 3 days ago* (last edited 3 days ago) (2 children)

mm our font renders them basically the same hehe

screenshot:

[–] Rubanski@discuss.tchncs.de 5 points 2 days ago

That's the classic Japanese font also used in many Nintendo games like Wii sports, right?

[–] Warehouse@piefed.ca 21 points 3 days ago (1 children)

(screenshot)

It looks like this for me. Granted I'm on Piefed so that's probably part of the reason.

[–] Rekorse@sh.itjust.works 3 points 2 days ago

Mine's like yours too.

[–] raspberriesareyummy@lemmy.world 23 points 3 days ago

Corporate needs you to find....

[–] ruuster13@lemmy.zip 6 points 3 days ago

There's room for my mom and your mom in programming.

[–] slazer2au@lemmy.world 155 points 4 days ago (2 children)

Unicode truly is amazing.

Like that fake apple site that uses the Cyrillic A instead of the Latin A.

Or the Greek question mark being a different code to Latin question marks.

[–] gkaklas@lemmy.zip 73 points 3 days ago* (last edited 3 days ago) (2 children)

Greek-Latin question marks

Actually the Greek question mark (;) looks like the Latin semi-colon (;)!

Last time I looked it up, I think I found they were treated as the same character, and I tried compiling C with a Greek question mark instead of a semicolon and it compiled fine! But I'm curious whether it was because of something else, like my computer's keyboard layout, or the compiler simply being able to handle them 🤔

[–] palordrolap@fedia.io 41 points 3 days ago (1 children)

Something somewhere was definitely doing the conversion for you, but it could have been your editor, the compiler or something in between like a C preprocessor directive getting loaded in by your configuration.

[–] 418_im_a_teapot@sh.itjust.works 6 points 3 days ago (1 children)

I'd be pissed if it was my editor. A compiler used on a global scale would make sense.

[–] raspberriesareyummy@lemmy.world 21 points 3 days ago (1 children)

Nah, I would absolutely want my compiler to error out hard on characters that are not allowed per the standard.

[–] mkwt@lemmy.world 3 points 2 days ago* (last edited 2 days ago) (1 children)

In C and C++, the source character set is implementation defined. This means that each compiler sets its own rules about what characters are accepted. For example compilers could choose to accept ASCII or EBCDIC or Unicode, or some combination, etc.

So the ISO standard will say that ; character is the end of statement punctuation. But it is up to the compiler to say which character(s) or code point(s) represent the ISO ;.

The ISO standards also require compilers to define a separate execution character set to specify values that can be stored in char and used with the string library functions. The execution character set doesn't have to be the same as the source character set.

Edit: I should also mention that the rules for this stuff are changing a lot in ISO C23 and C++23. (Which standards I haven't yet personally adopted.) Basically the ISO 23 standards mandate compilers to support UTF-8 source files, and they map every source character in the ISO standard to its corresponding Unicode character.

[–] raspberriesareyummy@lemmy.world 2 points 2 days ago (1 children)

Mhh today I learned. That's wild. I would have thought that any sane person would allow only 7-bit ASCII for the source code, and forward-compatible character sets in strings (every standard iteration being allowed to add characters, but not remove them).

[–] mkwt@lemmy.world 3 points 2 days ago (1 children)

At the time that C was designed, ASCII was not a universal standard. It was one encoding competing with other encodings.

Ok that's a fair point I had overlooked. Thanks for explaining.

[–] VindictiveJudge@lemmy.world 33 points 3 days ago (1 children)

Wait, does C read like valley girl speech in Greek?

[–] raspberriesareyummy@lemmy.world 16 points 3 days ago

Shit - the next five weeks I'll read C++ lines in upspeak in my head :(

[–] Australis13@fedia.io 133 points 4 days ago (2 children)

I don't know whether to be impressed or horrified.

[–] obelisk_complex@piefed.ca 93 points 4 days ago

"Both" is also acceptable.

[–] Solemarc@lemmy.world 2 points 1 day ago

Imagine reinventing preprocessor macros in Go...

[–] mercano@lemmy.world 33 points 3 days ago

Things to remember if you ever enter an obfuscated code competition.

[–] pooberbee@lemmy.ml 67 points 4 days ago

My old job legitimately did this in C++ with a Perl script because we had to be able to build on some weird, old systems and couldn't use C++ templates.

[–] 30p87@feddit.org 55 points 4 days ago (2 children)
[–] einkorn@feddit.org 51 points 4 days ago

They were too preoccupied with whether they could instead of asking whether they should.

[–] Deebster@infosec.pub 20 points 4 days ago (2 children)

I wonder if you could write a valid program in two different languages using this technique.

[–] 9point6@lemmy.world 41 points 4 days ago (1 children)

You can do it with any language where whitespace doesn't matter, combined with Whitespace (the language where only whitespace matters).

[–] SmackemWittadic@lemmy.world 8 points 3 days ago* (last edited 3 days ago) (1 children)

That's what I use to show people the exact message I sent before. It gets around any app that doesn't let you send blank messages. I have it saved on my clipboard for this

[–] Natanael@slrpnk.net 22 points 4 days ago (1 children)

Absolutely, that's a polyglot file

[–] vrek@programming.dev 35 points 3 days ago (1 children)

Not quite this exact case but I love showing people https://github.com/mame/quine-relay

It has 128 languages: it starts with Ruby, which prints its own source code as a Scala program; the Scala program executes to generate the next source, and so on through all 128 languages until it returns to the original Ruby code.

For extra fun, look at the source code on a large monitor.

[–] cypherpunks@lemmy.ml 5 points 3 days ago

amazing! it's great to see that is still being maintained after so many years.

[–] trem@lemmy.blahaj.zone 48 points 3 days ago

I had to start reading that three times over, because I saw they mentioned "Canadian" and just assumed the angle brackets were a joke in reference to the Canadians in South Park:

Drawn characters with angles for their mouths.

[–] yetAnotherUser@lemmy.ca 38 points 3 days ago

Uncaffeinated needs Lisp in their life. The programming language doesn't have a feature you need? Implement it yourself 👍

[–] poopsmith@lemmy.ml 36 points 3 days ago (4 children)

The OOP goons eventually won and Go added generics a few years back.

[–] gnutrino@programming.dev 27 points 3 days ago

Generics aren't really OOP; OOP tends to use run-time dynamic dispatch through inheritance. Generics come from functional programming's type constructors.

[–] Corbin@programming.dev 24 points 3 days ago

You're thinking of architecture astronauts when talking about generics. The biggest win of the object-oriented folks was to get a garbage collector included by default; compare and contrast with Rust, which ended up not having garbage collection.

[–] firelizzard@programming.dev 2 points 3 days ago

You make that sound like a bad thing

[–] LodeMike@lemmy.today 12 points 3 days ago (2 children)

Doesn't Go actually have generics?

[–] dbx12@programming.dev 22 points 3 days ago

Didn't have them nine years ago. Let alone three (?).

[–] anon_8675309@lemmy.world 7 points 3 days ago

That’s the problem with the internet, it has no memory.

[–] freeman@sh.itjust.works 6 points 3 days ago