💩 is part of Unicode 6. But how many environments can truly support 💩?
Confused? Disgusted? A good starting point is the glorious history of 💩.
| encoding | output | notes |
|---|---|---|
| Hex | f0 9f 92 a9 | |
| Base64 | 8J+SqQ== | |
| URL encoding | %F0%9F%92%A9 | |
| MIME header, Q encoding | =?utf-8?q?=F0=9F=92=A9?= | |
| MIME header, B encoding | =?utf-8?b?8J+SqQ==?= | |
| punycode | xn--ls8h | e.g. http://💩.github.io/ |
| MD5 | bd49d549f7c1f0169d6d61322a02d39d | |
| SHA1 | 82ab1e5bf66129bdbb3d5477dfe48bfcb2545cbd | |
| JavaScript string | "\ud83d\udca9" | |
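Most of these forms fall straight out of the character's UTF-8 bytes (f0 9f 92 a9). Here's a minimal Python sketch, purely illustrative and not part of this repo, that derives the values in the table above:

```python
import base64
import hashlib
import urllib.parse

poo = "\U0001F4A9"                          # 💩 PILE OF POO
utf8 = poo.encode("utf-8")

print(utf8.hex(" "))                        # f0 9f 92 a9
print(base64.b64encode(utf8).decode())      # 8J+SqQ==
print(urllib.parse.quote(poo))              # %F0%9F%92%A9
print(poo.encode("punycode").decode())      # ls8h -- prepend xn-- for an IDNA label
print(hashlib.md5(utf8).hexdigest())        # should match the MD5 row above
print(hashlib.sha1(utf8).hexdigest())       # should match the SHA1 row above
print(poo.encode("utf-16-be").hex())        # d83ddca9 -> the JS surrogate pair "\ud83d\udca9"
```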
| proto | 💩? | Notes |
|---|---|---|
| HTTP Headers | sort of | YMMV; you might need to fall back to MIME or percent encoding (see the sketch below) |
| HTTP Responses | 👍 | duh, you're reading an HTTP response that's full of 💩 |
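For the "sort of" in the headers row, here's a hedged sketch of the two workarounds, again in Python and again just an illustration: an RFC 2047 encoded-word for the MIME route, and plain percent encoding. The X-Poo header name is made up for the example.

```python
import urllib.parse
from email.header import Header

poo = "\U0001F4A9"  # 💩

# RFC 2047 encoded-word; Python picks the shorter of Q and B here, so B (base64) wins.
mime_value = Header(poo, "utf-8").encode()
print(mime_value)                 # =?utf-8?b?8J+SqQ==?=

# Percent encoding, the other workaround from the table.
print(urllib.parse.quote(poo))    # %F0%9F%92%A9

# A hypothetical header line you could try sending:
print(f"X-Poo: {mime_value}")
```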
| Language | String | Identifier | Notes |
|---|---|---|---|
| Factor |  |  | Minor: typing 💩 janks the cursor for the text following it |
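To illustrate what the String and Identifier columns are probing (Python is used only as an example here, not as a new row in the table): many languages accept 💩 inside a string literal but reject it as an identifier.

```python
# 💩 inside a string literal: no problem.
s = "this string is full of 💩"
print(len(s), s)                  # Python 3 counts 💩 as a single code point

# 💩 as an identifier: rejected, since emoji are not in the XID_Start/XID_Continue sets.
try:
    compile("💩 = 1", "<example>", "exec")
except SyntaxError as exc:
    print("identifier rejected:", exc.msg)
```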
| Browser | 💩? |
|---|---|
| Chrome / Chromium | 👍 (partial on OS X) |
| Safari | 👍 |
Found a new way to use and abuse 💩? I'd love to include it, so submit a pull request! Only one rule: every commit message must contain at least one "💩".