
Look, I need to show you something that might hurt your feelings:
```sql
CREATE TABLE MongoDB (_id UUID PRIMARY KEY, data JSONB);
```
Before you @ me in the comments, hear me out. What if I told you that 90% of your NoSQL use cases could be handled by this one weird Postgres trick that database admins don't want you to know about?
The Problem: Why We Think We Need NoSQL
We've all been there. It's 2 AM, you're knee-deep in a schema migration, and you're questioning every life choice that led you to this moment. Your product manager just asked for "just one more field" and now you're writing migration scripts like it's 2009.
"Maybe I should just use MongoDB," you whisper to yourself. "Schema flexibility! No migrations! Document storage!"
But here's the thing: You probably don't need MongoDB. You need JSONB.
Enter JSONB: The Hero We Don't Deserve
JSONB isn't just JSON slapped into a Postgres column. Oh no, my friend. It's JSON's cooler, faster, more attractive older sibling who went to the gym and learned how to use indexes.
Here's what makes JSONB special:
- Binary storage format (the B stands for Binary, not 🐝)
- GIN indexes that make queries stupid fast
- Native operators that would make a JavaScript developer weep with joy
- Full SQL power combined with NoSQL flexibility
It's like MongoDB and Postgres had a baby, and that baby grew up to be a superhero.
Mind-Blowing Features Most Devs Don't Know About
The Operators That Will Change Your Life
```sql
-- The containment operator @>
-- "Does this JSON contain this structure?"
SELECT * FROM users WHERE preferences @> '{"theme": "dark"}';

-- The existence operator ?
-- "Does this key exist?"
SELECT * FROM products WHERE attributes ? 'wireless';

-- The arrow operators -> and ->>
-- -> returns JSON, ->> returns text
SELECT data->>'name' AS name,
       data->'address'->>'city' AS city
FROM users;

-- The path operator #>
-- Navigate deep into nested JSON
SELECT * FROM events
WHERE data #> '{user,settings,notifications}' = 'true';
```
Indexing Specific JSON Paths (Wait, What?)
This is where things get spicy. You can create indexes on specific paths within your JSON:
```sql
-- Index a specific field
CREATE INDEX idx_user_email ON users ((data->>'email'));

-- Index for existence queries
CREATE INDEX idx_attributes ON products USING GIN (attributes);

-- Index for containment queries
CREATE INDEX idx_preferences ON users USING GIN (preferences);
```
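One more trick worth knowing: GIN indexes on JSONB accept an alternative operator class, `jsonb_path_ops`, which only supports containment-style operators (`@>`, `@@`, `@?`) but produces smaller, faster-to-search indexes than the default. A quick sketch, reusing the `products` table from the examples below:

```sql
-- jsonb_path_ops trades generality for speed: it cannot serve
-- key-existence (?) queries, but containment (@>) lookups get
-- a smaller index and usually faster searches than the default jsonb_ops.
CREATE INDEX idx_attributes_path ON products
USING GIN (attributes jsonb_path_ops);

-- This query can use the index:
SELECT * FROM products WHERE attributes @> '{"brand": "Apple"}';
```

If you only ever query with `@>`, `jsonb_path_ops` is usually the better default.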
Now your JSON queries are faster than your coworker who claims they "don't need indexes because MongoDB handles it."
Full-Text Search Inside JSON 🤯
Hold onto your keyboards:
```sql
-- Add full-text search to JSON fields
CREATE INDEX idx_content_search ON articles
USING GIN (to_tsvector('english', data->>'content'));

-- Search like a boss
SELECT * FROM articles
WHERE to_tsvector('english', data->>'content')
      @@ plainto_tsquery('postgres jsonb amazing');
```
Real Code Examples (The Meat)
Let's start with something practical. Say you're building a SaaS product (like UserJot - shameless plug for my feedback management tool) and you need to store user preferences:
```sql
-- The hybrid approach: structured + flexible
CREATE TABLE users (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    email TEXT NOT NULL UNIQUE,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    preferences JSONB DEFAULT '{}',
    metadata JSONB DEFAULT '{}'
);

-- Insert a user with preferences
INSERT INTO users (email, preferences) VALUES (
    'john@example.com',
    '{
        "theme": "dark",
        "notifications": {
            "email": true,
            "push": false,
            "frequency": "daily"
        },
        "features": {
            "beta": true,
            "advancedAnalytics": false
        }
    }'
);

-- Query users who have dark theme AND email notifications
SELECT email FROM users
WHERE preferences @> '{"theme": "dark", "notifications": {"email": true}}';

-- Update nested preferences
UPDATE users
SET preferences = jsonb_set(preferences, '{notifications,push}', 'true')
WHERE email = 'john@example.com';
```
The Event Log Pattern (Chef's Kiss)
This is where JSONB absolutely shines:
```sql
CREATE TABLE events (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    event_type TEXT NOT NULL,
    user_id UUID,
    occurred_at TIMESTAMPTZ DEFAULT NOW(),
    data JSONB NOT NULL
);

-- Index for fast event type + data queries
CREATE INDEX idx_events_type_data ON events (event_type)
WHERE event_type IN ('purchase', 'signup', 'feedback');

CREATE INDEX idx_events_data ON events USING GIN (data);

-- Insert different event types with different schemas
-- (user_id must be a real UUID; the value below is illustrative)
INSERT INTO events (event_type, user_id, data) VALUES
('signup', '00000000-0000-0000-0000-000000000123', '{
    "source": "google",
    "campaign": "summer-2024",
    "referrer": "blog-post"
}'),
('purchase', '00000000-0000-0000-0000-000000000123', '{
    "items": [
        {"sku": "PROD-1", "quantity": 2, "price": 49.99},
        {"sku": "PROD-2", "quantity": 1, "price": 19.99}
    ],
    "discount": "SUMMER20",
    "total": 99.97
}'),
('feedback', '00000000-0000-0000-0000-000000000123', '{
    "type": "feature_request",
    "title": "Add dark mode",
    "priority": "high",
    "tags": ["ui", "accessibility"]
}');

-- Find all purchases with a specific discount
SELECT * FROM events
WHERE event_type = 'purchase'
  AND data @> '{"discount": "SUMMER20"}';

-- Calculate total revenue from events
SELECT SUM((data->>'total')::NUMERIC) AS total_revenue
FROM events
WHERE event_type = 'purchase'
  AND occurred_at >= NOW() - INTERVAL '30 days';
```
Product Catalog with Dynamic Attributes
This is the example that makes MongoDB developers question everything:
```sql
CREATE TABLE products (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    name TEXT NOT NULL,
    price NUMERIC(10,2) NOT NULL,
    attributes JSONB DEFAULT '{}'
);

-- Insert products with completely different attributes
INSERT INTO products (name, price, attributes) VALUES
('iPhone 15', 999.00, '{
    "brand": "Apple",
    "storage": "256GB",
    "color": "Blue",
    "5g": true,
    "screen": {
        "size": "6.1 inches",
        "type": "OLED",
        "resolution": "2532x1170"
    }
}'),
('Nike Air Max', 120.00, '{
    "brand": "Nike",
    "size": "10",
    "color": "Black/White",
    "material": "Mesh",
    "style": "Running"
}'),
('The Pragmatic Programmer', 39.99, '{
    "author": "David Thomas",
    "isbn": "978-0135957059",
    "pages": 352,
    "publisher": "Addison-Wesley",
    "edition": "2nd"
}');

-- Find all products with 5G
SELECT name, price FROM products
WHERE attributes @> '{"5g": true}';

-- Find products by brand
SELECT * FROM products
WHERE attributes->>'brand' = 'Apple';

-- Complex query: find all products with screens larger than 6 inches
-- (the size is stored as "6.1 inches", so strip the unit before casting)
SELECT name, attributes->'screen'->>'size' AS screen_size
FROM products
WHERE (split_part(attributes->'screen'->>'size', ' ', 1))::FLOAT > 6.0;
```
When JSONB Absolutely Destroys (Use Cases)
Here's where you should absolutely use JSONB:
User Preferences/Settings: Every user wants different things. Don't create 50 boolean columns.
Event Logs: Different events = different data. JSONB handles it like a champ.
Product Catalogs: Books have ISBNs, shoes have sizes, phones have screen resolutions. One schema to rule them all.
API Response Caching: Store that third-party API response without parsing it.
Form Submissions: Especially when you're building something like UserJot where user feedback can have custom fields.
Feature Flags & Configuration:
```sql
CREATE TABLE feature_flags (
    key TEXT PRIMARY KEY,
    config JSONB
);

INSERT INTO feature_flags VALUES ('new_dashboard', '{
    "enabled": true,
    "rollout_percentage": 25,
    "whitelist_users": ["user-123", "user-456"],
    "blacklist_countries": ["XX"],
    "start_date": "2024-01-01",
    "end_date": null
}');
```
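Reading a flag back is just as tidy. A sketch of a flag check against the table above (the user id is illustrative; note that `@>` against a JSONB array tests element membership):

```sql
-- Is the flag globally enabled AND is this user whitelisted?
SELECT (config->>'enabled')::BOOLEAN
       AND config->'whitelist_users' @> '"user-123"' AS flag_on
FROM feature_flags
WHERE key = 'new_dashboard';
```

Rollout-percentage and date-window logic can live in the same query, or in application code reading the one JSONB value.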
The Plot Twist: When You Still Need Real Columns
Let's be real for a second. JSONB isn't always the answer. Here's when you should use regular columns:
- Foreign Keys: You can't reference JSONB fields in foreign key constraints
- Heavy Aggregations: SUM, AVG, COUNT on JSONB fields are slower
- Frequent Updates: Updating a single JSONB field rewrites the entire JSON
- Type Safety: When you REALLY need that data to be an integer
The secret sauce? A hybrid approach:
```sql
CREATE TABLE orders (
    id UUID PRIMARY KEY,
    user_id UUID REFERENCES users(id),    -- Real FK
    total NUMERIC(10,2) NOT NULL,         -- For fast aggregations
    status TEXT NOT NULL,                 -- For indexed lookups
    created_at TIMESTAMPTZ DEFAULT NOW(),
    line_items JSONB,                     -- Flexible item details
    metadata JSONB                        -- Everything else
);
```
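If you later find yourself aggregating over a JSONB field constantly, you don't have to choose between rewriting the schema and eating the cast cost on every query. A sketch, assuming Postgres 12+ and that `line_items` always holds a JSON array:

```sql
-- A stored generated column materializes a value derived from the JSONB,
-- giving you a real, typed, indexable column that Postgres keeps in sync.
ALTER TABLE orders
    ADD COLUMN item_count INT
    GENERATED ALWAYS AS (jsonb_array_length(line_items)) STORED;

-- Now aggregations and filters hit a plain integer column
CREATE INDEX idx_orders_item_count ON orders (item_count);
```

This is the escape hatch for the "heavy aggregations" caveat above: keep the flexible JSONB, promote the hot fields.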
The Grand Finale: Migration Strategy
Here's how to migrate from MongoDB to Postgres/JSONB:
```python
# Pseudo-code for the brave
import psycopg2
from psycopg2.extras import Json
from pymongo import MongoClient

# Connect to both
mongo = MongoClient('mongodb://localhost:27017/')
postgres = psycopg2.connect("postgresql://...")
cursor = postgres.cursor()

# Migrate with style
for doc in mongo.mydb.mycollection.find():
    cursor.execute(
        "INSERT INTO my_table (id, data) VALUES (%s, %s)",
        (str(doc['_id']), Json(doc))
    )
postgres.commit()
```
Try This One Query and Tell Me It's Not Magic
Here's your homework. Create this table and run this query:
```sql
-- Create a table
CREATE TABLE magic (
    id SERIAL PRIMARY KEY,
    data JSONB
);

-- Insert nested, complex data
INSERT INTO magic (data) VALUES
('{"user": {"name": "Alice", "scores": [10, 20, 30], "preferences": {"level": "expert"}}}'),
('{"user": {"name": "Bob", "scores": [5, 15, 25], "preferences": {"level": "beginner"}}}');

-- Mind-blowing query: find users with average score > 15 AND expert level
SELECT data->'user'->>'name' AS name,
       (SELECT AVG(value::INT)
        FROM jsonb_array_elements_text(data->'user'->'scores') AS value) AS avg_score
FROM magic
WHERE data @> '{"user": {"preferences": {"level": "expert"}}}'
  AND (SELECT AVG(value::INT)
       FROM jsonb_array_elements_text(data->'user'->'scores') AS value) > 15;
```
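For extra credit: since Postgres 12 you can write the expert-level filter in SQL/JSON path syntax, which reads a lot like a MongoDB query. A sketch against the same `magic` table:

```sql
-- jsonb_path_exists evaluates a jsonpath expression against the document;
-- the ? (...) filter matches items where the predicate holds.
SELECT data->'user'->>'name' AS name
FROM magic
WHERE jsonb_path_exists(data, '$.user.preferences ? (@.level == "expert")');
```

Same result as the `@>` containment filter above, but the jsonpath form also handles comparisons (`>`, `<`) that containment can't express.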
If that doesn't make you reconsider your MongoDB addiction, I don't know what will.
Bonus: The Ultimate JSONB Cheat Sheet
```sql
-- Operators
@>   -- Contains
<@   -- Is contained by
?    -- Key exists
?|   -- Any key exists
?&   -- All keys exist
||   -- Concatenate
-    -- Delete key/element
#-   -- Delete at path

-- Functions
jsonb_set()          -- Update value at path
jsonb_insert()       -- Insert value at path
jsonb_strip_nulls()  -- Remove null values
jsonb_pretty()       -- Format for humans
jsonb_agg()          -- Aggregate into array
jsonb_object_agg()   -- Aggregate into object
```

Performance tips:

1. Use GIN indexes for `@>` and `?` operators
2. Use B-tree expression indexes for `->>` on specific fields
3. Use partial indexes for common queries
4. Don't nest more than 3-4 levels deep
5. Keep JSONB documents under 1MB
The Real Talk
Look, I'm not saying MongoDB is bad. It has its place. But before you reach for a separate NoSQL database, ask yourself: Could JSONB do this?
9 times out of 10, the answer is yes. And you get to keep:
- ACID transactions
- Joins when you need them
- Your existing Postgres knowledge
- One less database to manage
- Money in your pocket (Postgres is free!)
At UserJot, we use JSONB extensively for storing user feedback metadata, custom fields, and integration configurations. It gives us MongoDB-like flexibility with Postgres reliability. Best of both worlds.
Now go forth and `@>` all the things! Drop a comment with your wildest JSONB use case. I'll be here, answering questions and probably making more bad database jokes.
P.S. - That MongoDB table at the beginning? It actually works. I'm not saying you should use it, but... you could. 😈
P.P.S. - If you're collecting user feedback and want something better than a JSONB column (though honestly, JSONB would work), check out UserJot. We built it with lots of JSONB magic under the hood.
Top comments (29)

JSONB is well known, but so, so slow compared to doing aggregations with MongoDB. For simple use cases, yes, you don't need that performance, but by simple I mean: your personal blog. In 12 years I have never encountered any enterprise app where JSONB was a good choice in the long run.

I agree. I tried this with a GIN index to see if it really is faster than Mongo.
It wasn't.
Writes were extremely slow.
Reads and updates were keeping up with the help of indexes.
Git link: github.com/faeiiz/Jsonb-Postgresql

The only time I have used JSONB was when I was capturing update logs in a log table in Postgres, storing the data the user had changed. Two columns: old_data and new_data (the updated value).
Didn't want to store the whole object.

That sounds like a good use case

Pretty epic seeing someone call out most NoSQL hype like this - ever find yourself picking tech just to avoid migrations, or is it usually about performance?

While JSONB is great, just from the examples I see overuse of the column type.
User Preferences/Settings: Every user wants different things. Don't create 50 boolean columns.
Create a preferences table with an id and the name of the preference, and have a pivot table with the ids of preferences and user ids.
Product catalog and form submissions, a similar solution.
I see it more as a shortcut to avoid data normalization than as splitting document-structured data from relational data.
While they are pushing you to their Atlas solution, there is still the free community server. No need to pay, just a willingness to set it up.
From the post you made I see you had a good experience with JSONB, and that is great. The problem I have is that the information you provide is skewed by that experience. It is a trap I fell for many times, so I'm as fallible as you. Over time I learned all tools have their good and bad sides, and we have to use them for their good sides.

This is a really great article. A gem in a pile of the AI crap usually posted on DEV.to nowadays.
And I agree that JSONB brings the best of both worlds. We moved from MongoDB two years ago using exactly the same strategy. We also combine this with static JSON files stored in bucket storage.
The big plus is also full-text search. This cannot be achieved in MongoDB but is easily done with JSONB.

- Work: Developer Advocate at MongoDB
This approach works only on very small databases because indexing this structure is more challenging and limited compared to document databases like MongoDB. Using UUIDv4 for IDs can be harmful (UUID PRIMARY KEY DEFAULT gen_random_uuid()), as these IDs are scattered throughout the table, which can negatively impact performance, especially with shared buffer caches and B-Trees.
But there is worse with the single JSONB column. I've written a series on the pain points people experience when using PostgreSQL as a document database: the lack of Index Only Scan, the lack of range or sort covering with GIN indexes, and the read and write amplification: dev.to/mongodb/no-hot-updates-on-j...

This reminds me of back in the day when XML support was added to MSSQL and we could store SOAP responses directly into our columns. It was magic. Looking back, it was bloated and needed wizardry to parse elements. In fact, I see a cycle: SOAP+XML, REST+JSON, gRPC+Protobuf, and now JSON-RPC+JSON.