
Raw Notes (92)

Plaintext raw notes containing useful resources, incomplete thoughts, ideas, and learnings as I go about my day. You can subscribe to the RSS feed to stay updated.


Replit partners with Notion

Replit has partnered with Notion: Notion can now be used to host content for apps you build with Replit – think of Notion as the backend for your apps. They also have a quick YouTube video explaining how it works.

I think this will be a great tech stack for people who love Notion and want to start a blog with it: write a post in Notion and it goes live on the website.

I tried searching on Google to see if someone has created something interesting using this setup, but couldn't find anything yet. I'm sure we'll see some cool use cases in the next few weeks as more people learn about it.


A poll about Windsurf, Cursor, and VS Code

Theo Browne ran polls asking "Which IDE do you use primarily?" on X (Twitter), LinkedIn, and YouTube and the results are really interesting.

Below are the results from all these platforms at the time of writing. There are still a few hours left before the polls close, but I'm sure the shares won't change drastically.

Platform    Total Votes    Windsurf    Cursor    VS Code    Others
X           43,523         4.7%        30.5%     30.6%      34.1%
LinkedIn    4,172          4%          28%       47%        21%
YouTube     18,000         2%          18%       50%        30%

The most interesting thing is that VS Code is winning in all 3 polls, Cursor is in second place, and Windsurf is last. Also interesting: thousands of people are still not using any of these three IDEs.

Also, from VS Code's huge share, I can infer that many of those users must be on GitHub Copilot, Cline, or other AI assistants, and some must not be using AI at all.
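For a rough overall picture, here's the vote-weighted combined share across the three polls, computed from the numbers in the table above:

```python
# Combined share of each IDE across the three polls, weighted by vote
# count (numbers taken from the table above).
polls = {
    "X":        {"total": 43523, "Windsurf": 4.7, "Cursor": 30.5, "VS Code": 30.6},
    "LinkedIn": {"total": 4172,  "Windsurf": 4.0, "Cursor": 28.0, "VS Code": 47.0},
    "YouTube":  {"total": 18000, "Windsurf": 2.0, "Cursor": 18.0, "VS Code": 50.0},
}

grand_total = sum(p["total"] for p in polls.values())

def combined_share(ide: str) -> float:
    """Vote-weighted share of one IDE across all three polls, in percent."""
    votes = sum(p["total"] * p[ide] / 100 for p in polls.values())
    return round(100 * votes / grand_total, 1)

for ide in ("Windsurf", "Cursor", "VS Code"):
    print(ide, combined_share(ide))
# VS Code ends up around 37%, Cursor around 27%, Windsurf around 4%
```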


Cloud computing – own nothing and be happy

I really love this post from DHH talking about how cloud computing turns you into a renter for life – you actually own nothing and are still expected to be happy. To quote DHH exactly:

Cloud computing is the most successful "you will own nothing and be happy" psyop in history. We gave up on DARPA's beautifully, decentralized design for the internet to become renters for life. Tragic.

While this totally makes sense, I think we don't have easily digestible information about self-hosting on the internet. I haven't looked deeply into it, but I still think self-hosting should be more normalized among devs.

In fact, one person pointed this out:

Someone should disrupt the setup/ops by making it actually EASY to learn.


Unsloth AI makes fine-tuning easier

Was reading about Unsloth AI and how it can be used to fine-tune open-source models like Qwen3, Llama 4, Gemma 3, Phi-4, etc. faster and with 70% less memory. It's open source and supports the LoRA and QLoRA fine-tuning methods.

I got to know about this from this Andrea Volpini post about a fine-tuned reasoning model that thinks like an SEO. They created SEOcrate_4B_grpo_new_01 by fine-tuning the Gemma 3 4B model with Unsloth.

Unsloth AI has a free plan that supports fine-tuning Mistral, Gemma, and LLaMA models that you can run on your computer. They also have multiple beginner-friendly Google Colab notebooks for the different models you might want to train.

I have yet to try it, but I will go through their documentation and try fine-tuning on some of my own data.


From RSS feed to Bluesky and Mastodon via n8n

I am consistently taking notes in the raw section of my blog and wanted every new note to be posted to Mastodon and Bluesky as a URL. I used n8n to achieve this automation successfully – even though n8n doesn't have official Mastodon or Bluesky nodes.

In this post, I will explain how I set this up:

First, I used the RSS Feed Trigger node in n8n so that the workflow auto-triggers every time a new raw note is published. It gives me the most recent post in the format below, and I can use this data to publish on both platforms.

[
  {
    "title": "",
    "link": "",
    "pubDate": "",
    "content": "",
    "contentSnippet": "",
    "id": "",
    "isoDate": ""
  }
]

From RSS feed to Mastodon

I needed the following 3 things for this automation via n8n:

  1. RSS feed URL: it's deepakness.com/feed/raw.xml in my case
  2. Mastodon instance URL: my account is at mastodon.social
  3. Mastodon access token: I visited [INSTANCE_URL]/settings/applications, created a new application with full write scope, and copied the access token

After the previous RSS Feed Trigger node in n8n, I created another HTTP Request node, and entered the following information:

  1. Authentication: None
  2. Request Method: POST
  3. URL:
    https://[INSTANCE_URL]/api/v1/statuses?access_token=[ACCESS_TOKEN]
  4. Ignore SSL Issues (Insecure): OFF
  5. Response Format: JSON
  6. JSON/RAW Parameters: OFF
  7. Options: Nothing
  8. Body Parameters: Nothing
  9. Headers: Nothing
  10. Query Parameters:
    1. Name: status
    2. Value: [POST_CONTENT] from previous nodes

And this just worked – I didn't have to do anything else at all. If you're interested, you can learn more by going through their documentation.
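For reference, the same request that the n8n HTTP Request node sends can be sketched with Python's standard library. The instance and token below are placeholders, and the request is only built here, not sent:

```python
# Sketch of the Mastodon status call the n8n node above makes.
# INSTANCE_URL and ACCESS_TOKEN are placeholders.
from urllib.parse import urlencode
from urllib.request import Request

INSTANCE_URL = "mastodon.social"    # your Mastodon instance
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # from [INSTANCE_URL]/settings/applications

def build_status_request(text: str) -> Request:
    """POST to /api/v1/statuses with the status text and access token
    passed as query parameters, mirroring the n8n node's configuration."""
    query = urlencode({"status": text, "access_token": ACCESS_TOKEN})
    return Request(f"https://{INSTANCE_URL}/api/v1/statuses?{query}", method="POST")

req = build_status_request("New raw note: https://example.com/raw/some-note")
# To actually publish: urllib.request.urlopen(req)
```

Mastodon also accepts the token as an `Authorization: Bearer` header, which is generally preferred over putting it in the query string.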

From RSS feed to Bluesky

First, I needed to create a Bluesky session; only then was I able to publish. For this, I needed the following things:

  1. App password: Created a new app and got the app password from bsky.app/settings/app-passwords page
  2. Profile identifier: your handle, in the form [username].bsky.social

Node 1: HTTP Request

First, you need to create an HTTP Request node to create the session. Fill in the following info:

  1. Authentication: None
  2. Request Method: POST
  3. URL:
    https://bsky.social/xrpc/com.atproto.server.createSession
  4. Ignore SSL Issues (Insecure): OFF
  5. Response Format: JSON
  6. JSON/RAW Parameters: OFF
  7. Options: Nothing
  8. Body Parameters:
    1. Name: identifier
    2. Value: deepakness.bsky.social
    3. Name: password
    4. Value: [APP_PASSWORD]
  9. Headers: Nothing
  10. Query Parameters: Nothing

This node returns an accessJwt token, which you need in the next node to be able to publish the post.

Node 2: Date & Time

Yes, a date parameter is required by the Bluesky API, so you need to add a Date & Time node to get the current time:

  • Operation: Get Current Date
  • Include Current Time: ON
  • Output Field Name: currentDate
  • Options: Nothing

Node 3: HTTP Request

Now, I needed another HTTP Request node to be able to actually publish. Below are the options:

  1. Method: POST
  2. URL: https://bsky.social/xrpc/com.atproto.repo.createRecord
  3. Authentication: None
  4. Send Query Parameters: OFF
  5. Send Headers: ON
    1. Specify Headers: Using Fields Below
    2. Header Parameters:
      1. Name: Authorization
      2. Value: Bearer [accessJwt_from_previous_node]
  6. Send Body: ON
    1. Body Content Type: JSON
    2. Specify Body: Using JSON
    3. JSON: see here
    4. Options: Nothing
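For reference, the bodies these two HTTP Request nodes send can be sketched in Python. The handle and app password are placeholders, and the payloads are only built here, not sent:

```python
# Bodies for the two Bluesky calls above: createSession (Node 1) and
# createRecord (Node 3). Handle and app password are placeholders.
import json
from datetime import datetime, timezone

def session_payload(identifier: str, app_password: str) -> str:
    """Body for com.atproto.server.createSession; the response contains
    the accessJwt used as the Bearer token in the next call."""
    return json.dumps({"identifier": identifier, "password": app_password})

def post_payload(repo: str, text: str) -> str:
    """Body for com.atproto.repo.createRecord. createdAt is required,
    which is why the workflow needs the Date & Time node."""
    return json.dumps({
        "repo": repo,  # your handle or DID
        "collection": "app.bsky.feed.post",
        "record": {
            "$type": "app.bsky.feed.post",
            "text": text,
            "createdAt": datetime.now(timezone.utc).isoformat(),
        },
    })
```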

So far, it's working for me but I might further improve it in the future.


You're the average of your five closest companions

Came across this thought-provoking post from my friend Rohit that made me think about the subconscious impact of being surrounded by AI.

I am the average of my five closest people. What if three of them are AI?

Recently, I have been ‘conversing’ a lot with AI models. In fact, the time I spend with AI is about to exceed the time I spend with a lot of good friends.

The problem is, I don't know how it is shaping my personality. I don’t know what quirks I am imbibing from these models.

I know what I learn from AI consciously – the answers these AI models give. I don’t know what I am learning from them subconsciously.

Definitely some food for thought.


About alcohol

I saw Pieter Levels posting about alcohol and how badly it affects the person and the people around them, and I couldn't help but write this note. I completely agree with the points he makes, as I have seen families get destroyed due to excessive drinking in my village.

I am ~29 years old and have never had even a single sip of alcohol in my entire life. In fact, no one in my close family drinks, though many of my friends do. I don't have a problem with that – I still hang out with them a lot – but I am glad I never picked up the habit.

Not sure what the future holds, but I am very sure that I will have absolutely no reason to drink.


OpenAI acquires Windsurf

OpenAI finally acquired Windsurf for $3 billion. It had been in the news for some time, and I am excited about the acquisition because I am curious what OpenAI actually does to improve Windsurf.

Currently, I am mainly using Cursor AI for coding but if Windsurf actually gets improved after the acquisition, I would be very happy to switch. However, I do have Windsurf installed and I do sometimes use it for smaller tasks.

I want OpenAI to make Windsurf better without just pushing OpenAI models – it should include whichever other models are best for coding. At the moment, Claude 3.7 Sonnet and Gemini 2.5 Pro are the best coding models. I have tried OpenAI's GPT-4.1 and o3, but they're not up to the mark as of now.

Also, I learned that OpenAI tried buying Cursor twice before acquiring Windsurf, but, I think, the Cursor team didn't want to sell.


Setting up MastoFeed

I publish a lot in my /raw notes section, so I thought about connecting my raw feed to my Mastodon account so that whenever I publish a new note, it automatically gets published on the platform.

I was thinking about writing a Python or Node.js script to automate this, but then I looked it up online and found this tool called MastoFeed that does precisely what I wanted. It's completely free to use, and you can connect multiple feeds that publish to the same account. So I dropped the idea of a custom script, set up MastoFeed instead, and this is the very first post published through it.

Another option was Typefully. I use Typefully to post and schedule posts on X, Mastodon, Threads, and other platforms, so I could have used the Typefully API and set up an n8n automation for this. But I didn't want to publish this many posts on X, so the MastoFeed setup works best for me.

Update:

I waited for more than an hour but the post didn't get published, and I noticed other people reporting the same issue. So I have removed the setup for now and have already found a better solution using n8n.


iOS outside-app payments with Stripe

A US court ruled in favor of Epic Games, and Apple is now forced to allow iOS payments outside of the app. Earlier, Apple charged a 30% commission on revenue from in-app purchases and subscriptions.

And now that this is no longer the case, Stripe has enabled iOS developers to accept payments outside of the app, without heavy commissions. It's a super fast implementation from Stripe – they have already put together detailed documentation about it.

If you want some background about the Apple issue, here's a detailed article that explains how Apple is forced to accept external payments now.


Gemini 2.5 Pro plays Pokemon

They had the Gemini 2.5 Pro model play Pokémon, which it successfully completed – though it took the model 37 long days to finish the game. The run was livestreamed continuously on Twitch, in case you're interested.

Sundar Pichai tweeted this about it:

What a finish! Gemini 2.5 Pro just completed Pokémon Blue!

Special thanks to @TheCodeOfJoel for creating and running the livestream, and to everyone who cheered Gem on along the way.

Very impressive.


How to be a good technical writer

A person asked the question "i want to become insanely good at technical writing, any tips?" on X and received some good replies. Below, I am collecting the ones I liked:

Arpit Bhayani replied:

write a lot, a lot.

I have been writing since 2012, and it is a form of meditation for me.

I write everyday, it makes me think harder, it helps me articulate my thoughts well, more importantly it makes me more empathetic.

Ayush Chaurasia replied:

Imho, The popular advise 'write more' improves only grammar and vocabulary. neither are important for great technical writing.

What works best for me is writing with either of these in mind:

  1. What if I were explaining it to my friends
  2. What if I were to learn this over again.

Prateek Joshi replied:

writing is a lagging indicator of observing + doing + thinking. if you keep up the activity, you’ll have a bunch of interesting stuff to say (which can be translated into the written form).

Prabhdeep replied:

I write technical articles on medium, here are a few tips...

  1. Explain the "why" and how things tie in together. Explain things intuitively instead of dropping plain facts.
  2. Storyboard. Write the subtitles first, and then fill in the information. That way, your writing won't be all over the place.
  3. Spend time explaining the foundations briefly at the start. That way, nobody will be discouraged to read.
  4. Emphasize the applications. Why is this technical thing important?

And most importantly, write a lot. Over time, you'll begin to develop an intuition on how to write better. That's how I improved.

Vincy replied:

Remember the phrase "MMM: Meat, Material, Medium" as a framework for good content production. By Gary Vaynerchuk (GaryVee).

Breakdown of MMM in Content Creation:

Meat: The core substance or valuable message of the content.

Material: The quality of the content (research, storytelling, production value).

Medium: The platform or format where the content is shared (video, podcast, blog, etc.).

Gary Vee emphasizes that great content must have strong substance (Meat), be well-produced (Material), and be optimized for the right platform (Medium).

boisselle replied:

read more technical papers, copy them, rearrange them, write your own.

Arunachalam replied:

I recommend you to follow technical style guide used by freCodeCamp or Digital ocean.

I'm personally a technical writer at freeCodeCamp. It's a well recognised platform

Here's the link

Apart from these, most people suggested that the more you write, the better you become.

I think the best approach is to imitate others who are good at it, and then innovate when writing for yourself.


The good AI

Came across this blog post that talks about what good AI would look like, and I really like the simple points that the author makes:

  1. trained on data which has been accessed with consent
  2. doesn't hallucinate like current models
  3. balanced sustainability and energy consumption
  4. actually open source
  5. should be led by the community
  6. available and accessible to all

Regularly updated personal blogs list

Found this interesting project blogroll.org, which is basically a human-curated list of cool personal blogs that are regularly updated. You can filter blogs by topic and can also submit your own blog to the list.

ooh.directory is another such directory website containing over 2,000 blogs on different topics.


The Mobian project

Came across this crazy Mobian project that enables Android phones to run fully native Linux with KDE Plasma, as explained in this video. You can multi-task just like on a desktop, and also install and run any supported apps.

They have a detailed wiki that you can go through to start installing it on the devices it currently supports. The only con is that it currently doesn't support the front or rear camera; otherwise, all other functionality works perfectly.

I would love to learn more about it, and would definitely install it on a device to test it.


Auth and payments on Chrome extensions

Came across this tweet and started exploring different solutions for adding authentication and payments to Google Chrome extensions. I did find a few options that I have yet to try, but I am listing them all here for future reference:

  1. Authenticate users with Google: Found this page on Chrome docs that teaches you how to use Google auth for Chrome apps, and it seems very detailed as it covers most of the required things.
  2. Authenticate with Firebase: Yes, the Firebase docs also have a detailed page about using Firebase auth for Chrome extensions. At first look, it seems simpler than method #1 above.
  3. ExtensionPay: Found this amazing open-source API to accept payments in Chrome extensions. It supports multi-browser and multi-device login. They also have detailed documentation, and I noticed it mentioned in multiple places as the easiest to set up.
  4. BrowserBill: I also found this solution being mentioned as a way to easily monetize Chrome extensions.
  5. Plasmo: A complete open-source browser extension SDK that claims to make the development process faster – it calls itself the Next.js of browser extensions.
  6. Better-Auth for Extensions: Can be used as an auth solution for Chrome extensions, uses Plasmo in the background.
  7. ExtensionFast: Another React-based boilerplate with built-in payments, auth, database, etc.

Of course, one can directly use Stripe or PayPal for accepting payments, but those are a bit complicated to set up, at least for me at this point. I am seeing more and more people use the #3 ExtensionPay option for payments, as discussed in forums and communities.

While researching, I found some other useful stuff for building Chrome extensions as well:

  1. ChromeKit: A boilerplate with ReactJS and TailwindCSS to speed up the development process. Started at $49 at the time.
  2. TurboStarter: Another boilerplate, not specific to Chrome extensions, but it does have an option for extensions. Costlier; starts at $199.

And then I also found this informative blog post about different ways of earning money from Chrome extensions.


Coolify is awesome

Recently, I set up a Coolify instance on a Hetzner VPS and I am loving it so far. I didn't know I could do so many things with Coolify, all for free.

That said, the Coolify installation documentation should be a bit more descriptive for newcomers like me. There are many things not mentioned in the docs that I had to look up on Google or ask ChatGPT about. But it's good overall, and I got it working.

As of now, I have a Postgres database hosted on the VPS and have also hosted a simple yet full-stack Next.js app. It's working flawlessly.


Showing date as well as time in WordPress

I thought I would need another plugin to show the time in addition to the published/modified date in WordPress, but no. To be clear, I am talking about Full Site Editing themes.

When editing a template, click on the post/page date field and turn off the Default format toggle and then select the preferred format from the dropdown that appears, as you can see in this screenshot. You can choose something like Apr 29, 2025 4:40 AM and it would work perfectly.

And the same can be done for the last modified date: turn on Display last modified date, turn off the Default format toggle, and then select your preferred format.


Self-hosted n8n password reset

For some reason, my self-hosted n8n instance wasn't remembering the password and wouldn't let me log in with the same credentials I created the account with. And all the docs about resetting passwords are very confusing, whether deliberately or not.

But I was finally able to reset mine after spending almost an hour.

First, I logged into my server via SSH by running ssh root@<your-ip-address> and then ran the following command:

docker exec -u node -it n8n-docker-caddy-n8n-1 n8n user-management:reset

It returned the message "Successfully reset the database to default user state." After that, I restarted my Hetzner server (where n8n is hosted), the user was reset, and I could set a new password.

And yes, none of my n8n workflows got deleted in the process.


Keybindings to stage, commit, and sync at once

Came across this tweet that provides a keybinding to automatically stage, generate an AI commit message, commit, and sync with GitHub inside Cursor (below, I added the VS Code version as well). It's not much, but it saves 10-20 seconds each time you push your code.

Keybindings for Cursor AI

{
    "key":"ctrl+enter",
    "command":"runCommands",
    "args":{
        "commands":[
        {
            "command":"git.stageAll"
        },
        {
            "command":"cursor.generateGitCommitMessage"
        },
        {
            "command":"git.commitAll"
        },
        {
            "command":"git.sync"
        }
        ]
    }
}

To use this, press cmd+shift+p on macOS or ctrl+shift+p on Windows and type "Preferences: Open Keyboard Shortcuts (JSON)". It opens a file called keybindings.json where you have to add the above JSON code (do not delete the existing code though).

My entire keybindings.json file looks something like this:

// Place your key bindings in this file to override the defaults
[
    {
        "key": "cmd+i",
        "command": "composerMode.agent"
    },
    {
        "key":"ctrl+enter",
        "command":"runCommands",
        "args":{
            "commands":[
            {
                "command":"git.stageAll"
            },
            {
                "command":"cursor.generateGitCommitMessage"
            },
            {
                "command":"git.commitAll"
            },
            {
                "command":"git.sync"
            }
            ]
        }
    }
]

Keybindings for Copilot in VS Code

If you're using GitHub Copilot in VS Code, then the JSON would be like this:

{
    "key":"ctrl+enter",
    "command":"runCommands",
    "args":{
        "commands":[
        {
            "command":"git.stageAll"
        },
        {
            "command":"github.copilot.git.generateCommitMessage"
        },
        {
            "command":"git.commitAll"
        },
        {
            "command":"git.sync"
        }
        ]
    }
}

And then when you press ctrl+enter inside Cursor or VS Code, it automatically stages all changes, generates an AI commit message, commits, and pushes to GitHub within seconds.


Spent $4,300 on AI tools

From early 2023 until today, I have spent around $4,300 on AI tools trying, experimenting, and creating content. I have also tweeted about the same, but here's the breakdown:

Tool Name        Amount       Months Used    Total
OpenAI API       $3,384.00    –              $3,384.00
Anthropic API    $50.00       –              $50.00
Perplexity API   $25.00       –              $25.00
DeepSeek API     $10.00       –              $10.00
GroqCloud API    $15.00       –              $15.00
fal.ai API       $32.00       –              $32.00
ChatGPT Plus     $20.00/mo    15             $300.00
Claude Pro       $20.00/mo    6              $120.00
Cursor AI        $20.00/mo    8              $160.00
Cursor Extra     $45.00       –              $45.00
Bolt.new         $9.00/mo     4              $36.00
Lovable.dev      $25.00/mo    1              $25.00
v0.dev           $20.00/mo    3              $60.00
GitHub Copilot   $10.00/mo    1              $10.00
Ideogram         $20.00/mo    2              $40.00
Total                                        $4,312.00
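As a sanity check, the line items above do add up to the stated total:

```python
# Recomputing the table's total from its line items
# (monthly price × months for subscriptions, one-off amounts for APIs).
line_items = {
    "OpenAI API": 3384, "Anthropic API": 50, "Perplexity API": 25,
    "DeepSeek API": 10, "GroqCloud API": 15, "fal.ai API": 32,
    "ChatGPT Plus": 20 * 15, "Claude Pro": 20 * 6, "Cursor AI": 20 * 8,
    "Cursor Extra": 45, "Bolt.new": 9 * 4, "Lovable.dev": 25 * 1,
    "v0.dev": 20 * 3, "GitHub Copilot": 10 * 1, "Ideogram": 20 * 2,
}
total = sum(line_items.values())
print(f"${total:,}")  # $4,312
```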

The biggest spend is on the OpenAI API, mostly from the early days after the API launched, when I created a lot of content and ran a lot of experiments.

As of now, I am mainly using subscriptions of ChatGPT Plus, Cursor AI, and v0.dev along with the OpenAI API.


Getting updated docs for LLMs

The Upstash team has launched Context7, which provides updated documentation for LLMs and AI code editors. They also have an MCP server that can be conveniently connected to code editors like Cursor, Windsurf, VS Code, Claude Code, Claude Desktop, etc.

On the homepage, I see more than 4,500 different libraries listed, including all popular frameworks like Next.js, Laravel, Clerk, MongoDB, FastAPI, Supabase, etc. Clicking on any library takes you to a separate page where its docs are available in llms.txt or JSON format.

Currently, it's free to use as I don't see any mentions of pricing anywhere. But they also have an API waitlist page where it's mentioned that:

The Context7 API is currently in private preview. Enter your email and we'll let you know as soon as it's live.

So chances are it will become paid, or at least parts of it will be. I am perfectly okay with paying as long as it correctly does what it says.

I tried it with Cursor – just mentioning "use context7" in the prompt triggers the tool, and it does work. I am still testing it and will update soon.


Get data from local LLMs to Google Sheets

I created this GitHub repo ollama-py more than a year ago when experimenting with getting a local LLM's output into Google Sheets via Ollama, and it now has 20 stars on GitHub. That makes me super happy.

This is how it works:

  1. Run the local LLM via Ollama on your computer
  2. Run the provided Python script
  3. Set up ngrok to create a tunnel to access localhost:5001
  4. Use Google Sheets Apps Script to access locally running LLM's outputs
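The call the script in step 2 makes against the local model can be sketched like this (the model name is just an example, not necessarily what the repo uses, and the request body is only built here, not sent):

```python
# Builds the JSON body for Ollama's /api/generate endpoint – the kind
# of request the Python script forwards to the locally running model.
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default API port

def generate_body(prompt: str, model: str = "llama3") -> str:
    """stream=False makes Ollama return one JSON object, which is easier
    for the Apps Script side to parse than a token stream."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

body = generate_body("Summarize this cell's text in one line.")
# To call it: urllib.request.urlopen(Request(OLLAMA_URL, data=body.encode(), method="POST"))
```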

I also have a detailed blog post as well as a video on the same.


Is it okay to copy someone's HTML/CSS?

I came across this forum thread from 2007 where people discuss whether it's okay to copy somebody's HTML or not. Some people are in favour, some are against, and some answers are: it depends.

I absolutely love how useful these forum discussions used to be back then – discussions I never got the chance to participate in.

Arguments for

rocknbil says:

[...] the truth is the code itself most likely came from some other source in the first place, so in a sense we've all borrowed, exchanged, intermixed and added our own flavor to chunks of code to make it our own.

Those that would argue this are probably under the impression they have created something that has never been done before, and are most likely to jump in here and cry copyright. But the truth is, any chunk of html or CSS you copy from a site is likely to be out there in duplicate thousands of times.

BeeDeeDubbleUm says:

It's the nature of the Internet to copy ideas and concepts. Who hasn't used some CSS snippet which originally came from somewhere else? If an idea is seen to work then other people will use it and that's how the Internet has developed.

Dabrowski says:

[...] with HTML/CSS being such a simple 'language', there's generally only 1 or 2 ways to produce a given page, and most people would use it.

If you wrote it from scratch you'd end up with something pretty similar, so go ahead and pinch the code. I would.

Demaestro says:

It isn't stealing and no there is nothing wrong with it. Being inspired by a site and using some of it as a jumping off point is good time management.

Show me an original HTML/CSS website and I will show you original HTML/CSS site that is similar. It is called re-purposing... why re-invent the wheel?

Josefu says:

The answer is pretty simple, but a few of you are mixing things up a bit. Correct that HTML (or any other markup language) is not "computer code".

Arguments against

8kobe says:

I would say that by the letter of the law it is illegal. However it would be pretty difficult to prove that you did it unless the layouts are very unique. [...] However if someone is doing something unique and different and you copy it they may be able to come after you.

swa66 says:

Copyright is never intended to protect the way to do something (the idea), but is intended to protect the expression of that idea.

E.g. it'll does not protect the idea of writing a novel where the butler did it, but it'll protect the words, the drawings, etc. of the cover, the text of the book etc.

So in my book:

  • if you copy a css file: copyright problem
  • if you look at other css files to see how they solve a problem, no problem at all from a copyright point of view if you use the same method to solve the problem. (might be a patent issue, but living where software patents are void and nonexistent I really don't care about them at all).

Basically, for the most part, it's okay to be inspired by a certain design on the web. And since you wouldn't use the HTML/CSS code verbatim without modifying it for your website, there shouldn't be a problem.


Access Prisma Postgres from frontend

Prisma has introduced a way to securely talk to your database directly from the frontend, without needing an API layer between the client and the database. It's still in early access; they have shared the details in this blog post, and below is what they say:

Security rules in Prisma Postgres allow you to:

  • have an authenticated connection to your database
  • define fine-grained permission rules in plain TypeScript

I think this can be huge once it's properly tested by the community.