As an HR initiative, every Monday we receive an email about an inspiring story. While some of the stories are great, others are just okay, and some have become so repetitive that they have lost their inspirational luster.

Today, I decided to try an experiment, and this is the result of that experiment.

In ancient India, there lived a young couple named Rohit and Radhika. They were deeply in love and were considered a perfect match by all who knew them. However, one day, Radhika began to doubt Rohit’s love and accused him of being unfaithful. Rohit was heartbroken and tried his best to prove his love, but Radhika remained suspicious.

One day, Lord Vishnu appeared before the couple in the guise of an old sage. He saw their distress and offered to help. Rohit and Radhika agreed, and the sage shared with them a story from Hindu mythology about the importance of trust in a relationship.

The sage told the story of a mighty king named Raja Janaka, who had everything a man could ever want – wealth, power, and a loving wife. Despite all of this, the king was not happy. One day, Lord Vishnu appeared before him and asked why he was so unhappy. The king confessed that he had lost the trust of his wife, and that his marriage was on the verge of collapse.

Lord Vishnu then told the king a parable about two pots of gold. One pot was placed in a safe, while the other was kept in an open field. The king was asked to guess which pot of gold would be safe from theft. The king replied that the pot in the safe would be safe, as it was protected by strong walls and a lock.

Lord Vishnu then revealed that the pot in the open field was actually safe. The king was confused and asked how that was possible. Lord Vishnu explained that the pot in the open field was guarded by the trust of the villagers, and that no one would dare to steal from it.

The sage then explained to Rohit and Radhika that just like the pot of gold, a relationship needs trust to be safe and secure. Without trust, a relationship will be like the pot in the safe – vulnerable to theft and destruction. But with trust, a relationship will be like the pot in the open field – guarded and protected.

Rohit and Radhika listened to the sage’s words and realized that their relationship was like the pot in the safe. They made a vow to each other to nurture their relationship by planting the seeds of trust and watering it with love and understanding.

– an experiment

I won’t disclose the details of the experiment in this post, but I promise to reveal it in a future post. If you have already figured out what the experiment was, please share your thoughts in the comments section below.


Handling Multiple Inputs with argparse in Python Scripts

argparse demo for multiple inputs

The problem.

ffmpeg allows multiple inputs to be specified using the same keyword, like this:

ffmpeg -i input1.mp4 -i input2.webm -i input3.mp4

Let’s say you are trying to write a script in Python that accepts multiple input sources and does something with each one, as follows:

python_script -i input1.mp4 -i input2.webm -i input3.mp4

How do we do this in argparse?

Using argparse, you run into an issue: with the default action, each occurrence of an option flag overwrites the previous one, so only the last value survives. You may know how to associate multiple arguments with a single option (using nargs='*' or nargs='+'), but that still won’t allow you to use the -i flag multiple times.

How can this be accomplished?

Here’s sample code that accomplishes this using the argparse library:

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('-i', '--input', action='append', type=str, help='input file name')

args = parser.parse_args()
inputs = args.input or []  # append leaves None if -i was never given

# Process each input
for infile in inputs:  # "infile" avoids shadowing the builtin input()
    # Do something with the input file
    print(f'Processing input: {infile}')

With this code, the input can be passed as: -i input1.mp4 -i input2.webm -i input3.mp4

The key to the whole program is the value "append" passed to the action keyword.
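As a quick self-contained check (the filenames are just placeholders), parse_args can be called with an explicit list to see what action='append' produces:

```python
import argparse

# action='append' collects every occurrence of -i into one list
parser = argparse.ArgumentParser()
parser.add_argument('-i', '--input', action='append', type=str)

# simulate the command line from the post (placeholder filenames)
args = parser.parse_args(['-i', 'input1.mp4', '-i', 'input2.webm', '-i', 'input3.mp4'])
print(args.input)  # ['input1.mp4', 'input2.webm', 'input3.mp4']
```

Note that if -i is never given, args.input stays None rather than an empty list, so it is worth guarding for that in real scripts.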

Hope this helps.

Learn more

If You’re Not in the Game, You Can’t Hit a Home Run

New Year resolutions often face skeptical responses from friends, with the idea dismissed as ineffective. However, I still believe in their value.

Although many of my resolutions fail, some stick and bring positive changes to my life, such as a daily routine of waking up early, reading a chapter of a book, and doing 30 pushups.

This supports the sentiment expressed in the book “How to Change” by Katy Milkman (with a foreword by Angela Duckworth), which highlights the science of achieving desired outcomes.

“Of course, I understand where they’re coming from. I’ve been frustrated with failed resolutions in the past, too, and I’m committed to teaching more people about the science that can help them succeed.

But this question still drives me a little crazy. As actor David Hasselhoff has said, “If you’re not in the game, you can’t hit a home run.”

In my opinion, New Year’s resolutions are great! So are spring resolutions, birthday resolutions, and Monday resolutions. Any time you make a resolution, you’re putting yourself in the game.

Too often, a sense that change is difficult and daunting prevents us from taking the leap to try. Maybe you like the idea of making a change, but actually doing it seems hard, and so you feel unmotivated to start. Maybe you’ve failed when you attempted to change before and expect to fail again. Often, change takes multiple attempts to stick.

I like to remind cynics that if you flip the discouraging statistics about New Year’s resolutions on their head, you’ll see that 20 percent of the goals set each January succeed. That’s a lot of people who’ve changed their lives for the better simply because they resolved to try in the first place.

Just think of Ray Zahab, transforming himself from an unhappy, out-of-shape smoker to a world-class athlete.

For some people, fresh starts can help prompt small changes. But they can also inspire transformative change by giving you the will to try pursuing a daunting goal.”

– “How to Change” by Katy Milkman

Read More

How to Suppress Terminal Window For Python Scripts

On Windows, Python scripts are executed by python.exe by default. This executable opens a terminal window, which stays open even if the program uses a GUI.

What to do if you do not want this to happen?

Well, use the extension .pyw. This will cause the script to be executed by pythonw.exe by default, which suppresses the terminal window on startup.


Alternatively, you can run your script explicitly with the pythonw.exe command like this:

c:>pythonw.exe c:\scripts\
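If you need to find pythonw.exe from inside a script, it sits in the same folder as python.exe, so its path can be derived from the running interpreter. A small sketch (the path is only meaningful on Windows):

```python
import os
import sys

# pythonw.exe lives next to python.exe in a standard Windows install;
# derive its path from the interpreter currently running this script
pythonw = os.path.join(os.path.dirname(sys.executable), "pythonw.exe")
print(pythonw)
```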

Hope this helps. Most of my automation and daily-backup scripts on my office computer run this way and leave no visible footprint on the taskbar.

Read More


I am an optimist. I used to feel uncomfortable when friends taunted me over this. They gave me looks that said, “naïve fool”. But I have always had a different view of optimism, and I was glad to read someone verbalizing the exact feeling I have about it.

Physicist David Deutsch says “optimism is a way of explaining failure, not prophesying success.” My interpretation of that is: saying you are optimistic does not mean you think everything will be flawless and great. It means you know there are going to be failures and problems and setbacks, but those are what motivate people to find a new solution or remove an error – and that is what you should be optimistic about.

Part of the reason commercial air travel is relatively safe is because after every accident comes an intense learn-and-fix process that reduces the odds of future accidents. The same is generally true for businesses, economies, governments, pandemics, etc. The reason things tend to get better is that there are so many blunders, screw-ups, and disasters to learn from. You can’t separate one from the other. Evolution doesn’t teach by showing you what works, but by destroying what doesn’t.

Read more here

Json tool hidden in plain sight

I have been using the json module for as long as I can remember. It’s part of the standard library and part of my daily work, so it was a pleasant surprise when I learnt about the hidden json.tool.

C:\Users\sukhbinder.singh>python -m json.tool -h
usage: python -m json.tool [-h] [--sort-keys] [--json-lines] [infile] [outfile]

A simple command line interface for json module to validate and pretty-print JSON objects.

positional arguments:
  infile        a JSON file to be validated or pretty-printed
  outfile       write the output of infile to outfile

optional arguments:
  -h, --help    show this help message and exit
  --sort-keys   sort the output of dictionaries alphabetically by key
  --json-lines  parse input using the jsonlines format

Example usage

curl -s | python -m json.tool



[
    {
        "id": 301,
        "question": "India is an ___ land",
        "answer": "ancient",
        "inum": 2,
        "due_date": "2021-04-05T06:23:12+05:30",
        "active": false,
        "chapter": 0,
        "isvoiceonly": false,
        "subject": 4
    },
    {
        "id": 302,
        "question": "As of May 2008 there are ___ officially recognised or scheduled languages in India",
        "answer": "22",
        "inum": 2,
        "due_date": "2021-04-05T06:23:17+05:30",
        "active": false,
        "chapter": 0,
        "isvoiceonly": false,
        "subject": 4
    },
    {
        "id": 303,
        "question": "___ is chosen as the official language of the government of India",
        "answer": "hindi",
        "inum": 3,
        "due_date": "2021-04-24T12:39:27+05:30",
        "active": false,
        "chapter": 0,
        "isvoiceonly": false,
        "subject": 4
    }
]



I love this. I was always using VS Code to pretty-print my JSON files, but with this find, all of that and more can be done with standard Python.
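The same pretty-printing is also available programmatically; a minimal sketch with a made-up record:

```python
import json

# json.dumps gives the same pretty output from Python code
record = {"id": 301, "question": "India is an ___ land", "answer": "ancient"}
print(json.dumps(record, indent=4, sort_keys=True))
```

Round-tripping through json.loads confirms nothing is lost in the formatting.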

It’s not the Cup

Here’s a story that started my year. Hoping I remember this for the entirety of this year and beyond.

A group of highly established alumni gathered to visit their former university professor. The conversation between them soon turned into complaints about their stressful life and work.

The teacher went to his kitchen and returned with a large pot of coffee and a selection of cups – porcelain, plastic, glass, crystal; some ordinary, some expensive, and some exquisite. The teacher asked them to pour themselves coffee.

After all the students had a cup of coffee in hand, the teacher said, “Have you noticed that all the beautiful cups were taken and only the cheapest ones are left? While it is normal for everyone to want the best for themselves, that is the source of problems and stress in their lives. The cup itself adds nothing to the taste of the coffee. In most cases it is just more expensive, and sometimes it even hides what we drink.”

The teacher continued, “What everyone desired was the coffee, not the cup, but everyone consciously picked the nice, expensive cups and then started looking at each other’s cups.”

via this

Git: SSL Certificate problem: unable to get local issuer certificate.

Here’s a problem that I ran into a few months ago on a system while cloning a remote git repo.

 $ git clone
Cloning into 'winsay'...
fatal: unable to access '': SSL certificate problem: unable to get local issuer certificate

I am using Git on Windows, installed via the msysGit package. The test repository has a self-signed certificate at the server. I can access and use the repository over HTTP without problems, but moving to HTTPS gives the error:

SSL Certificate problem: unable to get local issuer certificate.

I have the self-signed certificate installed in the Trusted Root Certification Authorities store of my Windows 7 client machine, and I can browse to the HTTPS repository URL in Internet Explorer with no error messages.


If you want to completely disable SSL verification, open Git Bash and run:

git config --global http.sslVerify false

Note: This solution opens you up to attacks like man-in-the-middle attacks, so turn verification back on as soon as possible:

git config --global http.sslVerify true

Hope this helps others who get this or similar error. Do let me know.

Other related posts

Something’s Gotta Give

Yes, if you take on a lot of projects within the same number of hours, something has got to give. And posting on this blog was that something for me.

From September 2022, the posting receded and then stopped. Things are now back under control, and I hope I will be able to give the blog the same consistent attention again.

Let’s start from here. Thank you for reading.

Happy New Year. I wish you a healthy, wealthy 2023!!

Export PowerPoint Slides with Python

A couple of years ago, I had an issue where I needed to export PowerPoint slides as PNGs. There were a lot of them, so doing it manually was out of the question. Here’s a quick Python script to export PowerPoint slides to PNG.

import win32com.client

class ApplicationEvents(object):
    def OnQuit(self):
        print("PowerPoint closed")

spath = r"C:\Users\sukhbinder\Desktop\cool_presentation.pptx"

app = win32com.client.DispatchWithEvents("Powerpoint.Application", ApplicationEvents)
doc = app.Presentations.Open(spath)  # the presentation was never opened in the original snippet
doc.Export(r"C:\Users\sukhbinder\Downloads", "PNG")
doc.Close()
app.Quit()

Hope this helps someone.

Some related posts

Lion And Cow

Recently I was looking at my book notes and found this from the book Antifragile by Nassim Taleb.

Note a subtlety in the way we are built: the cow and other herbivores are subjected to much less randomness than the lion in their food intake; they eat steadily but need to work extremely hard in order to metabolize all these nutrients, spending several hours a day just eating. Not to count the boredom of standing there eating salads.

The lion, on the other hand, needs to rely on more luck; it succeeds in a small percentage of the kills, less than 20 percent, but when it eats, it gets in a quick and easy way all these nutrients produced thanks to very hard and boring work by the prey.

So take the following principles derived from the random structure of the environment: when we are herbivores, we eat steadily; but when we are predators we eat more randomly. Hence our proteins need to be consumed randomly for statistical reasons.

So if you agree that we need “balanced” nutrition of a certain combination, it is wrong to immediately assume that we need such balance at every meal rather than serially so. Assuming that we need on average certain quantities of the various nutrients that have been identified, say a certain quantity of carbohydrates, proteins, and fats.

There is a big difference between getting them together, at every meal, with the classical steak, salad, followed by fresh fruits, or having them separately, serially. Why? Because deprivation is a stressor—and we know what stressors do when allowed adequate recovery.
Convexity effects at work here again: getting three times the daily dose of protein in one day and nothing the next two is certainly not biologically equivalent to “steady” moderate consumption if our metabolic reactions are nonlinear.

It should have some benefits—at least this is how we are designed to be.

Ponder over this.

Some related posts that you may like

Noise to Signal

My cousin is good at stock picking. He has the right temperament. I saw him pick up a lot of positions in March-June 2020 when the world was closing down due to Covid. He was shrewd and fearless and got some good stocks.

But one thing that I fail to understand is his regular, hourly checking of his portfolio and the stock market.

I pointed this out to him, but he was not bothered in the least. By profession he is a businessman, and this is how he spends most of his free time.

Here’s a passage from the book Antifragile by Nassim Taleb that I keep sending him.

The more frequently you look at data, the more noise you are disproportionally likely to get (rather than the valuable part, called the signal); hence the higher the noise-to-signal ratio.

And there is a confusion which is not psychological at all, but inherent in the data itself. Say you look at the information on a yearly basis, for stock prices, or the fertilizer sales of your father-in-law’s factory, or inflation numbers in Vladivostok.

Assume further that for what you are observing, at a yearly frequency, the ratio of signal to noise is about one to one (half noise, half signal)— this means that about half the changes are real improvements or degradations, the other half come from randomness.

This ratio is what you get from yearly observations. But if you look at the very same data on a daily basis, the composition would change to 95 percent noise, 5 percent signal.

And if you observe data on an hourly basis, as people immersed in the news and market price variations do, the split becomes 99.5 percent noise to 0.5 percent signal.

That is two hundred times more noise than signal—which is why anyone who listens to news (except when very, very significant events take place) is one step below sucker.


Less is more. Less news, less noise.
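Out of curiosity, the arithmetic behind that passage can be sketched with a toy model (my own illustration, not from the book): assume the signal grows linearly with the observation window while the noise grows only with its square root, so shortening the window shrinks the signal’s share of what you see.

```python
import math

# Toy model: signal ~ window, noise ~ sqrt(window), so the signal/noise
# ratio shrinks like sqrt(window) as you observe more frequently.
def signal_share(window_years, annual_ratio=1.0):
    ratio = annual_ratio * math.sqrt(window_years)
    return ratio / (1.0 + ratio)  # fraction of observed change that is signal

print("yearly: {:.0%}".format(signal_share(1.0)))      # half signal, half noise
print("daily:  {:.0%}".format(signal_share(1 / 252)))  # mostly noise
print("hourly: {:.0%}".format(signal_share(1 / 2016)))
```

The exact percentages depend on the assumptions, but the direction matches the book’s point: the finer the sampling, the worse the noise-to-signal ratio.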

Here are a few related posts you may like.

Example of Subparser/Sub-Commands with Argparse

I like argparse. Yes, there are many other utilities that make life easy, but I am still a fan of argparse, mostly because it’s part of the standard Python installation. No other installs needed.

Argparse is powerful too. If you have used git, you have experienced subcommands. Here’s how one can implement the same with argparse.

import argparse

def main():

    parser = argparse.ArgumentParser(description="Jotter")
    subparser = parser.add_subparsers(dest="command")

    log_p = subparser.add_parser("log")
    log_p.add_argument("text", type=str, nargs="*", default=None)

    show_p = subparser.add_parser("show")
    show_p.add_argument("--all", action="store_true")
    show_p.add_argument("--id", type=int, default=0)
    show_p.add_argument("-s", "--skip", type=int, default=0)
    show_p.add_argument("-l", "--limit", type=int, default=100)

    search_p = subparser.add_parser("search")
    search_p.add_argument("search", type=str, default=None)
    search_p.add_argument("-limit", type=int, default=100)

    args = parser.parse_args()
    return args

if __name__ == "__main__":
    main()

In the above code, jotter is our main command, and it has subcommands like jotter log, jotter show, and jotter search.
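A common next step (a sketch of a standard argparse idiom, not the post’s original code) is to attach a handler function to each subparser with set_defaults, so dispatching a subcommand becomes a single call:

```python
import argparse

def do_log(args):
    return "logged: " + " ".join(args.text)

def do_show(args):
    return "showing up to {} entries".format(args.limit)

parser = argparse.ArgumentParser(prog="jotter")
sub = parser.add_subparsers(dest="command", required=True)

log_p = sub.add_parser("log")
log_p.add_argument("text", nargs="*")
log_p.set_defaults(func=do_log)  # each subparser carries its own handler

show_p = sub.add_parser("show")
show_p.add_argument("-l", "--limit", type=int, default=100)
show_p.set_defaults(func=do_show)

# simulate: jotter log hello world
args = parser.parse_args(["log", "hello", "world"])
print(args.func(args))  # dispatches to do_log
```

This keeps main() free of if/elif chains over the command name.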

Have you used this before?

Some related posts

Automating Copying of Files from Raspberry Pi using Python

My Raspberry Pi has just a 32 GB memory card, so another issue I face with my timelapse automation is regularly copying the files from the Raspberry Pi to my laptop.

I have tried various options like git, secure copy (SCP), FTP, SSH, etc. All of them work but have their limitations.

But there is one setup that I have finally stuck with, and it works seamlessly. Again, it’s implemented in Python and uses the wget command-line tool.

Here’s the code that lets me transfer the files from the Raspberry Pi to my laptop. I just run this on a schedule on my Mac every week.

from datetime import datetime, timedelta
import subprocess
import argparse

BASE_URL = r"{}"  # base URL elided

def get_dir(day=1, outfolder=r"/Users/sukhbindersingh/pyimages"):
    if day > 0:
        day = day * -1
    yesterday = + timedelta(days=day)
    datestr = yesterday.strftime("%m_%d_%Y_")
    fname = "v_{}_overval.mp4".format(datestr)
    fname_src = BASE_URL.format(fname)
    cmdline = "wget {}".format(fname_src)
    print("downloading {}".format(fname_src))
    iret =, cwd=outfolder)
    return iret

parser = argparse.ArgumentParser("download_video", description="Download raspberry pi videos")
parser.add_argument("-d", "--days",type=int,  help="No of backdays to download", default=1)
parser.add_argument("-o", "--outdir", type=str, help="Output dir where downloaded file will be kept", default=None)

args = parser.parse_args()

outfolder = args.outdir
if outfolder is None:
    outfolder = r"/Users/sukhbindersingh/pyimages"

for day in range(args.days):
    iret = get_dir(day+1, outfolder)
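The date-stamped filename logic in the script can also be sketched on its own with just the standard library (the name pattern follows the script above; urllib.request.urlretrieve could stand in for wget once the elided base URL is filled in):

```python
from datetime import datetime, timedelta

# Build the date-stamped video name for a day N days back.
# The "v_<MM_DD_YYYY_>_overval.mp4" pattern mirrors the script above.
def video_name(days_back=1, now=None):
    now = now or
    stamp = (now - timedelta(days=days_back)).strftime("%m_%d_%Y_")
    return "v_{}_overval.mp4".format(stamp)

print(video_name(1, datetime(2021, 4, 5)))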

How will you solve this issue? Do you have another way that this can be solved? Do let me know in the comments.

Read related posts

The Power Of Financial Independence

Here’s something that showed up on my Twitter timeline on India’s 75th Independence Day.


Your savings, believe it or not, affect the way you stand, the way you walk, the tone of your voice. In short, your physical well-being and self-confidence.

A person without savings is always running. You must. You must take the first job offered, or nearly so. You sit nervously on life’s chairs because any small emergency throws you into the hands of others. Without savings, a person must be too grateful. Gratitude is a fine thing in its place. But a constant state of gratitude is a horrible place to live in.

A person with savings can walk tall. You may appraise opportunities in a relaxed way, have time for judicious estimates and not be rushed by economic necessity.

A person with savings can afford to resign from a job, if your principles so dictate. And for this reason you’ll never need to do so. A person who can afford to quit is much more useful to the company, and therefore more promotable. You can afford to give your company the benefits of your most candid judgments.

A person always concerned about necessities, such as food and rent, can’t afford to think in long-range career terms. You must dart to the most immediate opportunity for ready cash. Without savings, you will spend a lifetime of darting, dodging.

A person with savings can afford the wonderful privilege of being generous in family or neighborhood emergencies. You can take a level stare into the eyes of any one, a friend, a stranger or an enemy. It shapes your personality and character.

The ability to save has nothing to do with the size of income. Many high-income people, who spend it all, are on a treadmill, darting through life like minnows.

The dean of American bankers, J.P. Morgan, once advised a young broker, “Take waste out of your spending: you’ll drive the haste out of your life.”

Will Rogers put it this way, “I’d rather have the company of a janitor, living on what he earned last year… than an actor spending what he’ll earn next year.”

If you don’t need money for college, a home or retirement then save for self-confidence. The state of your savings does have a lot to do with how tall you walk.

(From an advertisement in the 60s, edited to make it gender neutral)

Principal Component Analysis in pure Numpy

In 2009, I was working with principal component analysis (PCA) in my job. It was my first introduction to the topic, so I played with it at the office and at home in my spare time.

Python was my favourite play tool at that time. I stumbled upon this code, which I wrote in 2013 as part of a personal project.

In case you are wondering what PCA is:

Principal component analysis (PCA) is a standard tool in modern data analysis and is used in many diverse fields from computer graphics, machine learning to neuroscience, because it is a simple, non-parametric method for extracting relevant information from enormous and confusing data sets.

With minimal effort PCA provides a map for how to reduce a complex data set to a lower dimension to reveal the sometimes hidden, simplified structures that often underlie it.

Shame I did not have GitHub then, or it would have been posted there, so here it goes.

# -*- coding: utf-8 -*-
Created on Sun Jan 31 11:03:57 2013

@author: Sukhbinder

import numpy as np

def pca1(x):
    """Determine the principal components of a vector of measurements.

    x should be an M x N numpy array composed of M observations of N variables.

    PCA using the covariance matrix. The output is:
    coeff   - the NxN matrix whose columns transform x into its components
    signals - the projected data
    V       - the variance along each component

    The code for this function is based on "A Tutorial on Principal Component
    Analysis", Shlens, 2005.
    (M, N) = x.shape
    Mean = x.mean(0)
    y = x - Mean
    cov =, y) / (M - 1)
    (V, PC) = np.linalg.eig(cov)
    order = (-V).argsort()  # sort components by decreasing variance
    coeff = PC[:, order]
    signals =, y.T)
    return coeff, signals, V

def pca2(x):
    """Determine the principal components of a vector of measurements.

    x should be an M x N numpy array composed of M observations of N variables.

    PCA using SVD. The output is:
    pc      - the matrix whose columns transform x into its components
    signals - the projected data
    v       - the variance along each component

    The code for this function is based on "A Tutorial on Principal Component
    Analysis", Shlens, 2005.
    (M, N) = x.shape
    Mean = x.mean(0)
    y = x - Mean
    yy = y.T / np.sqrt(M - 1)  # scaled so singular values relate to variances
    pc, s, vh = np.linalg.svd(yy, full_matrices=False)
    v = s ** 2  # variances (v was left undefined in the original)
    signals =, y.T)
    return pc, signals, v

scikit-learn and other libraries do have PCA, so what was the need to write PCA code?

Well, I was trying to understand PCA deeply, and I couldn’t use the sklearn library, so this piece of code was written purely in numpy. It helped me reduce the resolution of my family pictures back in 2013, before Google Photos made this redundant. 🙂

Related Posts

Rakesh Jhunjhunwala story in his own words

The long weekend that just went by started with sad news: Rakesh Jhunjhunwala is no more.

He was an eternal optimist about India, and this is what I liked about him.

Here’s a story of Rakesh Jhunjhunwala in his own words.

A must-watch for everyone, even if you have only a little interest in investing and stocks.

Happy Independence Day

The Internet is amazing. I wrote a Python program that produces the above interactive animation almost 9+ years ago to celebrate Independence Day, and it is still available and working. Amazing. I am truly surprised.

To give it a try, follow this link, click Run, and then drag your mouse within the black screen.

Do give it a try; the GIF is a poor rendition of what the program actually produces.

As always the code is available here too on this blog.