However, the author strictly differentiates DPI from PPI. He briefly says that PPI doesn't matter because the video system in a computer doesn't use that concept, but I would say it does matter: to users, the "video system" they perceive is the video system used in the computer architecture plus the manufactured pixel density of their monitor.
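To make the "manufactured pixel density" point concrete, here is a minimal sketch of how a monitor's PPI follows from its native resolution and diagonal size. The 27-inch 2560x1440 panel below is just an illustrative example, not one mentioned by either blog post:

```python
import math

def monitor_ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch of a panel, from its native resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixel count along the diagonal
    return diagonal_px / diagonal_inches

# A common 27-inch 2560x1440 monitor works out to roughly 109 PPI.
print(round(monitor_ppi(2560, 1440, 27.0)))  # → 109
```

Note that spec sheets and OS settings often label this very number "DPI", which is exactly the terminology overlap this post is about.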
This is another blog post that explains the same thing.
However, I would call this post wrong. Not because he explained things incorrectly, but because he didn't realize that there are people who say "DPI" when they actually mean "PPI".
He didn't differentiate DPI from PPI. If he had written it for people who mean PPI while mumbling "DPI", he should have written it differently. But it looks to me as if he wrote the post for people who mean DPI while mumbling "DPI".
PPI is a rather recent term. Previously, DPI was used interchangeably for monitors and printers. Even back then there was a difference between printing dot density and video pixel density, but we used DPI for both monitors and printers, and that custom lives on. Don't believe me? What does Microsoft call its software solution for modern monitors with higher pixel density? High-DPI-aware.
Apple calls it Resolution Independence, but they too mumble the word High-"DPI" in their documentation.
So these two very technical companies, Apple and Microsoft, say High-"DPI" when they actually mean "PPI". In this industry, where old-timers and newcomers live together, "history" and "habit" are important for understanding what others are really saying.