Thursday, December 12, 2019

Lost it

There are some things - I will refer mainly to technologies - that in my opinion have lost contact with reality and with their audience.

1. C++
It started well as an object-oriented language that was productive enough and prevented some 'shoot yourself in the foot' behaviour. It was familiar to C programmers and expressive enough to construct large-scale software. It fit almost everywhere. Then came the STL - a brilliant idea, but it created chaos and made the language extremely hard to read. Then there was Boost, another layer on top of the STL trying to solve everything. Then the countless C++ revisions - C++11, C++14, C++17... They were meant to simplify the language (auto, lambdas, etc.) but instead they succeeded in making it more unreadable. I had a shock when I tried to read C++ with the new C++14 syntax extensions. It was... incomprehensible. Tooling is also suffering - only Clang is striving to bring C++ some features that are now taken for granted in other languages. Still no standard package manager. No common string type. Bizarre parallelism features.

Right now, although still powerful, it is being eaten by Rust, Swift, Go and D - all lighter and more flexible alternatives to C++.

2. Angular
Again, a good idea that ended badly. AngularJS/1 was not in line with WebComponents/Polymer but tried to impose some programming discipline. Then came Angular/2. TypeScript. Why??? What was the added value? Compatibility was broken. No migration tooling. Still no Polymer conformance. Then Angular/4. A nightmare. There are simpler ways to keep a JS SPA fit and sane, such as Flux/Redux, React or Vue.


Sunday, November 24, 2019

AlgoCheating


Created on 2019-11-25 06:12

Published on 2019-11-25 07:00

I had several interviews at tech giants. Some of them were successful, some of them not. The main characteristic of the interviews was the focus on algorithms - and for good reason: as data grows you need faster algorithms just to partially keep up with it. In that period I studied mainly from the classical books (CLRS, Skiena, ...) and solved problems that I knew had been given in the local IOI/ACM qualification rounds. I was able to handle recursion, graphs, data structures and concurrency thanks to my individual study and my university courses. It was a decade-long process. Getting to an interview was a huge deal then and required quite a lot of work.

In the meanwhile things have changed a bit. Now it is pretty easy to get an interview, but the focus on algorithms has remained more or less unchanged. Today I see lots of sites, books and apps that promise instant learning of algorithms and data structures. They present both some theory and solutions to well-known interview problems. On top of that, they offer mock interviews to candidates who want a job at one of the unicorns. On another scale, there is an entire industry of courses and "academies" that likewise promise fast learning of some CS subjects to people who want a career change.

Typical ad for fast-track CS learning

As an idea these are not bad things, but I am skeptical about their consequences. The first issue I see is that they will create robotic candidates who can only mechanically reproduce the content of the book "Supercharged Algorithms" (fictional title) or whatever they have purchased. These recipe-for-algorithmic-success books are, in my view, just a cheat sheet for passing the interview. The second issue is the mock interview. Practicing it is okay, but the risk for a company that adheres to this type of interviewing is that it ends up hiring mostly good actors, as it becomes easier for candidates to mimic the desired behaviour. The third issue is that learning some technology mechanically is a recipe for disaster in the long term. It's the "programming by coincidence" type of behaviour, where someone who graduated from a fast-track programming course will apply only what he or she has seen in the course. What it lacks is the element of metacognition - of thinking critically about the problem, the context and the solution. This is what a "Supercharged Algorithms" or a "Java Academy" cannot teach in one month.

There are some major consequences to all this. One is that companies get fooled and hire based on memorised solutions and acting. The results will be seen in about a decade. The second is that this "fast track training" industry is undermining academic degrees. Why study CS for 3 to 5 years when you can get similar results from a 3-to-5-month course? Of course there are brilliant people in CS who haven't majored in CS, but most of them are self-taught. They skilled up in years, not in days. Most of these people have solid backgrounds in other sciences, such as Physics or Mathematics, and were able to apply abstract thinking in their new field. I am very much in line with what Steve Klabnik said:

"To make a terrible analogy: nobody expects plumbers to have a physics degree but they do have to know some things about water physics, and that can be learned in a way that doesn’t necessarily involve getting a physics degree. And that is super cool, totally valid, and not a problem. But that doesn’t mean that physics degrees are bullshit or not useful to plumbers"

What can be done to counter this? In my opinion the most important thing is to detect the "fake". Candidates should write code together with the interviewer, should be asked meta questions, and should have their knowledge cross-checked against other fields (when someone says that an algorithm's complexity is O(n), a nice follow-up is to ask how one demonstrates this mathematically - it would prove some basic calculus skills). Breaking a candidate's script and making him or her think outside the box would also be a good way of discovering both potential and metacognition.
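To illustrate the kind of follow-up answer one could hope for - the linear-scan example and the constants below are mine, purely illustrative, not taken from any particular interview:

```latex
Summing an $n$-element array performs one addition per element, so its
running time is $T(n) = c_1 n + c_0$ for some machine-dependent constants
$c_0, c_1 > 0$. By the definition of Big-O, $T(n) \in O(n)$ iff there
exist $C > 0$ and $n_0$ such that $T(n) \le C n$ for all $n \ge n_0$.
Choosing $C = c_1 + c_0$ and $n_0 = 1$ gives
\[
  T(n) = c_1 n + c_0 \le c_1 n + c_0 n = (c_1 + c_0)\,n ,
\]
so $T(n) \in O(n)$.
```

A candidate who can produce this two-line argument, rather than just recite "it's O(n)", demonstrates exactly the metacognition discussed above.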

I will finish on a bitter note though, as I see that the industry is going mostly for cheap, and universities don't fight back in a determined way. It goes much in line with what E.W. Dijkstra said: "So, if I look into my foggy crystal ball at the future of computing science education, I overwhelmingly see the depressing picture of "Business as usual". The universities will continue to lack the courage to teach hard science, they will continue to misguide the students, and each next stage of infantilization of the curriculum will be hailed as educational progress." Worse, the same infantilized curriculum is taken out of the universities, stripped further of the little science it still has on its bones, and then sold as a panacea for successful interviews.

Wednesday, July 3, 2019

The Programmer's Toolchest


Created on 2019-07-02 14:33

Published on 2019-07-03 16:14


One of the most inspiring images I know is the back cover of the "Fine Woodworking" book. It is emblematic of the attention professionals in a domain pay to quality tools that ease their work. In my home DIY attempts I also tend to buy better tools, as I find them easier to use, spare parts are available, and they generally last longer in workshop conditions.

When it comes to software, things get more complicated, as expensive tooling is not necessarily the best. ClearCase and TFS, although expensive, do not do a better job than Git, which is free. Nor does the MSVC compiler do a better job than GCC. So it becomes hard to choose a good tool for one's toolchest.

I generally select software tools after spending some time playing with them, and I always weigh both the qualities and the price of the tool. I have many free tools in my collection, but there are also some that cost money, although free alternatives exist for them. I have worked many years with some free tools before changing to non-free ones or to other free tools. Among the qualities I appreciate in a tool are:

Jacques Carelman - Teapot for Masochists
  1. Ease of use - It makes no sense to have a tool you cannot operate, or one that is not comfortable for longer usage. It would be like continuously using Jacques Carelman's contraptions. So before adopting a new tool I play with it for a couple of weeks to check how it works on my system, and maybe on another operating system. Then I try accomplishing both a simple, routine task with it and a more complicated one. Sometimes the decision is easy: VS vs. Rider; sometimes it is damn hard: Eclipse vs. IntelliJ IDEA, Hg vs. Git. In every situation have a tool that you know well and can handle perfectly. Know its options, config files and keyboard shortcuts.
  2. Community - the more people use a tool, the better it is. Most of the time this is true, although some more obscure tools or libraries may accomplish the job better or cheaper. Sometimes it is wiser to keep an eye on the runner-up and invest in it too, just to protect yourself from the hype. AngularJS, anyone, today? It was great some years ago. Flash? The crowd is generally right, but it can also be manipulated. So again, testing the tool and comparing it with alternatives helps a lot. If one can also read the source code, it gives some important hints about the health of the project.
  3. Coverage - I have my favourite tools for every aspect of software development: design, coding, testing, packaging, operating. For every aspect I like to have a handy tool that eases my work and that, ideally, can be used to automate tasks.
  4. Specialized/Generic - as with power tools, some tools can do several things, just as a cordless drill can both drill and screw. One of these is VS Code - you can do almost anything in it, a quality that only Emacs used to have. But VS Code still lags behind specialised tools, for example PyCharm, when it comes to managing projects, libraries and virtual environments. So it's good to have both in the portfolio and use them according to the actual needs.
  5. Extendable - I have mentioned Emacs before. It was the first truly extensible tool. It could be customised into anything from a Tetris game to a high-end Prolog environment. Nowadays most tools follow the same trend and can be adapted to provide more functionality than initially designed for. I'd rather use VS Code than VS, as the former can be customised in so many ways, while the latter is quite rigid.
  6. Self-contained and small - I do not want to install 32 gigabytes of libraries and frameworks just to run a single application. And I also do not want the system to get bloated and unstable just because I had to install a design tool.

In the end, how much should a developer invest in tools? To make again an analogy with the tools one buys for oneself, I consider it fair to invest the same percentage of money in software tools as in DIY. If one can use the tools for freelancing or project work, then these expenses can be deducted, so in the end it really makes sense to have a rich toolchest. Buying software or making a donation to an open source project really helps the creation of better, nicer tools, as the money covers the countless hours of research we generally take for granted. Think about the quantity of research behind LLVM or Eclipse's EMF...

Thursday, June 20, 2019

Fedora 30

I have tested many distributions over the years and went from RedHat 5 to RHEL 8 and from Debian Sid to Gentoo. I even played with Gobo, Alpine, Deepin, Kali or Elementary. But among all those only two remained close to my heart. Ubuntu and especially Fedora.

Ubuntu is a Swiss army knife with huge repositories and a pretty neat user interface. But in my opinion it's a victim of its own success.

On the other hand Fedora used to be less polished but had decent quality software and was always on the bleeding edge of new software.

Things changed with Fedora 30. Both the server and the workstation editions work smoothly. Workstation is very polished and integrates well into the AD network. For daily development tasks it is the best I have had so far. Ironically, I am developing .NET Core applications on it using VS Code.

In conclusion Fedora has just regained its position as my main development system for a while.

Monday, March 25, 2019

Organisational Annealing


Created on 2019-03-25 19:52

Published on 2019-03-25 20:23

A lot of hard problems can be solved through metaheuristics inspired by the annealing method. The idea is to make a change in a system and then "cool" the system down at a slow rate so that a new stable state is reached, generally one with a lower total entropy/internal energy. This works well for some classes of problems - combinatorial problems benefit a lot from it. But what about organisations and groups?
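For readers who haven't met the metaheuristic the analogy borrows from, here is a minimal sketch of simulated annealing; the function names and the cooling parameters are mine, purely illustrative:

```python
import math
import random

def simulated_annealing(initial, energy, neighbour,
                        t_start=10.0, t_end=1e-3, cooling=0.95):
    """Minimise `energy` by randomly perturbing a state while the
    "temperature" t slowly decreases; worse states are accepted with
    probability exp(-delta/t), which shrinks as the system cools."""
    state, best = initial, initial
    t = t_start
    while t > t_end:
        candidate = neighbour(state)
        delta = energy(candidate) - energy(state)
        # Always accept improvements; accept worse moves only while hot.
        if delta < 0 or random.random() < math.exp(-delta / t):
            state = candidate
            if energy(state) < energy(best):
                best = state
        t *= cooling  # the slow "cooling" schedule
    return best

# Toy usage: find the minimum of f(x) = (x - 3)^2 starting from x = 0.
best_x = simulated_annealing(
    0.0,
    energy=lambda x: (x - 3) ** 2,
    neighbour=lambda x: x + random.uniform(-1.0, 1.0),
)
```

The key design point - accepting occasional bad moves early on so the system can escape local optima, then cooling slowly - is exactly the property the organisational analogy below leans on.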

In the case of an organisation, annealing could be seen as moving people between projects or responsibilities. After things cool down, the organisation is hopefully smoother, as some of the power circuits are broken and the total energy is lower. It might be a good way of disbanding cliques and clientelism, making the whole organisation more manageable. Some parts of an organisation are more brittle than others, as their internal power structures tend to have different patterns.

On the other hand, unlike natural annealing, this might lead to extreme tensions and rifts. People are not always easy to move; the new position might not suit them, either due to the Peter principle or because they interfere with somebody else's path. Moreover, people are often rational agents trying to maximize their own fitness function. Hence the first question that appears is: "What's in it for me?". If the answer is "Nothing" or worse, then something happens. This is the equivalent of a crack, of a structural defect, that might lead to a rift in the metal, especially in the brittle regions. This is especially dangerous in the case of a leader's relocation, as the adjacent members of the organisation will be in a vulnerable state until a new leader emerges. If someone in a leadership position decides to leave due to an internal annealing, chances are that other people from his adjacency will also leave. And in some cases this is bad, extremely bad, as it takes time to fill back the gap.

The worst possible situation is to repeat the process at short intervals without giving the organisation time to cool off and rebuild its internal informal structures - more a tempering than an annealing. The brittleness and resistance to change will increase dramatically, and no change agents will be able to repair the troubled, tensioned structure. Organisational changes and role reassignments should be prepared and announced in a timely manner so that no tempering can occur. People should be informed about the changes, and at least some incentive should be in place so that they accept the change.


Thursday, February 21, 2019

Parturiunt montes, nascetur ridiculus mus

Image by Game McGimsey; Public Domain - http://www.avo.alaska.edu/image_full.php?id=5927


Created on 2018-10-11 18:46

Published on 2019-02-21 09:04

Or, in plain English, "Mountains are in labor, a ridiculous mouse will be born" (Horace).

Many organisations consume tons of resources and logistics just to put in place something so insignificant and useless that it really makes one question their sanity. Generally these kinds of projects start from a good idea, something that requires some kind of enterprise engineering at some level. People identify issues and possible solutions. They look around for examples of similar issues and how others succeeded in overcoming the obstacles. Examples of such good ideas are: common ways of working (especially after mergers), common development infrastructure, re-architecting products and so on.

Usually what happens is that, once management acknowledges the initial issue, a committee is put in place to handle the problem. The committee is formed of generally capable individuals, but each with his or her own baggage of interests and stakes in the project. They start thinking the idea through, usually from conflicting stances, and some progress emerges. Sooner or later, in these groups, some stakes become more important than others, or some change-resisting interests appear. The IKEA syndrome is booming. Some projects have so much technical debt that for them change would be suicidal. Others discover in terror that their functionality is already implemented by others. But the charter is there and managers are committed to implementing it. So the backstabbing and parallel communication begin, people forget the scope and the charter is liberally interpreted. The relations of trust between the committee's members are shaken and nobody works towards the initial goal. So, instead of giving birth to a magnificent mountain, a mouse is born.

What to do in this case?

Generally, business as usual. Travel, meetings, paperwork, slideware, buzzwords. Everything is like a movie set: it looks impressive from a distance. On closer scrutiny... well, there is nothing, or in the best case very little of worth. The little mouse was created. Everyone involved is still in a newly painted silo. Many "change agents" are created so they can tell, as in "The Emperor's New Clothes", that something really great is to be seen - no wonder the organisation struggled so hard to produce it...

How to deal with this? There are several ways.

One view is that this cannot be solved from inside but rather from outside. I would personally bring in some consultants from outside who are unbiased and can judge the decisions with no emotions or attachments. They should really listen and act as mediators. They have to ponder all the aspects and write down pragmatic, sometimes cruel decisions. They should be smart and technology-wise, so they can see migration paths, and human enough to be able to compensate for the emotional burdens.

The second view is that there might be a chance of solving it internally. This really involves a coordination game. There should be no hidden cards, no hidden agenda; in fact it involves a lot of sincerity, openness and altruism. These are hard to find in organizations, as changes really hit the amygdala, or the base of Maslow's pyramid. People will fear change; they'll sabotage it, even unconsciously. To prevent petty results from huge efforts, people should reach agreements based on clear and open communication. They have to accept some suffering and discomfort. There should be some safety net, professional and material, to give them the courage to go on.

Coming back to the output of the huge labour: rodents can grow big (think of a capybara), but this requires no natural enemies, at least not internal ones. Scoping the goals correctly and reducing the internal sabotage is the key to a viable and valuable output. Keeping the internal theatre to a minimum also helps. If there is no internal struggle then the mouse can evolve, can grow - maybe in the end it will really be magnificent. Otherwise it will always be the petty effort of majestic mountains...

Wednesday, February 20, 2019

Wizards vs. Goblins

In a parallel universe the world is ruled by bit magic. The bit-magic flies around everywhere, it is used by peons to supervise their crops, by dwarves to control their mining rigs. There is for sure no place where the bit-magic cannot be found.

However the bit-magic is hard to capture, only some gifted characters can do it. They can catch the bit-magic and weave it in forms that are usable by peons, dwarves or other races.

The wizards live in high ivory towers; they are old and wise. They have endless scrolls written in elvish languages that describe fantastic bit-magic weaving. Their magic is elegant, the shapes are pure, almost mathematically described; everything is formalized in higher-level languages that describe the meta-magic, and the concrete magical creations are generated by casting the streams of meta-magic into bit-magic-generating crystals. The wizards communicate through owls that fly between the towers, carrying asynchronous messages in a temporally and spatially decoupled network. The wizards meet once every thousand years to discuss the completeness of the meta-magic and the crystalline structure optimizations for producing higher-quality bit-magic. Those meetings are often a month long, and the wizards often fight, because the elvish languages they use are not always the same and they interpret words differently or disagree on the hierarchies. All the beautiful tools created by the wizards are updated once a millennium, hence they have the aura of antiquities, and people put high value on them although sometimes they are useless or utterly outdated.

The goblins, on the other hand, have simple ways of weaving the bit-magic. They write them down in the simplest form so that they can all read it. They go to the peasants and, for small amounts of money, cast the bit-magic. Not always the best magic, but always workable. Everyone can use the products of their spells, albeit they are sometimes cumbersome. When they see problems that they cannot solve immediately, they gather in their underworld caves and refine the spell so that they can deliver a better version the next day. Again, the goblins' aesthetics are not comparable with the intricate embroideries that the wizards put into their bit-magic spells, but their work seems to be more reliable. Goblins seem to get along pretty well with each other, as they care about the global well-being of the goblin nation and understand that every bit of magic they sell contributes to their collective well-being.

In this imaginary world, who would be the long-term winner? The almost eternal wizards or the myriad goblin generations? Who would ultimately produce the best bit-magic?

The answer is simple - survival will be dictated by the market. The ones able to adapt to what the market asks will be the preferred suppliers: the ones who make fewer assumptions about their clients and are able to deliver to heterogeneous clients.

Similarly, in today's software markets, in our world, we can see companies that act like respected wizards or like entrepreneurial goblins. Think of any need you have and you will see a creation made by a wizard-like corporation and countless others that serve the same need but evolve faster (e.g. SAP vs. Salesforce, Odoo; WebEx vs. Skype, Viber, Signal; AIX vs. Linux, FreeBSD). The second approach generally produces more generations of software in the same time frame, so they adapt to the market quickly and evolve smoothly. Moreover, this also brings a delivery discipline. Even for desktop applications we can see a continuous delivery stream, for example Google Chrome's updates. This devops-like culture, focused on short, fast release cycles and oriented towards value streams, seems to be going mainstream and thriving on every market. The only pain point is the attention to detail and the craftsmanship the wizards put into their products - that should also be carried over into this novel approach.

Tuesday, February 12, 2019

The Indian Neighbour

I live in a block of flats on a dead end street near some student dorms.

The area is extremely crowded and one can barely find parking space. Therefore we always race to occupy any free parking spot as soon as we see it, otherwise the students will take it.

Basically there are 4 parking spots for every 15 flats in the area. But as cars have multiplied, spots are always scarce.

Although some people have two cars, they somehow manage to juggle them and still be reasonable.

However, we have an Indian neighbour. Initially he had just one small car - perfect for the crowded area. He was sometimes visited by his in-laws for a week, so a second car would be parked near the block. His in-laws live about 300 meters away, so walking would not have been a problem.

Then, a few months ago, he bought a second car. A huge old sedan that can barely be manoeuvred in the tight space we have. The sedan is now occupying a place, as he doesn't drive it for weeks.

We were not happy, but as there were also other neighbours with two cars, we accepted the situation.
But the situation has got worse. The neighbour has just bought his 3rd car - a station wagon. And again he doesn't drive it at all but keeps it parked near the next block - making our neighbours there happy as well.

It is not about the nationality of the neighbour. It is about the lack of coordination in Romanian society and the lack of mechanisms for regulating excesses. The neighbour could have been Japanese, German or Romanian; the problem is the same. Adapt to the resources you have. Do not use shared/public property in an exclusive way.

There is no easy way for us to regulate this, as the town hall is an opaque institution and there is no law that applies in this case. A parking place is not given by God to the righteous; it is a resource that should be prioritised for those who really need a place nearby - people with young children, elderly people, etc. Even being in one of the aforementioned categories does not grant a right to a second (or even a 3rd or 4th car, if we count the in-laws') on all the 4 places our block has.

I do not understand why the town hall hasn't yet responded to our petition to mark the places. I am not expecting them to rent the places out (as is normal in other cities), but at least make the waste visible - this is the first step towards creating a control mechanism.