Latest Posts

Brains to computers, not happening

The idea of somehow uploading the state of the brain into an artificial one is often mentioned not just as science fiction, but also as some sort of transhumanist hope that is being worked on. This may be an interesting exercise to entertain, a research goal to pursue, but it's definitely doomed to fail, because it's a flawed idea on multiple levels.

The brain is a physical and very dynamic object, one built of cells, which are living things. Neurons are continuously destroyed and partly regenerated. Nutrition, physical trauma and many subtler things all affect the continuous changes of the brain. Neurons can establish different connections over time... everything is so dynamic and biological, a system so complex and so dependent on the external environment, that it's practically impossible to somehow recreate the complete system needed to operate, respond and evolve even remotely like the actual thing.

So, a digital brain that, by some incredibly futuristic technology, could initially mirror an original biological brain would progressively diverge from it, partly because of its fundamentally different mechanics, and partly because it couldn't possibly be exposed to the same effects. Unless, that is, the digital brain were so advanced that it could, for example, sample the blood of the host for drugs and alcohol and simulate the effects that those would have on a normal brain... but that would require an understanding and a simulation so complex that, by the time one could achieve it, the human brain would be practically irrelevant.

Of course one could simply decide to switch to a digital brain and go along with its relatively crude simulation, perhaps one unable to process external effects related to what one ingests and breathes in. That would quickly become something very different, where, paradoxically, the potential plasticity of that artificial brain may have to be limited to mimic the real thing using some arbitrary and approximate parameters. That is assuming one could even reach a level of sophistication that today is unthinkable.

In conclusion, the digital brain replacing a biological one is just a flawed idea. If the goal is some sort of immortality, then one option is to tackle it from a biological point of view. An alternative would be to recreate a perfect biological simulation, basically a small virtual universe that mirrors the laws of nature, while also operating under the laws of nature (a hard or impossible task in itself). Otherwise one simply decides to switch to some technology that is incredibly advanced but simplified, tweaked to mimic the real thing, with some sort of containment programming put in to prevent the artificial brain from taking its own wild evolutionary path... which doesn't sound fun at all.

The “rich enough” fallacy

How many times have we heard about the ultra-rich wanting to "give back", saying that they just have too much money and that they feel the need to get rid of it?

That's of course a lie. It's just a way to avoid paying taxes, allocating that wealth instead to charitable organizations that, in exchange, will agree to sustain whatever cause the donor wants to push. It does make sense not to want to give your money to the government, but it is unfair that the ultra-rich are able to do this while the rest of the population can't. Of course everyone can donate, but that doesn't buy them power and influence.

The point though is that there's no such thing as feeling like you have too much wealth and have to give it away. It takes a lot of effort to reach a certain amount of wealth. One needs to own 10 million before owning 100 million, and needs to own 100 and then 500 million before owning 1 billion.

There's plenty of time and plenty of chances to stop becoming that rich in the course of the many years that it takes. It also takes a special drive to want to accrue that much wealth and the power that comes with it.

The general population can't normally wrap their heads around the idea of wanting more wealth and power for its own sake (or more likely, as a competition). When you live dreaming of owning a couple of houses and a couple of sports cars, you may think that anything beyond that is just gravy.

That's not how it works. To become extremely wealthy takes a special kind of drive, the kind of drive that doesn't just dissipate. In fact, with time one may feel inclined to think that he's some kind of god figure, and may start to feel entitled to mold the world in his own image. This works for politicians as well, although those tend to settle for perhaps less actual power, but more visibility.

Anti-socials

Four months in, quitting Twitter and Facebook (I never really browsed Instagram) feels great. As I wrote when I decided to quit, I felt that it was repetitive and that I wasn't going to miss anything. I can confirm that now.

I still have personal accounts, but I use them solely to repost posts from my business accounts, something that happens maybe twice per month. I also have a Discord chat for business, which in itself is a bit of a community. Discord has an IRC vibe to it, although it's of course corporate stuff with no privacy.

I also didn't do much about alternative socials such as Gab and Minds. I've had accounts there for a long time. When I was on Twitter I'd at least repost those tweets, but now that I'm out of the loop, I don't feel like posting anything on any social networks. I think that it's the format itself that is wrong.

What makes more sense is to just hang out in private groups of friends on messaging apps. That's definitely more natural and also a lot safer for people who may be afraid of losing their job for saying certain things on public squares such as Twitter. Thankfully that's not my case, but nevertheless, I recognize the benefit of small chat groups as opposed to openly posting on socials, where one is more likely to be noticed the day he says something that gets him crucified.

Hypnotized emotional beings

As a kid I grew up excited and optimistic about the future. The wonders of modern technology, the great achievements of the human species, such as landing on the Moon and the promises of computers, AI and robotics, all gave a picture of living in an era exponentially more advanced when compared to the past up to that point.

The end of the Cold War was also an incredibly hopeful event, one that showed great promise towards some sort of ideal Star Trek future, where humans have settled all major disputes and now need to go out into space to look for new challenges, also at the societal level.

As far as technology goes, I still dream of what will be possible in the future, but when it comes to people, that's a different matter. No matter how we advance technologically, at the core we're still very emotional and instinctive. There's clearly a place for emotions in our biological makeup, because they tend to favor procreation.
It's hard to argue against natural selection. Nevertheless, I think that it's important to understand how other humans think and behave, because we all depend on each other and we're subject to the choices that others may impose on us, directly or indirectly.

I have been following US politics on and off over the years. I got dragged into it again in 2015 with the presidential campaigns and elections. By the end of 2020 I had come to the conclusion that we live in a very artificial reality. There are people who are incredibly motivated to come to power, there's a power structure that allows them to reach a certain level, and there's infrastructure, such as the media (news and social), that is also vital to supporting those power structures.

In all this, the general population is stuck in the middle. They are pushed and pulled until they pick a team, and they are periodically fed half-truths by their team. In addition, today, suspicion of the government and of institutions in general will quickly get anyone labeled a gullible conspiracy theorist.
Admittedly, many people do go overboard with their skeptical thinking at some point or another, but they can't be blamed too much for losing trust in institutions that are virtually always selective with facts, supposedly for the greater good.

Looking a bit deeper into the past, my conclusion is that humans haven't evolved that much. There's less poverty and maybe more justice now, but emotions still run the show. Rational arguments have a place, but they are not where power lies. Power is acquired by persuading the masses, and that happens by pretty much any means possible.

I used to think that it made sense to call out the hypocrisy of the other side in a political or ideological war, but I realize now that that's an unproductive defensive position. It's naive to think that hypocrisy is going to dissuade someone from getting their chunk of power. In the battle for ideals, those who stop and try to be rational are those who tend to lose.

Perhaps rationality has a good enough effect in small groups, but at the larger level, all that matters is sound bites and general propaganda. Repeat, repeat, repeat, until there's a working reality distortion field of your choosing. Lying is perfectly valid. It doesn't really matter as long as you can sell it.
Of course I wouldn't do that, but I'm also not trying to start a career in politics 8)

Teach a dog to wear its muzzle

Recently I left the house during daytime to go for some errands in a more crowded area of Tokyo. It was impressive to see 99.9% of the people all dutifully wearing a mask, even in the open and in spots that are not so crowded. I personally wear a mask when I enter a closed area, mostly out of respect, because there's a stronger demand to do so there, and because air circulation is obviously limited. In the open, however, I consider it an excess, and I think that people should consider reclaiming the freedom of not being muzzled as much as possible.

I guess that many don't take the mask off because it's easier to leave it on. Some I'm sure actually like the idea of a mask. Maybe because it gives them more privacy, maybe because they truly feel that they are "saving lives".

I'll assume however that most people would rather go back to the default of not wearing a cloth on their face as well. By now we've heard contradictions on just about everything regarding COVID-19. Yet most people would rather live a guided life, one with no questioning of authority, even in the face of obvious contradictions and even at the cost of personal freedom.

Part of me wishes that people would realize that this level of conformity is dangerous to themselves and to society. Lack of individuality can be used to further truly evil plans, but it's also a state in which mediocrity can flourish at the expense of exceptionalism, without which technology and society can't evolve.

The other part of me is "black pilled": if most people are happy with being guided by politicians and high priests of creative science, to the detriment of their freedoms, then perhaps they deserve it. What is my place in all of this, then? Should I scream from the rooftops and try to tell everyone that life is not that good if you're just another brick in the wall, or should I perhaps come to terms with the fact that that's just how society is right now, and if I can't fight the social engineers, then perhaps I should join them?

If people are willing to be scammed, maybe that's what they deserve and maybe I need to take a piece of that. I suppose that by trading the markets I'm already making a buck out of people's stupidity, because it's likely that most of what my algorithms earn comes from the "dumb money", from people who jump into trading thinking that money is there for the taking, while it's only really there because they and their peers are losing it.

However it goes, I'll always value family and friends, and I still consider myself as a good guy, but in some ways life is a zero sum game. Help those that want to be helped, but don't break your back to help those that act like sheep and are begging to be taken to the slaughter.

Seek financial independence

Although I can be rather formal and I'm generally not a rule breaker, I'm also somewhat of an anticonformist with a strong need not to feel like another cog in some machine, be that machine society or place of employment.

As a kid born and raised in Italy, I remember this culture of aiming to someday "find a job". Passion for computers and programming and a willingness to move outside the country was thankfully my ticket away from a possible life that I would have considered a curse for myself.
I then proceeded to work for many years in game development, doing mostly 3D programming, something that always compelled me, because it was both technically challenging and also offered the exciting perspective of tapping into the creation of virtual worlds where one could easily experience new things.

On my first real try for independence I still went for video games, because that's what I knew best and because there were things that I wanted to create and publish, which can be quite rewarding.
However, games aren't really such a great business. Today, game development has largely been commoditized. The first mobile hardware still required a certain degree of technical skill to publish something noteworthy, but that has progressively ceased to be the case.

Although I think that there's still a lot of room for application of technology to game development, that's something that is better done in a larger team, as an employee with maybe a great salary, but still an employee nonetheless, at an age when one is supposed to become a manager and stop worrying about software engineering... no thanks.

From that perspective, the best move that I could have made was probably what I did when I put all my efforts into making something of algorithmic trading. It has taken a lot of time and effort to finally have some degree of confidence in it, but it has given me a direct path to building wealth. Unlike when developing a game, right from the start there's a sense that one could put some algorithm live on some market and start to print money. That's unfortunately not the case, and all things considered it still took a good couple of years of continued work before I could truly hope to see some profits being generated with some degree of confidence.

Nevertheless, working in finance is still a much more direct path to wealth, and one naturally develops the skills necessary to understand investing, something that everyone should know something (or a lot) about.

All this may sound like I'm obsessed with wealth, but it's really more about independence. We live in an unfair world where money is never enough. At some point or another, one needs an excess of wealth to solve some problem. My paranoid side tells me that it's a mistake to live a standard life with a good salary and hope to get comfort, health and occasionally some justice... that is of course if one values his own individuality, a view that nowadays may not be as popular, but to each their own.

Truth and reality

Premise: not breaking any new ground here, but laying it down as a reference.

Something is true if it’s real. Everything that we conceive is the fruit of perception, which is an intake of signals from our sensory abilities, such as vision and hearing. For all intents and purposes, there is no objective reality. Our mapping of reality is limited by our sensory abilities, which are limited by nature. Reality is also how we integrate those signals into our model of the world.

To try and establish a common base reality, one should follow two major rules:

  1. Reality is defined by consistency. This is at the root of the scientific method. Nothing can stand on its own unless it’s consistent with the rest of the established theories built on observations. This does not mean that established theories cannot be changed, but they should only be changed or amended when the new model adds new details that give a better understanding of nature.

  2. The observer should question the environment if there’s a sense of impaired mental capacity of the self. This is to avoid dream-like states of mind. When dreaming, belief is usually momentarily shaken by the fact that one is unable to perform trivial mental exercises, such as actually looking at a screen with code and being able to edit and debug it. This is a sign of the fact that the brain is busy trying to generate its own reality instead of simply processing inputs from the real world. In popular culture this is sometimes defined as pinching oneself to see if there’s a sense of pain. The idea is to perform a sort of brain pinch to see if there’s struggle to achieve a level of mental acuity that is known to be possible.

Of course, these rules are relatively vague in themselves, so for practical purposes they are guidelines, but I believe that they are what one should strive for to establish a reality to work with.

It’s very easy to claim to be consistent. In fact, one should always apply some self-doubt at certain junctures to rethink on whether or not he/she is indeed being consistent and see if perhaps there’s a deeper level at which this may not be the case anymore, such as when added details would negate the consistency of thought.

The take-profit fallacy

In trading there are these stop-loss and take-profit concepts. The former is a necessary pain, the latter is an unnecessary long-term pain, masked as a short-term joy.

A stop-loss is an order to sell if the price falls too far below the purchase price. This is to prevent losing too much in a single trade.

A take-profit is an order to sell if the price has risen to the point of considering the purchase a success and just get out of it.
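As a minimal sketch of the two concepts (the 2% and 3% thresholds below are arbitrary example values, not a recommendation), the orders can be thought of as a bracket around the entry price:

```python
# A minimal sketch of how a stop-loss and a take-profit bracket a position.
# The 2% / 3% thresholds are arbitrary example values.

def check_exit(entry, price, stop_loss=0.02, take_profit=0.03):
    """Return which exit (if any) the current price would trigger."""
    change = (price - entry) / entry
    if change <= -stop_loss:
        return "stop-loss"
    if change >= take_profit:
        return "take-profit"
    return None

print(check_exit(100.0, 97.5))   # stop-loss
print(check_exit(100.0, 103.5))  # take-profit
print(check_exit(100.0, 101.0))  # None (still in the trade)
```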

However, the price moves continuously and there's a high risk of missing further potential profit, simply because one has decided that, for example, 3% is good enough for the day.

In trading more than anything, if you're not earning you're losing: settling today for a limited profit means having a negative balance a few months down the road. All missed profits add up and make a noticeable difference in the long-term balance of a fund.
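To illustrate how missed profits compound, here is a toy comparison with invented numbers: ten trades all capped at +3%, versus the same ten trades where three of them are hypothetically allowed to run to +8%:

```python
# Toy illustration of how capped profits compound versus letting some
# winners run. All percentages are invented for the sake of the example.
capped = 1.0
for gain in [0.03] * 10:               # every trade closed at +3%
    capped *= 1 + gain

uncapped = 1.0
for gain in [0.03] * 7 + [0.08] * 3:   # three trades allowed to run to +8%
    uncapped *= 1 + gain

print(f"capped:   {capped:.4f}")   # ~1.3439
print(f"uncapped: {uncapped:.4f}") # ~1.5493
```

Even a handful of uncapped winners changes the compounded balance noticeably, and the gap keeps widening as more trades are added.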

One intuitive solution is to use a take-profit and then follow it with another buy once the price drops, so that one is still inside the trade, now even with more assets/margin. The problem is that the price may just as well go up instead, and then one will be forced to buy again, but at a higher price, losing precious buying power. Then the price may eventually drop anyway and force the trade into a loss.

Of course there may be cases where it makes sense to sell earlier than the main indicator may suggest... this however should be done under the reasonable expectation that the price will indeed drop. The market's momentum should be taken into consideration, instead of simply settling for a quantity of profit.
One way I do this is at known price levels (calculated from the concentration of past trading volume) that may act as points of resistance. Even then, it's very easy to stumble into false positives. I use this in my algorithms, but very sparingly.
It sometimes works, but only because there's an indication that the price may indeed drop, not simply because there's a sense that x% ought to be good enough for now.

Above are two examples of selling at a significant level while expecting a rejection. This is more sophisticated than simply selling after a certain amount of profit, but it can still fail and have an adverse effect on the total balance. What happens when a level is reached could be a rejection as well as an upwards rally.
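As a rough sketch of how such volume-based levels might be computed (the trades, the bin size and the number of levels here are all invented for illustration), one can bucket traded volume by price and keep the busiest buckets:

```python
from collections import defaultdict

def volume_levels(prices, volumes, bin_size=1.0, top_n=2):
    """Bucket traded volume by price bin; return the busiest bins."""
    buckets = defaultdict(float)
    for price, volume in zip(prices, volumes):
        buckets[round(price / bin_size) * bin_size] += volume
    return sorted(buckets, key=buckets.get, reverse=True)[:top_n]

# Invented sample trades: most volume is concentrated around 100.
prices  = [100.2, 100.4, 101.1, 100.3, 102.8, 100.1, 101.0]
volumes = [500,   300,   200,   400,   100,   350,   250]
print(volume_levels(prices, volumes))  # [100.0, 101.0]
```

A real volume profile would use much finer bins and far more history, but the principle is the same: price levels where a lot of volume has changed hands are candidates for support or resistance.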

Digital Perspective

Quantization and resolution of data are two key concepts in computer science that are at the root of the digital revolution. I think that even a cursory understanding of those concepts can be useful towards building a mental framework in trying to seek objectivity.

Quantization

For my generation, the term digital was popularized by Compact Discs. CDs look metallic and shiny and are read by lasers. They fit the ideal of something new and futuristic very well.

Lasers aren’t however what makes CDs such a digital medium. CDs are digital because they store information that is quantized. The perfection in reproduction of a CD is due to an a-priori determination of what the data in a sound sample should look like.
Quantization is a process necessary to encode data for digital storage. It sets the boundaries for a relation between a physical microscopic deformation of a lump of atoms on the CD and the number that it represents.


Bits on a CD.

Quantization is also wasteful, because precision is only achieved through an abundance of physical space to encode each piece of data. Such space is used to avoid ambiguities that could arise from subsequent deformities of the medium. Note that significant deformation would still lead to errors in the data; this brings the need for an encoding format that can deal with errors, but we won't bother with that here.

Quantization can be seen as a signed contract between the writer and the future readers that spells out the exact data format and the number of bits that were used to digitize the input signal, be that a sound wave, an image or anything else.
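A small sketch of that contract, using the kind of mapping found in CD audio: a continuous amplitude in [-1, 1] is assigned one of 2^16 integer levels. This is a simplified model that ignores details like dithering and error correction:

```python
import math

BITS = 16
LEVELS = 2 ** (BITS - 1) - 1  # 32767: max magnitude of a signed 16-bit sample

def quantize(amplitude):
    """Map an amplitude in [-1.0, 1.0] to a signed 16-bit integer."""
    clamped = max(-1.0, min(1.0, amplitude))
    return round(clamped * LEVELS)

def dequantize(sample):
    """Map the stored integer back to an approximate amplitude."""
    return sample / LEVELS

original = math.sin(0.5)        # a "continuous" input amplitude
stored = quantize(original)     # the number actually written to the medium
print(stored, dequantize(stored))  # close to the original, but not identical
```

The contract is the pair of constants BITS and LEVELS: as long as writer and reader agree on them, any deformation of the medium smaller than half a quantization step reproduces the data exactly.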

This is a fundamental perspective when one is trying to determine what’s truthful and correct in the generic sense. Of course, truth as applied to everyday life is infinitely more complex than the playback of a sound track, but the concept is valid nonetheless. Even if objective truth can’t be achieved, it sets a point of reference that can be kept in mind as one strives for objectivity.

It should be noted that legal text also tends to assume a format that is somewhat objective. This is again done through a certain structure as well as an abundance of information. Legal text is however objective only as it pertains to the structure of the content, not to the actual claims. In fact, in modern law truth is to be found in the middle of two clearly partisan perspectives.

Resolution

Digital storage relies on physical allocation, which is finite, and this brings up the issue of resolution. Each piece of information is stored, retrieved and processed at some level of resolution, or granularity. It's a concept that is easier to imagine with an image, where the number of pixels determines the spatial resolution, and where a higher-resolution image can reveal details that may be invisible at lower resolutions.


Same image at multiple resolutions.
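The pixel analogy can be sketched in a few lines: averaging 2x2 blocks of a toy grayscale "image" (just a nested list here) halves the resolution, and detail visible at the original resolution simply disappears:

```python
def downsample(img):
    """Halve the resolution of a grayscale image by averaging 2x2 blocks."""
    height, width = len(img), len(img[0])
    return [[sum(img[y + dy][x + dx] for dy in (0, 1) for dx in (0, 1)) / 4
             for x in range(0, width, 2)]
            for y in range(0, height, 2)]

# A tiny image: a checkerboard detail (top-left) next to flat patches.
img = [[  0, 255,  0,  0],
       [255,   0,  0,  0],
       [  0,   0, 10, 10],
       [  0,   0, 10, 10]]
print(downsample(img))  # [[127.5, 0.0], [0.0, 10.0]] -- the checkerboard is gone
```

The sharp alternating pattern in the top-left corner averages out to a flat gray: a statement like "this region is full of contrast" is true at the full resolution and false at the lower one.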

The same point of view can be applied to any problem. One may say that his home is being invaded by scary monsters, and he would be right as long as he is looking at the carpet with an electron microscope, spotting countless dust mites.


Scary monsters in your house.

This of course is an extreme case, but it’s illustrative of how statements can be true or false depending on the resolution at which one is operating.

We intuitively know that taking a broader perspective on things, instead of focusing on details, is one way to avoid worrying needlessly. The suggestion to “take a step back” or “look at it from the outside” can be thought of as a suggestion to lower the resolution of a problem, to avoid getting entangled in the noise.

Day traders also know the dangers of zooming into a bar chart to look at just a few hours of 1-minute candles and feeling like the market is constantly on the verge of exploding up or going for a colossal dump. A quick zoom-out instead shows a much more stable and static price chart, one that is more comfortable and also more productive to operate at.

Objectivity is a concept and it’s therefore made up. To carve out a space in which one can argue in an objective manner, it’s important to determine a resolution and to stick to it for as long as possible. A debate can get very confusing if the people involved decide to argue at different resolutions.
This shifting of resolution happens often, sometimes to highlight the importance of a certain level of detail, but many times simply to find a level, a resolution, at which one’s own argument is still valid. Needless to say, that’s not a profitable way to come to an objective conclusion.

My Space !

I was never the biggest fan of social media. I embraced Twitter quickly because of the novelty factor, although it never quite made sense and it still doesn't, probably for most people that have tried it.
I refused to be on Facebook for years, but I eventually caved, in part because I felt that I was missing the fun, in part because I needed it to promote my mobile games.

Social media does have its uses, having some sort of instant connection with many people at any given time can lead to interesting exchanges and to some business even. It's also a quick way to keep an eye on friends and family (Facebook mostly). However, social media can also be a big time waster, it's limiting because it pushes one to express himself with short sentences, and it's also weird in a way: there's this constant state of observing and being observed, or rather, being scrolled-up. It's a very passive media.
Content is also not so great: after a while, people tend to repeat themselves quite a bit, me included. What's interesting about people is how they grow intellectually with time. This is not something that can be noticed on social media because it's all in bits and pieces and arguments.

That said, the decision now to start to focus less on social media comes more from the realization that it has become an oppressive environment. It started a few years ago, but it has escalated really quickly since 2020, mostly because of the US Presidential Elections.
Twitter, Facebook and YouTube in particular have set themselves up as the gatekeepers of truth, mostly about politics, but not exclusively. Some people embrace censorship with open arms, because they figure that it's objective and done by "the good guys".
Of course the premise is laughable. It's a fundamental, and frankly basic error to think in terms of solving the World's problems by establishing a Ministry of Truth of sorts.

It baffles me to see how certain fundamental questions about the importance of freedom of expression have been debated for hundreds of years at least, and yet today the average person is still raised seemingly oblivious to what should be an obvious conclusion, somehow having to rediscover it once again for himself or herself.

Being a fan of freedom of expression and of freedom in general, I'd feel complicit in eroding those rights if I continued to give as much time to social media as I do today.

It should be noted that there are social media alternatives today that pose themselves as free speech alternatives. While I have great hope for those, I also think that it doesn't make sense to put too much content specifically on yet another service, which will either die off or live long enough to become the next villain.
