Category: Computers and stuff

Some of you might have noticed that since November 16 this website has not been available. It turns out that somehow this blog was hacked, and it took me a while of looking around for anomalies and searching through log files to find out.

After a few late-night hours, I finally figured it out. Somehow someone had breached my root directory and added an errant index.php file which blew everything up.

Don't ask me how this is possible, but it happened. I guess I should feel honored that some hacker out there took the time to mess around with my simple and insignificant blog. Better yet I spoofed him back which feels really good.

Take that you awful and scroungy fool of a fool!

amd-radeon-rx-6700-xt.png

Purchased a new video card and installed it in my gaming computer. There is currently a huge shortage of video cards on the market, which means they are overpriced. The base price is quoted at €495, but I was desperate for some reason and bought one anyway via an online scalper for €600.

See: AMD Radeon RX 6700 XT Graphics

So why spend tons of extra money on a fancy new Linux computer when with a little extra research and energy you can build your own machine for half the price?

build-your-own.jpg

For a while I was looking for a new laptop, but considering the computationally intensive configuration I required, and the fact that I also wanted a machine that could handle most modern games, I was looking at a minimum of three thousand euros.

I found a great video link AMD Linux PC Build for computation intensive tasks, and I studied this article How to build a PC: A step-by-step guide which led me to purchase the following components:

  • CPU: AMD Ryzen 9 3900XT Processor (without cooler)
  • GPU: PowerColor Radeon RX580 Red Dragon V2 8GB
  • RAM: Corsair 32GB (2x16GB) DDR4 3200MHz
  • Motherboard: Asus ROG Strix B450-F Gaming II
  • Cooler: Noctua NH-U12A BF19
  • ARCTIC MX-2 (4 Grams) - Thermal Compound Paste
  • SSD: Corsair MP510 NVMe M.2 960GB
  • SSD: Samsung 860 Evo 1000GB 2.5"
  • HDD: Seagate Barracuda 4TB 5400RPM SATA 3.5"
  • Power: Corsair RM750X V2 750W or Corsair RM650X V2 650W
  • Case: Fractal Design Define Mini C Solid Side Panel Black
  • Inateck SSD Mounting Bracket, 2.5 to 3.5 Hard Drive Adapter
  • Inateck SATA Data Cable and SATA Power Splitter Cable

Total cost: € 1625.84

Most of the items I bought through Amazon.nl, but a couple of items that were either cheaper or could be delivered faster I purchased through a local distributor called Max-ICT. Within five days of ordering, I had received everything at home.

I am a little daunted by the number of components, all of the cables, and how exactly to put things together without blowing everything up. That's why I invited Lennart over this evening to help me out and make sure I do not do anything stupid.

Yet another fun father-son activity.

I never thought that it could ever happen to me but it did. Some evil hacker had somehow compromised a public service website of mine, deleted the database making the website unusable and left the following message for me:

"To recover your lost Database and avoid leaking it: Send us X.XX Bitcoin (BTC) to our Bitcoin address XXXXX and contact us by Email with your Server IP or Domain name and a Proof of Payment. If you are unsure if we have your data, contact us and we will send you a proof. Your Database is downloaded and backed up on our servers. Backups that we have right now: DB1, DB2, DB3, DB4 . If we dont receive your payment in the next 10 Days, we will make your database public or use them otherwise."

Fortunately I made daily backups and could recover the deleted database by restoring the most recently saved data. I also battened down the hatches by replacing all related credentials with strong new passwords.

That's why in this day and age it's vital to your survival to make backups and have a dependable emergency recovery protocol in place in case disaster strikes.

My laptop runs on Ubuntu 18.04 and is hooked up to the monitor via the Ultra Dock docking station's HDMI port.

Normally I have been very pleased with this setup, but when watching training videos I often need to pause the video to try things out and then restart it.

The problem with my setup was that there was always a pause of 2-3 seconds before the audio kicked in again. A real pain in the butt.

The fix is very simple in my case, and this is how I achieved it.

screenshot-askubuntu.com-2020.01.png

NG-BE-LoRes-187.jpg

Decided to treat myself to a fun Angular event in December, namely the following training: Angular architectures for enterprise applications. According to the description:

"In this interactive seminar you develop a critical understanding for planning and implementing large enterprise applications with Angular. You explore and work with approaches to structure huge applications like npm packages, Nx Monorepos, and Microfrontends."

The workshop topics I am most interested in learning more about are:

  • Monorepos and Nx
  • Micro Frontends
  • I18N

You're never too old to learn new stuff, and I'm really looking forward to interacting with interesting people and further honing my Angular skills.

The NG-BE 2019 is a 2-day event in Ghent, Belgium, that brings together Angular developers and experts from all over the world to share ideas, news and opinions about Angular.

An object that carries data between processes. Communication between processes is accomplished via remote interfaces where each call is an expensive operation. The cost of each call is proportional to the round-trip time between the client and the server. This means that the number of calls can be greatly reduced by using an object that aggregates the data that is served by one call only rather than transferred over several calls. This is referred to as a data transfer object (DTO).
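As a minimal sketch of the idea (in Python, with hypothetical names like `CustomerDTO` and `fetch_customer` invented for illustration), a DTO simply bundles everything the client needs so that one remote call replaces several:

```python
from dataclasses import dataclass

# Instead of several fine-grained remote calls (one per field), the server
# fills a single DTO and ships it across in one round trip.
@dataclass(frozen=True)
class CustomerDTO:
    name: str
    email: str
    address: str

def fetch_customer(customer_id: int) -> CustomerDTO:
    # Stubbed here; a real implementation would make one remote call
    # that returns all three fields at once.
    return CustomerDTO(name="Alice", email="alice@example.com",
                       address="1 Main St")

dto = fetch_customer(42)
print(dto.name, dto.email)
```

The point is that the cost model (one expensive round trip per call) shapes the object: the DTO aggregates data, nothing more.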

Having had the privilege of being a (small) part of the wonderful world of software development, one of the most important things that I have learned is that in the long run it is NOT technology which is the limiting factor.

The possibilities of technology are pretty much limitless, meaning that in general one can achieve anything that is required, given enough sweat and tears, creative thinking, time and labor. However, it is "human nature" which is the limiting factor, and if left unchecked it is humans and not machines which usually cause a new project or enthusiastic startup to fail.

People relate to one another in very unpredictable ways, not only within an organization or the chaotic dynamics of a team, but also in external relationships: the market, ever-changing government rules and regulations, public fickleness, or an unexpected dip in the economy. The products are usually pretty good, but it is the brand which must be developed and the potential customers who have to be convinced that they really need what we are building.

To make matters even worse, the endless possibilities of technology can confuse us and lead to indecision and uncertainty. Choosing to pivot in one direction and then switching to the other is only too easy. One needs to be flexible and have the ability to switch but not too easily. Hesitation makes us miss the window of opportunity, while forcing premature decisions doesn't usually work either.

Buddha spoke of always taking the middle path, and wavering ever so slightly is not a bad thing either as long as you do not veer off of the road, e.g. at those sharp turns.

I am not pretending to be a wise person, because I do not know the right answers either. Every startup and new project is different, the constraints and people forming a unique mix of variables. Keep focused, prefer the long run to the short, and tackle the future challenges in a positive and confident manner.

If I had to guess the perfect mix, I'd put it at twenty percent technology versus seventy percent human factor. The last ten percent needs to be reserved for good old serendipity.

This is a very interesting video which gives you a taste of the things to come with the Phoenix web framework and the new web.

Web developers have typically been presented with a choice between performance or a productive development environment. With Phoenix, developers can have both while enjoying a wonderful set of abstractions for working with the new web, making streaming data to browsers, native mobile application or embedded clients a breeze. Finally, we will see how Phoenix leverages the Elixir language and the Erlang VM for writing maintainable and scalable code.

For a couple of decades we were able to take a free ride on the technological advances in the speed and performance of ever-improving hardware. First there was the 386, then the Pentium, the Pentium 4, the Dual-Core Itanium 2, and on and on. If your software was a bit slow at first, you just had to wait a few months or maybe even weeks, and the next generation of hardware would be so much faster that you wouldn't have to worry any more about possible hiccups or performance dips.

However, this is changing faster than you might realize, so be careful. While this does not mean that Moore's Law is no longer valid, it does mean that the software we write will need to be concurrent in order to fully exploit the CPU throughput of multi-core and distributed systems.
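As a toy illustration of that shift (a Python sketch with made-up names; the chunking scheme is my own, not from the article), the work has to be split up explicitly before multiple workers can share it:

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(lo: int, hi: int) -> int:
    # One independent piece of work: sum the half-open range [lo, hi).
    return sum(range(lo, hi))

def parallel_sum(n: int, workers: int = 4) -> int:
    # Split 0..n-1 into chunks and hand each chunk to a worker thread.
    step = n // workers
    bounds = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(lambda b: chunk_sum(*b), bounds))

print(parallel_sum(1_000_000))  # → 499999500000, same as sum(range(1_000_000))
```

Note that in CPython the GIL keeps threads from speeding up pure-Python CPU work; for real parallel gains you would swap in `ProcessPoolExecutor`. The structural point stands either way: the sequential loop doesn't parallelize itself, you have to carve it up.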

"If you haven't done so already, now is the time to take a hard look at the design of your application, determine what operations are CPU-sensitive now or are likely to become so soon, and identify how those places could benefit from concurrency. Now is also the time for you and your team to grok concurrent programming's requirements, pitfalls, styles, and idioms."

"A few rare classes of applications are naturally parallelizable, but most aren't. Even when you know exactly where you're CPU-bound, you may well find it difficult to figure out how to parallelize those operations; all the more reason to start thinking about it now. Implicitly parallelizing compilers can help a little, but don't expect much; they can't do nearly as good a job of parallelizing your sequential program as you could do by turning it into an explicitly parallel and threaded version..."

"Thanks to continued cache growth and probably a few more incremental straight-line control flow optimizations, the free lunch will continue a little while longer; but starting today the buffet will only be serving that one entrée and that one dessert. The filet mignon of throughput gains is still on the menu, but now it costs extra--extra development effort, extra code complexity, and extra testing effort. The good news is that for many classes of applications the extra effort will be worthwhile, because concurrency will let them fully exploit the continuing exponential gains in processor throughput..."

Taken from the article The Free Lunch Is Over: A Fundamental Turn Toward Concurrency in Software by Herb Sutter.

As if I didn't already have enough stuff to learn, that's when I hit yet another jackpot and discovered life's new elixir.

"By being immutable, Elixir also helps eliminate common cases where concurrent code has race conditions because two different entities are trying to change a data structure at the same time..."

And it doesn't stop there either. The deeper I delve into that morass the more there is to discover.

Good old Erlang shows up around the corner, and the phoenix arises from the ashes.

There's that cowboy living on the ranch that also tempts me.

You're never too old to learn new stuff.

Have you ever wondered how many methods Object has in Ruby? Well, here it is.

$ irb
>> Object.methods
=> [:allocate, :new, :superclass, :freeze, :===, :==, :<=>, :<, :<=, :>, :>=, :to_s, :inspect, :included_modules, :include?, :name, :ancestors, :instance_methods, :public_instance_methods, :protected_instance_methods, :private_instance_methods, :constants, :const_get, :const_set, :const_defined?, :const_missing, :class_variables, :remove_class_variable, :class_variable_get, :class_variable_set, :class_variable_defined?, :public_constant, :private_constant, :singleton_class?, :include, :prepend, :module_exec, :class_exec, :module_eval, :class_eval, :method_defined?, :public_method_defined?, :private_method_defined?, :protected_method_defined?, :public_class_method, :private_class_method, :autoload, :autoload?, :instance_method, :public_instance_method, :pretty_print_cycle, :pretty_print, :pretty_print_instance_variables, :pretty_print_inspect, :nil?, :=~, :!~, :eql?, :hash, :class, :singleton_class, :clone, :dup, :taint, :tainted?, :untaint, :untrust, :untrusted?, :trust, :frozen?, :methods, :singleton_methods, :protected_methods, :private_methods, :public_methods, :instance_variables, :instance_variable_get, :instance_variable_set, :instance_variable_defined?, :remove_instance_variable, :instance_of?, :kind_of?, :is_a?, :tap, :send, :public_send, :respond_to?, :extend, :display, :method, :public_method, :singleton_method, :define_singleton_method, :object_id, :to_enum, :enum_for, :pretty_inspect, :equal?, :!, :!=, :instance_eval, :instance_exec, :__send__, :__id__]

desktop-ubuntu-14.04-LTS.png

I've been staring at this screen all day, it was supposed to be my day off. Trying to learn new stuff is really addictive sometimes. Especially when it's cold and raining outside, and I cannot play any golf.

After I updated to Ubuntu 14.04 LTS, URL links in other applications stopped working. Well, when I clicked on a link my Google Chrome browser would fire up properly, but it would stay stuck at the homepage instead of being redirected to the link I clicked on.

This irritated me for days and I just could not figure out what was going wrong. I uninstalled and re-installed Google Chrome a number of times, removed the hidden ~/.config/google-chrome directory, on and on.

Just about the moment I was ready to give up completely and go back to Firefox, I had an unexpected insight. The simple solution is just to do the following:

rm ~/.local/share/applications/google-chrome.desktop

Something probably went wrong during the Ubuntu upgrade, leaving this file behind when it should have been deleted.

Suppose you try to upload a file and keep getting server errors thrown in your face. Have a look in the apache error log and see if you can find a line looking something like this:

mod_fcgid: HTTP request length 136872 (so far) exceeds MaxRequestLen 131072

If that is the case, then you are in luck. To fix it, edit the Apache configuration file /etc/httpd/conf.d/fcgid.conf and add the following line at the end of the file:

FcgidMaxRequestLen 2147483648

These are the details for CentOS 6, but for other operating systems it should be very similar.

When I first got my new Thinkpad T431s up and running, I was all excited about having such a blazingly fast machine at my fingertips.

However, when I started using VirtualBox, I was disappointed by how sluggishly it ran when installing virtual machines. Initially I thought it had to do with the SSD, and I tried various system-setting tweaks to improve performance, to no avail.

There are also a number of settings in VirtualBox that you can play around with, like different chipsets (PIIX3 was the older default option whereas ICH9 supports the more modern machines) and enabling I/O APIC (required for 64-bit guest operating systems, and if you want to use more than one virtual CPU in a virtual machine), but none of that helped either.

This morning I had a flash of insight: perhaps virtualization isn't enabled in the BIOS. Let's go and have a look. On boot I hit the Enter key, paged my way to the security section, and lo and behold, the VT-d feature was turned off. I quickly enabled it, saved my changes and rebooted.

Installed CentOS 6.5 within ten minutes, and now it's running like a charm.

virtualbox-centos65.png

ubuntu-on-thinkpad-t431s.png

Here's proof that I got Ubuntu 13.10 installed and running on my new Thinkpad T431s.

I decided to give the Computer Networks | Coursera online course another go. This is an excellent course anyone can take for free. Last year I tried and made it about halfway through, but due to time constraints was unable to complete it. I already know a lot about network technology and the Internet, but a refresher course like this will give me a broader overview and provide insights into the latest changes.

computer-networks.png

After nearly six years, I've finally decided to renew my life by purchasing a new laptop: a portable 14-inch with 8GB RAM, a 256GB SATA3 SSD and an i7 CPU.

After I ordered it, I happened to come across a number of negative reviews, mostly complaining about the anti-glare screen and its so-called lousy resolution (1600x900), the terrible five-finger touchpad, and so on. The problem nowadays is that everything is being compared to the state-of-the-art ultrabooks, which are way too expensive for me. Many people say that it doesn't look sexy and that it still has an old-fashioned VGA connector (shame on them).

The smaller screens may have higher resolution and better contrast for browsing the web, but I need a slightly larger screen so that I can read comfortably if, for instance, I want to sit downstairs. When in my study room I can hook it up to my fancy 23-inch high-resolution screen anyway, so such a shortcoming doesn't matter. Besides, I prefer a solid and dependable machine that will last. For me it's more than sexy enough.

lenovo-thinkpad-t431s.jpg
Lenovo ThinkPad T431s

Ordered my Nwazet Pi Camera Box from the ModMyPi shopping site which arrived a couple of days ago.

The assembly was a bit complicated but thanks to the online instructions I was able to snap all of the pieces together. It comes with a fancy fish-eye lens. Couldn't get the wifi to work yet though. Here's what it looks like so far.

raspi-camera.png

I also managed to take my first snapshot. Slightly blurry and distorted but a historical photograph nonetheless.

first-raspberrypi-image.png

A couple days later. Figured out how to get the wifi working by installing the wicd-curses utility.

$ sudo apt-get install wicd-curses
$ sudo wicd-curses

Just follow the instructions, choose the correct SSID and then connect to it.

A most disturbing Vim anti-pattern is using the good old arrow keys to navigate around the page. I've been told that this is a terrible habit which makes me inefficient. I do not like wasting time and being inefficient, so I read up a bit on the subject and decided to add the following lines to my .vimrc configuration file. Say goodbye to arrow keys, my friend.

noremap <Up> <NOP>
noremap <Down> <NOP>
noremap <Left> <NOP>
noremap <Right> <NOP>

At first I felt very crippled, but after a while, believe it or not, I started to get used to it. Bad habits are hard to break, but if you hang in there you can do it. j,j,l,k,k,k,h,k,j ...

Even using the default Vim navigation keys h,j,k,l gets boring after a while. That is the slowpoke way to move like a snail across the editing surface. There are even speedier and thus better ways to accomplish such a feat. If you don't know what they are, look it up yourself (:help word).

So in order to end the pain of all pains I extended vim nops to include the following lines as well:

noremap h <NOP>
noremap j <NOP>
noremap k <NOP>
noremap l <NOP>

Slowly but surely I'm on my way to becoming a true blue vim guru, and it feels pretty good. You're never too old to learn new tricks, and if by doing so you become more productive so much the better I'd say.

Alright then, let me help you along by giving you a slight hint. The w, b, e, ge commands allow us to move forward or backward to the start or end of a word. The W, B, E, gE commands are also something you should check into. Much quicker jumping around that's for sure.

You might also want to have a look at Habit breaking, habit making and then install the Hard Mode Vim Plugin like I did.

Now that I have a bit more time on my hands, I decided to get crazy about the ruby programming language. I've always wanted to learn more about this intriguing language as well as the ruby on rails web framework.

crazy-about-ruby.png

As if all that reading material is not enough, I've also been following a couple of online courses, namely:

I also purchased RubyMine, an advanced IDE that should make me even more productive. It is a bit sluggish, but should be alright once I buy a new laptop.

I'm never too old for learning new and interesting stuff.

Learning the ML programming language is a lot of fun. This is my first in-depth initiation into the exciting world of functional programming. Here's something to whet your appetite, an elegant function for appending two lists of any type:

fun append e =
    case e of
        ([],ys) => ys
      | (x::xs,ys) => x :: append(xs,ys)

Here we are passing the function append an expression e, which is a pair of lists. If the pattern matches an empty list paired with a list ys, we simply return ys; otherwise we prepend the first element x to the result of appending the tail xs to ys.

I am following the online Coursera training by the University of Washington called Programming Languages given by Dan Grossman.

Great stuff to keep my aging brain cells oiled and running efficiently.

I thought it'd be fun to strip down and reconfigure my Raspberry Pi in order to turn it into a mighty mini-webserver.

My starting point is the default Raspbian Wheezy download and the setup as explained on the website page Installing Operating System Images on Linux. Run the raspi-config command and do the following:

[Note: you can skip this section with the newer operating system images, where by default ssh is enabled for pi/raspberrypi and boot to desktop is disabled]

  • Enable SSH
  • Disable boot to desktop
  • Use all of the SD Card (in my case 32GB)
  • Rename the hostname
  • Reduce the GPU memory to 16MB

Once you have followed the setup instructions, run ssh pi@raspberrypi to get there, where raspberrypi should be replaced by the IP address, for example 192.168.2.101 in my case. You can then run the command dpkg -l to see what is installed, followed by apt-get purge to strip out the extra stuff you no longer need.

Since this is a streamlined webserver, there is no need for any of the GUI desktop stuff. I also want to remove Python, sound-related (alsa) stuff, samba and other junk. Therefore, run sudo su - and fire off the following commands:

$ apt-get update
$ apt-get purge xserver* ^x11 samba* ^libx ^lx samba* libsmbclient python* desktop-file-utils nano tsconf xkb-data console-setup penguinpuzzle omxplayer gtk* libgtk* alsa* -y
$ apt-get autoremove -y
$ apt-get upgrade -y
$ apt-get clean

I chose the lightweight nginx as my preferred webserver:

$ apt-get install nginx

Since I will no longer be needing the extra memory for the GUI, I can free up the memory by editing the file /boot/config.txt and ensuring that the following line is present:

gpu_mem=16

No need to keep the pi user around anymore, since I have created a new user to do all of the heavy lifting. You'll probably want to use a different name suiting your needs, but I'll use this one as an example. From now on, replace the word kiffin with your own.

$ sudo adduser kiffin

Now the tricky part. As the user pi edit the sudoers file by running the command sudo visudo. At the bottom of the file you'll see something like this:

#includedir /etc/sudoers.d
pi ALL=(ALL) NOPASSWD: ALL

Replace the user pi with kiffin and save the file. Enter exit twice to return to your terminal, and then access the server again using the command ssh kiffin@raspberrypi.

Now you can run sudo su - and get rid of the pi user like this:

$ deluser pi
$ rm -rf /home/pi

Since I'm also an avid user of GNU Screen, I installed it as well:

$ sudo apt-get install screen

Now it's time to create a simple webpage by going to the document index directory and creating an index.html file to suit your needs.

$ cd /usr/share/nginx/www
$ cat index.html
<html>
<head>
<title>KiffinWeb</title>
</head>
<body bgcolor="white" text="black">
<center>
<h1>Welcome to KiffinWeb!</h1>
<img src="raspberrypi.png"/>
<p>
This is an <a href="http://nginx.org/en/">nginx</a> web server running on a <a href="http://www.raspberrypi.org/">Raspberry Pi</a> mini-computer.
</p>
<p>
<a href="http://www.kiffingish.com/2013/09/raspberry-pi-webserver.html">Make one yourself</a>
</p>
<p>
Brought to you by <a href="http://www.kiffingish.com/">Kiffin Gish</a>.
</p>
</center>
</body>
</html>

Once you've got everything set up to your heart's delight, it's probably a good idea to make a backup of this image, just in case a power failure causes the SD Card to become corrupted, for example.

Shut down the Raspberry Pi (sudo shutdown now), wait one minute, take out the SD Card and put it in your laptop. I then run the following command to copy the image to a local backup file:

$ sudo dd bs=4M count=800 if=/dev/mmcblk0 of=/home/kiffin/raspberrypi-kiffinweb-20140925.img

Take the SD Card out of your laptop, insert it back in the device and connect it to the power supply. Then run sudo raspi-config to make some last changes in the configuration:

  • Use all of the SD Card (in my case 32GB)
  • Rename the hostname
  • Reduce the GPU memory to 16MB (already done above)

Once everything has been finalized, I reboot with shutdown -r now and ssh kiffin@raspberrypi to my webserver again. Here's how much space I now have available:

kiffin@raspberrypi:~# df -H
Filesystem      Size  Used Avail Use% Mounted on
rootfs           32G  751M   30G   3% /
/dev/root        32G  751M   30G   3% /
devtmpfs        247M     0  247M   0% /dev
tmpfs            51M  234k   51M   1% /run
tmpfs           5.3M     0  5.3M   0% /run/lock
tmpfs           102M     0  102M   0% /run/shm
/dev/mmcblk0p1   59M   19M   40M  33% /boot

Of course, in order to make the web server accessible from the outside world, you will have to configure NAT on the router, forwarding HTTP (port 80) and SSH (port 22) requests to the IP address of the Raspberry Pi server.

Here's proof that it really works:


www.kiffinweb.com

For convenience, everything is setup downstairs in the electricity cupboard.

raspberrypi-meterkast.png raspberrypi-meterkast-closeup.png

The Raspberry Pi is connected to the KPN Experia Modem with the blue Ethernet cable and the power supply is the black cord going up over the fuse box to the socket.

Raspberry-pi-1.png Raspberry-pi-2.png

I was so very pleased to receive my new techie toy in the mail today. Brings out that little boy feeling in me, yet another fun present. The Raspberry Pi is a credit-card sized computer which costs less than thirty euros, and it's a lot of fun to play around with.

These are the only components required to make a good start:

  • RASPBERRY PI, MODEL B, 512MB
  • SDCARD 8GB
  • POWER SUPPLY, RASPBERRY 5V, 1A
  • HDMI TO DVI CABLE 2M
  • WIRELESS KEYBOARD & MOUSE

Here are some references:

There are a number of rituals in life that can be conducted in order to cut the symbolic ties with the past. These rituals are always painful, but at the same time they are very necessary in order to move on in life. By nature, I am one who postpones the inevitable as long as possible, especially when it comes to distancing myself from my past. I remain very attached to the way things were, can get pretty sentimental about the most trivial memories, collect useless memorabilia and hate to throw anything away for fear of who knows what. Discarding items is so definite, and once they are thrown away, there's no way ever to get them back again.

Take for instance my computers and stuff. Believe it or not, I have saved every single computer, laptop, mouse, keyboard, router, hub, floppy disc, monitor, printer, scanner, cable, video card, hard disk, on and on. I still have my very first computer which is more than twenty years old. A state-of-the-art 386 PC and I was the first one in my neighborhood to have one. I also kept my very first laptop. A heavy bulk of a Toshiba shaped like an over-sized brick. All that hardware and cables have been collecting dust in dark corners, turning yellow and rotting away for ages, waiting to be let go. If only I would ever give them the chance.

So this morning I went through all my drawers, closets, boxes, and grabbed every piece of hardware and cable I could find. I carried those poor souls downstairs, filled the trunk of my car, the back and front seats. I went to the dump on the other side of town, and with tears in my eyes I tossed all those fine memories into the dumpster, watching them crash and splinter. There I was thanklessly discarding those wonderful pieces of technology which have meant so much to me. Thanks very much for being part of my life, good bye and see you later.

The big cleanup action took a little less than two hours. I've freed up so much extra room I do not know what to do with it all. Empty spaces waiting to be filled up again with the newfangled objects of my future. Like some kind of catharsis, I feel like I've been relieved of a tremendous burden weighing me down. It's time to move on in life, stop getting dragged down by the past, face forward and reach out. Here we go again.

[This is my 2000th blog entry]

Here's an interesting quote I came across this evening while reading the introduction of the online course Building a Modern Computer From First Principles:

It turns out that this strategy works well thanks to a special gift unique to humans: our ability to create and use abstractions. The notion of abstraction, central to many arts and sciences, is normally taken to be a mental expression that seeks to separate in thought, and capture in some concise manner, the essence of some entity. In computer science, we take the notion of abstraction very concretely, defining it to be a statement of "what the entity does" and ignoring the details of "how it does it." This functional description must capture all that needs to be known in order to use the entity's services, and nothing more. All the work, cleverness, information, and drama that went into the entity's implementation are concealed from the client who is supposed to use it, since they are simply irrelevant. The articulation, use, and implementation of such abstractions are the bread and butter of our professional practice: Every hardware and software developer is routinely defining abstractions (also called "interfaces") and then implementing them, or asking other people to implement them. The abstractions are often built layer upon layer, resulting in higher and higher levels of capabilities.

The site contains all the software tools and project materials necessary to build a general-purpose computer system from the ground up, so check it out if you dare to take up this amazing challenge.

  • Eliminate waste: Spend time only on what adds real customer value.
  • Amplify learning: When you have tough problems, increase feedback.
  • Decide as late as possible: Keep your options open as long as practical, but no longer.
  • Deliver as fast as possible: Deliver value to customers as soon as they ask for it.
  • Empower the team: Let people who add value use their full potential.
  • Build integrity in: Don't try to tack on integrity after the fact, build it in.
  • See the whole: Beware of the temptation to optimize parts at the expense of the whole.

I am convinced that the new programming language called Clojure has a lot of potential and, if successful, will fundamentally change the way we think about developing complex applications.

Recently I purchased two books about this amazing programming language, Clojure in Action and The Joy of Clojure, and although I've read about a fourth of each book, I have not had enough time to study it as deeply as I would like to.

Here's a very simple example of how elegantly an otherwise difficult to program algorithm can be expressed in a single code statement:

(reduce + (range 1 1001))

Basically, this one-liner takes the range of numbers from one through one thousand (the upper bound of range is exclusive) and adds them all together, giving 500500. Show me another programming language that can express this more simply.
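For comparison, here is the same fold in Python (note that Clojure's `(range 1 1001)` and Python's `range(1, 1001)` are both half-open, so each covers 1 through 1000):

```python
# Fold (+) over the numbers 1..1000, just like (reduce + (range 1 1001)).
total = sum(range(1, 1001))
print(total)  # → 500500
```

Python gets close in brevity here, though the Clojure version makes the reduce explicit rather than hiding it behind a built-in.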

Very interesting is the fact that this language is based on Lisp, one of the earliest (functional) programming languages, dating back many decades. The pendulum swings back and forth, and now it is time to return to our roots. We will have to turn our linear programming mindset inside out in order to move forward.

So with that in mind, it's now time for me to go out for my daily run in the freezing cold and warm up my body and mind by philosophizing about programming computers and the true significance of simulating/stimulating human thought processes.

While reading an interesting book about producing reliable software releases called Continuous Delivery, I came across the following excellent idea:

"If it hurts, do it more frequently, and bring the pain forward."

If certain tasks of releasing software make it a painful process, for example last minute tests which seem to break the product right before launch, then the idea is to figure out a way to automate all tests and 'release' the latest version after every single change.

How often has this inefficient so-called fact of life just been accepted as part of the deal, when in fact, with a little logical thinking, it does not have to be so? The extra time and energy spent on improving this might result in a temporary increase in pain, but in the end the pain will simply go away.

Deal with the hurt by rubbing some sand in the wound so that after a while it will not hurt any more.

The challenging part is convincing the rest of the organization that this is so.


Information

This personal weblog was started way back on July 21, 2001.

So far this blog contains no less than 2432 entries and as many as 1877 comments.
