
Posts

Catherine Ann (Shinn) Liptak 12/29/1930-8/7/2021

Catherine Ann (Shinn) Liptak, age 90, passed away with family at her side at the Sunnyside Nursing Home in Cloquet, MN on August 7, 2021. She was born December 29th, 1930, in Hart, MI to Eleanor (Osborn) and Hyman Hubert Shinn. She graduated as Valedictorian from Hart High School in 1948 and with Honors from the University of Michigan in 1952, with a BA in English and Speech, followed by a Master's in Library Science. Catherine taught at a number of schools in Michigan and Minnesota until meeting her husband John in 1963, settling in a lake home that John built on Bass Lake after their marriage in Hart, MI on July 25, 1964. She worked at the Virginia Public Library until her retirement in 1993. One of the constants in Catherine's life was her love for her family. She was always quick to share stories about her parents and sisters, growing up on Griswold Street, working at the family drive-in restaurant, and having fun with her cousins. C...

FizzBuzz - part 4

This is no longer really about FizzBuzz

At this point, we are not talking about a simple number generator, and if we ask people to pretend it's a "typical" enterprise web application, we can continue to talk with an applicant about what they know.

Nothing runs in a vacuum

Even with properly running FizzBuzz code, you still need to deploy it. A simple deployment would look something like this: it's not a good idea to directly expose your application server, so a more limited device like a firewall, a load balancer, and a reverse proxy server, in some combination, generally sits in front of an application. Almost all enterprise applications will also need to store the results of their processing somewhere - otherwise, why did we build this? A cloud and an on-premises deployment have the same functionality. At the heart of it, they differ in who runs the component (vendor, customer, or shared) and what it's called.

Run in the face of adversity

An applicant should also be able...
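The excerpt above cuts off right at the resilience discussion, but one classic building block an applicant might reach for is retrying a failed remote call with exponential backoff. A minimal sketch in Java, under my own assumptions (the helper name and parameters are illustrative, not from the post):

    import java.util.concurrent.Callable;

    // Hypothetical retry helper: re-runs an operation with exponential backoff.
    public class Retry {
        public static <T> T withBackoff(Callable<T> op, int maxAttempts, long initialDelayMs)
                throws Exception {
            long delay = initialDelayMs;
            for (int attempt = 1; ; attempt++) {
                try {
                    return op.call();    // success: hand back the result
                } catch (Exception e) {
                    if (attempt >= maxAttempts) {
                        throw e;         // out of attempts: propagate the failure
                    }
                    Thread.sleep(delay); // wait before the next try
                    delay *= 2;          // back off exponentially
                }
            }
        }
    }

A caller would wrap any flaky remote operation, e.g. Retry.withBackoff(() -> fetchPage(url), 5, 100), where fetchPage is whatever call you need to protect.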

FizzBuzz - part 3

Team Lead

In part 2, I looked at the additional items you would look for in a FizzBuzz discussion for a lead developer. Without getting too much into what job title means what, the next level up is a team lead.

The road not yet discussed

My discussion here isn't meant to be exhaustive; there are additional areas an interview can and should go. However, this one is really the last one that I think could possibly have source code up on a whiteboard. After this, I really think you should start using boxes and arrows.

RTFM - Read the F*#cking Manual - won't cut it

When I started my career, you really could get by with your K&R C manual and the various man pages. That really ended in my SunOS world (we hadn't converted to Solaris yet) when we started developing in X-Windows R4/Motif. You really could not do your work without an O'Reilly set of books or the equivalent. Today, being able to not only code using a library or framework, but also dir...

FizzBuzz - part 2

Lead Developer FizzBuzz

In part 1 we went over a simple implementation of FizzBuzz. Now we are going to advance to what I expect a lead developer would create. Job titles are tricky things. I am not equating this with a job title; what I mean by a lead developer is someone who could help a more junior developer develop a more complete solution.

Unit Tests

The first change is the addition of unit tests. Developers will develop structurally different code if they are required to develop unit tests. In order to test code correctly, you have to be able to expose the different facets of your creation. This allows unit tests to be short and robust.

    package dev.boundary.waters.FizzBuzz;

    import static org.junit.jupiter.api.Assertions.*;

    import org.junit.jupiter.api.Test;

    class FizzBuzzTest {

        @Test
        void test1() {
            FizzBuzz fb = new FizzBuzz();
            assertEquals("1", fb.process(1), "failed for 1");
        }

        @Test
        void test3() {
            FizzBuzz fb = new FizzBuzz();
            ...
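The class under test is truncated above; for reference, a minimal sketch of the shape these tests imply (a process method that takes an int and returns a String) could be:

    package dev.boundary.waters.FizzBuzz;

    public class FizzBuzz {
        // Returning the result as a String, one value at a time, is what
        // makes the behavior easy to assert on in the unit tests above.
        public String process(int n) {
            if (n % 15 == 0) return "FizzBuzz"; // divisible by both 3 and 5
            if (n % 3 == 0)  return "Fizz";
            if (n % 5 == 0)  return "Buzz";
            return Integer.toString(n);
        }
    }

This is my sketch of a plausible implementation, not the author's actual code.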

FizzBuzz - part 1

What is FizzBuzz

FizzBuzz is a simple programming exercise that is frequently used as a whiteboard problem during interviews. I, personally, have never used this problem or had anyone ask it of me during an interview, so your mileage may vary. I was inspired to write this sequence after getting Poly Bridge 2 as a Christmas gift. As I was watching some of the truly intricate bridge designs on YouTube, I also ran across a FizzBuzz video (no, I have no idea why Google put them together as a recommendation). What, I thought to myself, would happen if I took the simple FizzBuzz to the same levels as Tyler and Arglin Kampling. I plan to discuss a progression of FizzBuzz from what I would expect from an entry level developer to an enterprise architect. Since I'm playing both sides of this, I don't expect anyone else will make the same choices as I do - what's important is the journey.

The initial solution:

    package dev.boundary.waters.FizzBuzz;

    import...
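The initial solution is truncated above. A typical entry-level version, sketched here under the same package name (the 1..100 range is the classic form of the exercise, assumed rather than quoted from the post), just loops and prints:

    package dev.boundary.waters.FizzBuzz;

    public class FizzBuzz {
        public static void main(String[] args) {
            for (int i = 1; i <= 100; i++) {
                if (i % 15 == 0) {
                    System.out.println("FizzBuzz"); // divisible by 3 and 5
                } else if (i % 3 == 0) {
                    System.out.println("Fizz");
                } else if (i % 5 == 0) {
                    System.out.println("Buzz");
                } else {
                    System.out.println(i);
                }
            }
        }
    }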

Are you technically ready for Cloud Native?

I'm going to focus on technical requirements in this blog. That said, I do feel organizational change is also a prerequisite to making such a transition effectively. Cloud Native is About Culture, Not Containers is a good starting place for looking at a transition from that point of view. A successful transition to a new deployment technology requires that everyone involved is ready for the changes necessary:

- The development team
- The quality assurance team
- The deployment team
- The platform team
- The operations team
- The facilities team
- The management team

I'm not going to give advice on your organization's preparation; you know them better than I do, and any advice would be speculative at best. But there are a few structural boundaries that can be analyzed in order to determine whether you have done the technical work. I will be using Kubernetes as an example deployment platform becau...

CodeReady Container address range

I've been working with Red Hat's CodeReady Containers and I recently had a networking issue that might not be obvious to all users. Hyper-V assigns your VM an address in the 172.x.x.x range. CRC also assigns addresses, by default, in the 172.30.0.0/16 address space. This all works well unless Windows assigns your VM an address in 172.30.x.x as well. Then you get a bunch of networking issues where you can't connect outside the cluster. This means you can't download images, etc. The solution: reboot your PC and restart CRC until you get an address outside the 172.30.x.x range.
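If you want to confirm the collision before blaming the cluster, you can list your local addresses and flag anything in 172.30.0.0/16. A quick sketch of my own (not part of CRC's tooling):

    import java.net.InetAddress;
    import java.net.NetworkInterface;
    import java.util.Collections;

    // Prints any local interface address that collides with CRC's
    // default 172.30.0.0/16 range.
    public class CrcSubnetCheck {
        public static void main(String[] args) throws Exception {
            for (NetworkInterface nic : Collections.list(NetworkInterface.getNetworkInterfaces())) {
                for (InetAddress addr : Collections.list(nic.getInetAddresses())) {
                    if (addr.getHostAddress().startsWith("172.30.")) {
                        System.out.println("Conflict: " + nic.getDisplayName()
                                + " has " + addr.getHostAddress());
                    }
                }
            }
        }
    }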

CodeReady Containers inside vs. outside

CodeReady Containers

Red Hat has produced a single-node Kubernetes install that can run on a single developer's machine (also known by the acronym CRC). This allows you to spin up a cluster, administrate it, and install your own software to it in an environment you completely control. To kick the tires, I wanted to do the following:

- Deploy a dead-simple application with one REST endpoint (see the sketch below)
- Be able to access it from outside the cluster (i.e. figure out ingress using Istio, not just the OpenShift automatic route)
- Use an external build process using Maven and Google Jib (I like the buildpack-like way that OpenShift provides, but I wanted to start without depending on all of that magic)

Google Jib

In order to get Jib to work, I needed two things: I had to go searching for how the registry that is bundled into CRC is exposed. You can find the route in th...
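The post doesn't show the dead-simple application itself; a self-contained sketch using only the JDK's built-in HTTP server (the /hello path and the JSON payload are made up for illustration) might look like:

    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;

    // One REST endpoint, no framework: GET /hello returns a JSON greeting.
    public class HelloService {
        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/hello", exchange -> {
                byte[] body = "{\"message\":\"hello\"}".getBytes();
                exchange.getResponseHeaders().add("Content-Type", "application/json");
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream os = exchange.getResponseBody()) {
                    os.write(body);
                }
            });
            server.start(); // serves until the process is killed
        }
    }

Packaged with Jib into an image, something this small makes a good probe for ingress, since any failure is almost certainly in the cluster configuration rather than the application.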

All things come to an end - plan for it

In 2011 I built a PC with an i7 2600k that stood me in good stead until two weeks ago. I had upgraded disks, memory, and video cards over the years, but while upgrading my memory, I must have flexed the 9-year-old motherboard more than it wanted, and I got an ugly sight: the CPU fail LED glowing to show me that the computer was dead :-( My, how PC building has changed in 9 years! Due to work commitments I couldn't take the time to build its replacement, but the folks at MicroCenter hooked me up with a very nice AMD build. I got it home, double-checked that it would POST correctly, and I was off to the races. First, I installed the drives from my old PC into the new box, turned it on, and nothing. I forgot to put the boot configuration into compatibility mode! My old drives were created before UEFI, so I needed to turn that on. One change and bingo! I got the Windows boot screen. A little nervous waiting while it said that it was configuring ...

Cardinality is critical

One important facet of software design is the cardinality between items in your system design. As an example, consider a simple credit card based design from telecommunications. Given this design, you can explore many boundary conditions with your business partners. Can you have a customer without services? Can you have billing information without an account? How about the other way? I've diagrammed this as having multiple BillingInfo entities. Should they allow overlap? How about one time payments? Refunds and chargeback processing? Getting into the details of how the business works will provide you a lot of leading conversations as you develop your system, but I want to focus on just the one relationship in dark black above, specifically the qualitative dimensions of the 0 or more accounts. If you think about a mass consumer product, you will have a fairly low cardinality of accounts. A wireless customer may have a family plan with multiple phones,...
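To make the boundary questions concrete, here is a rough Java rendering of the entities named above; the multiplicities and field names are my reading of the described diagram, not the diagram itself:

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative only: each collection encodes one cardinality question
    // to settle with your business partners.
    class Customer {
        final List<Service> services = new ArrayList<>(); // may this be empty?
        final List<Account> accounts = new ArrayList<>(); // the "0 or more accounts" relationship
    }

    class Account {
        final List<BillingInfo> billingInfo = new ArrayList<>(); // may entries overlap in time?
    }

    class Service { }

    class BillingInfo { }

Writing the model down this way forces each "can it be empty?" and "can there be more than one?" question into a concrete type decision.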

The process of building software

The process of building software follows well-defined steps:

- Source code and other artifacts are compiled and assembled into some sort of deployable artifact. Primary tools in this space are simple scripts, Make, Imake, Ant, Maven, Groovy, go build, cargo build, etc.
- Quality control steps are applied, either manual or automated: manual testing, unit testing (JUnit, NUnit, cargo test, go test, etc.), integration testing, system testing, user testing, load and performance testing, security testing, usability testing, etc.
- The software is delivered. This can take many different forms: manual installation, automatic patching (Patch Tuesday), or being made available in a repository like Maven Central, Docker Hub, etc.

The major evolution in this process has not been the steps, but rather the boundaries between them. When I first started, each and every step of this process was a manual handoff. Now, large numbers of companies have software automatically migr...

The only constant is change

Perhaps one of the few common tool types for all developers is a change control system. During my career, I have used a progression of them: SCCS, RCS, ClearCase, CVS and SVN, Serena Dimensions, and Git (GitLab and BitBucket). The key features of a version control system are:

- Keeping all versions of a file
- Allowing you to tag or label a set of file versions
- Supporting concurrent editing by multiple people

It's this last item that starts a lot of very passionate points of view. You will hear statements like "You should only develop on master" and "You should/shouldn't have a branch for feature development". What you need to keep in mind is that the right answer for a particular development environment will depend on many factors. Let's talk about a few of them. Deployment size: the number of dependencies increases with deployment size. What is reasonable for a micro-service in a single language for a single deployment is not going to be t...

Lessons from CORBA

In the mid-90's, I got to become experienced in CORBA distributed programming environments. While it's considered a dead technology with a lot of flaws, I would like to look at it specifically from a boundaries point of view. With the advantage of hindsight, we can look at the characteristics of a distributed programming environment:

- Interface Definition Language (IDL) - not a deal breaker. If you look at current environments like gRPC, the use of a language-independent definition language allows wide adoption. The ability to do code generation for your interfaces ensures that you can implement clients and servers with static type checking (if your language supports that). At the same time, JSON is also widely used in REST implementations, so another successful alternative is an interpreted on-the-wire format.
- Leaking language details into other implementations is not good. Anyone who implemented a CORBA system would have to learn and und...

Object Oriented Programming is Dead! Long Live Object Oriented Programming!

There have been a bunch of blog entries, with slightly different points of view, discussing the decline of object oriented programming. I think most of them miss the mark in communicating the arc of history when it comes to programming paradigms. At a fundamental level, as programming languages have evolved they have improved along two different dimensions:

- Abstraction - allowing the developer to develop components at higher and higher levels of abstraction.
- Encapsulation - allowing the developer to hide more and more details from the users of their components.

Let me give you a specific example from my C++ days. At the time, I was using the Oracle Call Interface (OCI) - a C language API to connect to Oracle databases. Using Perl::DBI, we used code generation to automatically generate C++ classes based on the schema metadata. The class layout looked something like this: This object oriented design had a few design features that I consider object...
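The class layout diagram did not survive in this excerpt. As a rough reconstruction of the idea, a class generated per table from schema metadata typically carries one typed member per column and hides the call-level database API behind methods. A Java analogue (the original was generated C++ over OCI; every name below is invented):

    // Invented sketch of a generated, table-backed class.
    public class EmployeeRow {
        private long   empId; // one typed member per schema column
        private String name;

        public long getEmpId() { return empId; }
        public String getName() { return name; }

        // In the generated code, persistence methods would presumably wrap
        // the raw call-level API so callers never touch it directly:
        // encapsulation of the C API, and abstraction up to "a row".
    }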

Object Oriented Programming - Use the right objects

In the mid-90's, there were three major object oriented analysis and design methodologies among the leaders in the field:

- The Booch method - In my opinion, the most technical and exacting of the methods; it had symbols for things like abstract classes, parameterized types, etc. The major problem I saw in using this method was that there was less advice in the analysis phase in terms of deciding what should be an object.
- The Object Modeling Technique (OMT) - This technique, promoted by Rumbaugh et al., had a primary goal of being a communication channel with customers. It also had drawing techniques that seemed to provide a comfortable transition from entity-relationship diagrams (ERD). This gave advice on picking objects - the classic picking out of the nouns from a requirements document.
- The Jacobson method (OOSE) - This method had all of the standard OO techniques like the others, but also added use case influenced design and officially categorizing o...

Informix row level locking - Breaking the process boundary

During the time of my previous post, the version of Informix on the Amdahl mainframe was upgraded to a new version that included row level locking. Common now, at the time database vendors were still busy figuring out the best ways to perform row level locking. The method that Informix chose had an interesting unexpected feature. If you had a unique index on a table and you inserted a row, the database implemented a row level lock on the yet-to-be-committed index row by locking the next row in the index. If there was no index key larger than that, it would lock to the end of the index. This meant that inserting sequential values at the end of the table, a very common occurrence for our system, would in effect behave like a full table lock. A clean solution to this problem would have been to apply the CQRS pattern so that the inserts into the database could be queued without affecting the user; the sketch below shows the idea. But that would have required a complete refactoring of the ...
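To illustrate the queued-insert idea mentioned above, here is a minimal sketch of decoupling the user-facing request from the actual database write with a single writer thread; the String row type and the elided JDBC call are placeholders:

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    // Callers enqueue and return immediately; one writer thread drains the
    // queue, so users never wait behind the next-key lock on the index.
    public class InsertQueue {
        private final BlockingQueue<String> pending = new LinkedBlockingQueue<>();

        public void submit(String row) throws InterruptedException {
            pending.put(row); // user-facing path: no database lock held here
        }

        public void startWriter() {
            Thread writer = new Thread(() -> {
                try {
                    while (true) {
                        String row = pending.take();
                        // perform the actual INSERT for 'row' here
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt(); // shut down cleanly
                }
            });
            writer.setDaemon(true);
            writer.start();
        }
    }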

curses setjmp/longjmp

In 1992, I was maintaining a UNIX inventory system that was developed around a screen template system that used the curses library to implement a pretty standard menu tree with data screens as leaves. My job was to add a new screen, the red box, and the red transition lines to do a master-detail pair of screens. The way this system worked is that a set of screen definition files would be run through a code generation program to generate the C code that would keep track of the menu path and the fields in the data screens. Each of these screens would have an 80x24 template with field names followed by a special replacement character for different field types. For example, you could have a part of the screen say:

    Ship Date: MM/DD/YY

or

    Comments: @@@@@@@@@@@@@@@@@@@

and the code generator would create all of the code necessary to define structures that had a ship_date member and a comments member, handle moving from field to field with the tab key, etc., along w...