Homebrew issues

I have been getting an error every time I try to install a new Homebrew package, and I found a quick and easy workaround for it.

The problem looks like this:

user1$ brew install kubernetes-helm
Warning: git 2.15.1 is already installed
Error: Git must be installed and in your PATH!
Error: The following formula:
 python, 
cannot be installed as a binary package and must be built from source.
Install the Command Line Tools:
 xcode-select --install

Git is available on my system without any issues, so I was puzzled by this seemingly invalid error.

I found that if you set this variable:

export HOMEBREW_NO_ENV_FILTERING=1

With that set, brew is able to update and then install new packages without an issue.
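
If you want the setting to stick around, you can add the export to your shell profile and re-run the install. A minimal sketch, assuming you use the default bash shell and that kubernetes-helm is the formula you were after:

echo 'export HOMEBREW_NO_ENV_FILTERING=1' >> ~/.bash_profile
source ~/.bash_profile
brew update
brew install kubernetes-helm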

Very simple workaround.


Updating Bash on my Mac

Some applications started to require a newer version of the Bash shell on my Mac. Apple ships version 3.2, but Bash is up to version 4.4, and Apple does not ship a newer version because Bash 4 moved to the GPLv3 license.

I found a quick solution: use Homebrew to install the latest Bash version and then change the Terminal configuration to use the new shell.

If you don’t have Homebrew, installing it is quite easy with this one command:

/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

Then you simply have to execute:

brew install bash

Changing the Terminal configuration is easy because the setting is on the first tab of the preferences. Where it says “Shells open with”, change from “Default login shell” to “Command (complete path)” and enter this string:

/usr/local/Cellar/bash/4.4.19/bin/bash

Obviously the path may vary depending on the latest version available.
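
To double-check the exact path on your machine (assuming the default Homebrew prefix of /usr/local), you can list the Cellar directory and ask the linked binary for its version:

ls /usr/local/Cellar/bash/
/usr/local/bin/bash --version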

 

Reference:
https://apple.stackexchange.com/questions/193411/update-bash-to-version-4-0-on-osx
https://brew.sh/

When kube-dns does not resolve

We have been seeing this intermittently in our different Kubernetes clusters: kube-dns does not resolve some hostnames, which causes failures in batch jobs or in containers that depend on other containers to start.

We know that the networking itself is working because we can reach services and containers directly by IP. So I had to stop blaming flannel.
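
A quick way to reproduce the symptom from inside the cluster is a throwaway busybox pod (kubernetes.default is just an example of a name that should always resolve):

kubectl run -it --rm busybox --image=busybox --restart=Never -- nslookup kubernetes.default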

Our Linux architect noticed that the br_netfilter module was loaded but the xt_physdev module was not.

On each node he fixed it with these commands (the first and last verify which modules are loaded, the second loads the missing one):

[root@server ~]# lsmod | grep br_
br_netfilter 22209 0
bridge 136173 1 br_netfilter

[root@server ~]# modprobe xt_physdev

[root@server ~]# lsmod | grep br_
br_netfilter 22209 1 xt_physdev
bridge 136173 1 br_netfilter

Everything started to work perfectly after that little fix.
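
One thing worth noting: modprobe only loads the module until the next reboot. On a systemd-based distro you can usually make it persistent with a small drop-in under /etc/modules-load.d (a sketch, the file name is arbitrary):

[root@server ~]# echo xt_physdev > /etc/modules-load.d/xt_physdev.conf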

Struggling with Kafka Connect

I have been working on deploying Kafka on a Kubernetes cluster using the Confluent platform and it has been much harder than I expected.

Deploying their ZooKeeper in a StatefulSet seemed impossible, so I had to switch to the Google version of ZooKeeper. I just could not get each replica to come up with a unique serverId, and the Google example made this trivial.
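
The trick in that example, as I understand it, is to derive the ZooKeeper id from the StatefulSet pod ordinal instead of trying to pass a unique value per replica. A rough sketch of the idea (the names and paths here are illustrative, not from my manifests):

# StatefulSet pods get hostnames like zk-0, zk-1, zk-2 ...
# so the trailing ordinal can be turned into a unique myid.
ORD=${HOSTNAME##*-}
echo $((ORD + 1)) > /var/lib/zookeeper/data/myid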

Kafka was easily deployed once Zookeeper was working.

Kafka Connect has been very difficult and I have tried quite a few combinations. Nothing seems to work. Basically, it times out.

If you want to see the latest update to the YAML and some notes to this struggle I have it on Github: https://github.com/cinq/confluent-kafka-k8s

 

Learning Kotlin 1/10000

It is a nice language to learn but I am not dedicating enough time right now to make the progress I would like.

A little thing I learned last night was the repository I needed to add in order to use the Anko library in my code.

maven { url "http://dl.bintray.com/kotlin/kotlin-dev" }

Sometimes the simplest thing takes you an hour to figure out. All the examples for Anko that I looked at were partial: they showed the dependency line to add, but not the repository it comes from:

compile("org.jetbrains.anko:anko-commons:${anko_version}")

I guess that I have to be happy that I figured something out.

Back to coding a micro-service with Spring Boot and Kotlin…

Memory on my Laptop

I just wanted to know what memory was in my Linux laptop so I could buy another stick. It is actually easy to find out, and the output is very descriptive about what can be done with your laptop:

sudo dmidecode --type 17

Gives you a nice output with all the details:

# dmidecode 3.1
Getting SMBIOS data from sysfs.
SMBIOS 3.0.0 present.

Handle 0x003E, DMI type 17, 40 bytes
Memory Device 
    Array Handle: 0x003D 
    Error Information Handle: Not Provided 
    Total Width: 64 bits 
    Data Width: 64 bits 
    Size: 8192 MB 
    Form Factor: SODIMM 
    Set: None 
    Locator: DIMM A 
    Bank Locator: BANK 0 
    Type: DDR4 
    Type Detail: Synchronous Unbuffered (Unregistered) 
    Speed: 2400 MT/s 
    Manufacturer: 000000000000 
    Serial Number: 00000000 
    Asset Tag: 00000000 
    Part Number: HMA81GS6AFR8N-UH     
    Rank: 1 
    Configured Clock Speed: 2400 MT/s 
    Minimum Voltage: 1.2 V 
    Maximum Voltage: 1.2 V 
    Configured Voltage: 1.2 V

Handle 0x003F, DMI type 17, 40 bytes
Memory Device 
    Array Handle: 0x003D 
    Error Information Handle: Not Provided 
    Total Width: Unknown 
    Data Width: Unknown 
    Size: No Module Installed 
    Form Factor: Unknown 
    Set: None 
    Locator: ChannelB-DIMM0 
    Bank Locator: BANK 2 
    Type: Unknown 
    Type Detail: None 
    Speed: Unknown 
    Manufacturer: Not Specified 
    Serial Number: Not Specified 
    Asset Tag: Not Specified 
    Part Number: Not Specified 
    Rank: Unknown 
    Configured Clock Speed: Unknown 
    Minimum Voltage: Unknown 
    Maximum Voltage: Unknown 
    Configured Voltage: Unknown
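
If you only need a quick view of which slots are populated and with what size, filtering the same output is enough:

sudo dmidecode --type 17 | grep -i 'size:'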

 

Hibernate relationship

Software versions that were used:

  1. Spring Boot 1.5.9

The error message that triggered all the research:

2017-12-08 18:02:48.438 UTC [73] ERROR: constraint "uk_n7c2qd0x7l2yaq2xyyvs21rej" of relation "software_suite_instance" does not exist
2017-12-08 18:02:48.438 UTC [73] STATEMENT: alter table public.software_suite_instance drop constraint UK_n7c2qd0x7l2yaq2xyyvs21rej

The code fix was to properly map the link between the two entities on both sides.

In the child entity we needed to add a field for the parent entity like:

// In the child entity: the child side owns the foreign key, so it maps the parent with @ManyToOne
@ManyToOne
private ParentEntity parentEntity;

// Create a getter and setter for parentEntity
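
On the parent side, the matching collection uses mappedBy to point back at that field. A minimal sketch (the child type and the collection name here are illustrative, not the real class names):

// Inside the parent @Entity class
@OneToMany(mappedBy = "parentEntity")
private List<ChildEntity> childEntities = new ArrayList<>();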

The error stopped showing up after this correction.