add personal blog and update categories and tags

This commit is contained in:
Peter Tillemans 2023-10-19 01:04:39 +02:00
parent 681a9c58a7
commit 98305ebc0b
33 changed files with 1211 additions and 200 deletions

View file

@@ -9,6 +9,11 @@ build_search_index = true
 theme = "blow"

+taxonomies = [
+    { name="tags", feed=true },
+    { name="categories", feed=true },
+]
+
 [markdown]
 # Whether to do syntax highlighting
 # Theme can be customised by setting the `highlight_theme` variable to a theme supported by Zola
@@ -25,12 +30,12 @@ adsense_link = "https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?c
 items = [
     { lang = "en", links = [
         { base_url = "/", name = "English" },
-        { base_url = "/fr", name = "French" },
+        # { base_url = "/fr", name = "French" },
     ] },
-    { lang = "fr", links = [
-        { base_url = "/", name = "Anglais" },
-        { base_url = "/fr", name = "Français" },
-    ] },
+    #{ lang = "fr", links = [
+    #    { base_url = "/", name = "Anglais" },
+    #    { base_url = "/fr", name = "Français" },
+    #    ] },
 ]

 [extra.navbar]
@@ -39,25 +44,25 @@ items = [
     { url = "/", name = "Home" },
     { url = "/categories", name = "Categories" },
     { url = "/tags", name = "Tags" },
-    ] },
+    ]},
-    { lang = "fr", links = [
-        { url = "/fr", name = "Accueil" },
-        { url = "/fr/categories", name = "Categories" },
-        { url = "/fr/tags", name = "Tags" },
-    ] },
+    # { lang = "fr", links = [
+    #    { url = "/fr", name = "Accueil" },
+    #    { url = "/fr/categories", name = "Categories" },
+    #    { url = "/fr/tags", name = "Tags" },
+    # ] },
 ]

-title = "title"
+title = "Snamellit"

 [extra.sidebar]
 items = [
     { lang = "en", links = [
-        { url = "/markdown", name = "Markdown" },
+        #{ url = "/markdown", name = "Markdown" },
         { url = "/blog", name = "Blog" },
     ] },
-    { lang = "fr", links = [
-        { url = "/fr/markdown", name = "Markdown" },
-        { url = "/fr/blog", name = "Blog" },
-    ] },
+    # { lang = "fr", links = [
+    #    { url = "/fr/markdown", name = "Markdown" },
+    #    { url = "/fr/blog", name = "Blog" },
+    # ] },
 ]
# Index page # Index page

View file

@@ -0,0 +1,110 @@
+++
title = "Relm on Windows"
+++
There are essentially two methods for compiling Gtk apps:
- using the *msvc* compiler and .LIB files
- using the "gnu" backend and msys2 libraries
It is a pick-your-poison situation: the first option requires unpacking
zip files and organizing things so the Rust compiler can find the
libraries and the runtime can find the DLLs.
The other option requires configuring Rust to essentially cross compile
to the *x86_64-pc-windows-gnu* target.
Since I know how to cross compile from my recent experiments with
compiling for embedded ARM processors, and I generally know my way
around the unixy msys environment better, the latter it is.
# Installing Gtk in Msys2/Mingw64
Installing gtk3 is easy, as it can be installed with *pacman*:
``` example
$ pacman -S mingw64/mingw-w64-x86_64-gtk3
```
Add some support for integrating with C libraries, linking, etc.:
``` example
$ pacman -S mingw-w64-x86_64-toolchain base-devel
```
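With the toolchain in place, a quick sanity check (my own habit, not part of the original setup) is to ask *pkg-config* whether it can find gtk3, whose pkg-config module is named *gtk+-3.0*:

``` example
$ pkg-config --modversion gtk+-3.0
```

If this prints a version number, the Rust gtk crates should be able to locate the libraries as well.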
Being optimistic, I installed Glade. I probably won't need it today, but
it is a good test to see if Gtk apps at least start.
``` example
$ pacman -S mingw-w64-x86_64-glade
```
Glade starts and shows the GUI so the first milestone was reached.
# Preparing rust and cargo
First install the cross compilation target
``` example
$ rustup target add x86_64-pc-windows-gnu
```
I can now compile with
``` example
$ PKG_CONFIG_ALLOW_CROSS=1 cargo build --target=x86_64-pc-windows-gnu
```
To always enable this environment variable in the project folder I added
a *.env* file with
``` example
PKG_CONFIG_ALLOW_CROSS=1
```
which is picked up by the zsh dotenv plugin when I enter the folder.
Similarly we can change the default target for cargo with adding
``` example
[build]
target = "x86_64-pc-windows-gnu"
```
to *.cargo/config.toml*.
We can now simply do
``` example
$ cargo build
```
or
``` example
$ cargo run
```
# Using VSCode
When using *vscode* the rust language server needs to have this
environment set up too so it can do its magic. For debugging and running
you can do this in *launch.json* by setting the variables in the *env*
property, but this is (logically) not used for the language server.
There seems to be no way to have *vscode* respect the *.env* file soon
enough for the language server to pick it up.
The solution I settled on was to launch *vscode* from the command line
in the project folder:
``` example
$ code .
```
The prompt returns immediately and *vscode* sees the same environment
as the shell. (This does imply that the vscode bin directory is on the
PATH.)
Of course this wisdom was gained by fixing one error message after
another while trying to build an example from a 2018 blog post. In
hindsight I could have started with the examples provided with the relm
source code, but where would be the fun in that?

View file

@@ -0,0 +1,86 @@
+++
title = "Leveraging Env Vars in Rust Apps"
[taxonomies]
tags = [ "rust", "config" ]
categories = [ "programming", "apps"]
+++
Environment variables have gained a lot of importance since the rise of
container-based deployments and (consequently) the popularity of the
[12 factor app](https://12factor.net/).
They have also become very practical with the widespread support for a
*.env* file in the project folder, which makes configuring apps during
development easy.
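As a sketch, such a *.env* file is just KEY=value lines; the values here are placeholders (MQTT_BROKER is the variable used in the snippet later in this post):

``` example
MQTT_BROKER=tcp://localhost:1883
RUST_LOG=debug
```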
# Using environment in Rust
The [std::env](https://doc.rust-lang.org/std/env/index.html) module
gives access to the environment variables, and also information about
the working directory, the location of the program executing, temp
folder, etc...
The method we really are interested in is
[var](https://doc.rust-lang.org/std/env/fn.var.html).
``` example
match env::var("MQTT_BROKER") {
Ok(mqtt_broker) => mqtt_init(&mqtt_broker).await,
Err(e) => error!("No broker specified in MQTT_BROKER environment variable.({})", e)
}
```
It returns a *Result<String, VarError>* which we can easily pattern
match on and give readable feedback to the user.
I think this is perfectly fine for the simple, small apps I am likely to
write in the foreseeable future.
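For variables that have a sensible default, the same lookup can be sketched with `unwrap_or_else`; the variable name and fallback URL here are purely illustrative:

```rust
use std::env;

// Read MQTT_BROKER, falling back to an illustrative default when unset.
fn broker_url() -> String {
    env::var("MQTT_BROKER").unwrap_or_else(|_| "tcp://localhost:1883".to_string())
}

fn main() {
    env::remove_var("MQTT_BROKER");
    assert_eq!(broker_url(), "tcp://localhost:1883");

    env::set_var("MQTT_BROKER", "tcp://broker.local:1883");
    assert_eq!(broker_url(), "tcp://broker.local:1883");
}
```

This keeps the happy path terse while still making the fallback explicit; the `match` form above is better when a missing variable should be reported rather than papered over.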
# Controlling Logging from the Environment
Another thing needed for smallish apps is a logging system with the
following requirements:
- Controllable via environment
- Add a timestamp
- Output to stdout or stderr (a 12 factor thing)
- Namespace modules
- Override config for specific modules
Rust has a standard logging API defined in the [log
crate](https://docs.rs/log/0.4.11/log/) crate for which a large
selection of implementations is available.
The first one on the [list with
implementations](https://docs.rs/log/0.4.11/log/#available-logging-implementations)
fits all my requirements, so that's fine.
All we need to do is initialize it after reading the environment
variables from the *.env* file :
``` example
async fn main() {
dotenv::dotenv().ok();
env_logger::init();
...
```
and we are logging using the standard `debug!`, `info!`, `warn!`, ...
macros.
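The filter itself lives in the RUST_LOG variable. As a sketch of env_logger's filter syntax (the module names are made up):

``` example
# info and above everywhere, but debug for one specific module
RUST_LOG=info,my_app::mqtt=debug
```

This covers the "override config for specific modules" requirement from the list above.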
# Scaling to larger apps
When apps grow (or just when they live long enough) they tend to
accumulate config options and layers of modules making logging also a
headache.
When confronted with these issues I saw that the *config* and *envy*
crates offer nice layered configuration support and straightforward
deserialization into type-safe structs.
Similarly there are more flexible, and consequently more complex,
logging frameworks like *log4rs*. There are also structured logging
libraries but I still need to see how these can work in containers
without adding additional hoops to jump through.
Let's hope my apps stay small and simple and do not need this
additional complexity.

View file

@@ -0,0 +1,98 @@
+++
title = "Add Tap Escape to HHKB"
[taxonomies]
tags = ["hhkb"]
categories = [ "keyboards" ]
+++
I configure all apps I can to use *vim* keybindings. Which means I use
the *Escape* key very often.
On my custom keyboards I usually have QMK or similar available to remap
the keys and use a tap on CapsLock to mean *Escape* and a hold to mean
*Control*. On the MacBook Pro I used *Karabiner* to program the same
effect. And even on Windows I found an *AutoHotKey* script. On Linux I
use an event interceptor between the keyboard and the rest of the OS.
So either the keyboard does it natively or the useless CapsLock is
remapped to the much more useful Ctrl/Esc combination.
# Happy Hacking Keyboard
This year I got a HHKB for my birthday. I have been wanting one of those
for a really long time, but choice anxiety and the new models arriving
last year prevented me from pulling the trigger, so my family conspired
to end my suffering by buying me a Hybrid Type-S. Super happy with it.
Of course it took some time to get used to the different location of the
backspace, and the practical use of the two keys at the top right.
However, my muscle memory really expects *Esc* to be under my pinky.
No problem, I thought, a quick Google will sort that out. Nope...
I had been looking on and off for about a week before posting something
in the subreddit. No reply, probably because I forgot to add a nice
picture.
The general consensus is to just buy a Hasu controller and use QMK to
implement the tap dance. However, I do not want to rip out the guts of
my new keyboard, and ordering one would take some time too.
# My Solution
A thought crossed my mind: just try to do the same with Control as I do
with CapsLock in my AutoHotKey script. So I copy-pasted the CapsLock
remapping to the end of the file and replaced CapsLock with Control in
the copy. I reloaded the script and everything seemed to work.
I waited a few days to confirm, and wrote it down before I forget.
Here is the relevant part:
``` example
Control::Send {esc}
Control & a::Send ^a
Control & b::Send ^b
Control & c::Send ^c
Control & d::Send ^d
Control & e::Send ^e
Control & f::Send ^f
Control & g::Send ^g
Control & h::Send ^h
Control & i::Send ^i
Control & j::Send ^j
Control & k::Send ^k
Control & l::Send ^l
Control & m::Send ^m
Control & n::Send ^n
Control & o::Send ^o
Control & p::Send ^p
Control & q::Send ^q
Control & r::Send ^r
Control & s::Send ^s
Control & t::Send ^t
Control & u::Send ^u
Control & v::Send ^v
Control & w::Send ^w
Control & x::Send ^x
Control & y::Send ^y
Control & z::Send ^z
Control & 0::Send ^0
Control & 1::Send ^1
Control & 2::Send ^2
Control & 3::Send ^3
Control & 4::Send ^4
Control & 5::Send ^5
Control & 6::Send ^6
Control & 7::Send ^7
Control & 8::Send ^8
Control & 9::Send ^9
Control & '::Send ^'
Control & ,::Send ^,
Control & .::Send ^.
Control & /::Send ^/
Control & -::Send ^-
Control & =::Send ^=
Control & [::Send ^[
Control & ]::Send ^]
```
Not elegant, but works fine for me.

View file

@@ -0,0 +1,128 @@
+++
title = "Install LaTeX on Windows"
[taxonomies]
tags = [ "LaTeX" ]
categories = [ "apps" ]
+++
I have some LaTeX templates that format documents into fancy,
formal-looking PDF versions of the plain-text documents I create in
*org-mode* in *emacs*, well, *spacemacs* actually.
# Installation
Since these are tools built in the UNIX world I expected a protracted
battle before I got all the settings right, especially since there was
no version of TeX Live in the chocolatey repositories. I had never tried
MiKTeX before.
## Installing MiKTeX
That is easy; in an admin shell run:
``` example
> choco install miktex
```
This will take a bit so we can already fetch the templates.
## Installing the custom LaTeX templates
I keep my templates in git repositories for easy updating. Some are
shared with others, so new features are sometimes added. Let's make a
place for these.
In the msys terminal:
``` example
$ cd
$ mkdir -p .local/texmf/tex/latex
```
I want to use *.local/texmf* as the local extension folder, but
*MiKTeX* will not let you select it if the *tex/latex* folders are
missing. This layout is also consistent with the Mac and Linux versions.
## Installing the custom templates
Just add the repos with the *.sty* and *.cls* files to the folder we
just made:
``` example
$ cd ~/.local/texmf/tex/latex
$ git clone git@gitlab.xomeplace.com:latex/xomeplace-latex.git
$ ... repeat for other templates ...
```
## Configuring MiKTeX
After installation, look for MiKTeX in the *Start Menu* to start the
config tool. It will complain that no update has been done yet, so I
humoured it by updating; none were available since it was just
installed, but it keeps regularly reminding me to update. I assume this
will go away once some update arrives.
Add the *texmf* folder we created :
- Press the '+' icon
- Navigate to ~/.local/texmf
- Confirm
Keep the tool open because we need to copy the location of the *bin*
folder in the next step.
## Add the TeX tools to the PATH
I only use the TeX tools from Spacemacs so I'll just add it there. The
Spacemacs dev team decided to make the environment variables, including
the path, load from a config file. Having been at the receiving end of
the confusion which follows from the subtle differences when launching
*Emacs* as daemon, from the GUI menu or from the command line, I
heartily applaud this approach.
In any case I just update the PATH in *Spacemacs*:
- Space f e e (to open the environment config file)
- find the line with PATH= (search with */PATH* Enter, maybe a few
  *n*'s to find the right one)
- Copy the location from the config tool
- Paste it in the PATH value (do not forget the ';' separator)
- Esc : w (to save changes)
- Space f e E (to reload the new value)
# Using it
Well nothing new here, it kind of just works:
## Creating a PDF
Open an org file with the *LaTeX headers* or add them with *, e e #*
and select latex from the list. Check that *LATEX_CLASS* is one of the
custom classes.
Then *, e e l o* and ... nothing will happen, ..., well, Emacs tells us
it is busy with the TeX file. It takes a while; the task manager shows
processes blinking in and out of existence under the Emacs process. I
assume MiKTeX is compiling stuff in the background on first use.
Eventually it returns stating the PDF export failed and to consult the
log file.
## Fix Errors
The end of the logfile showed scary things about not being able to
write. Let's ignore those for now: I learned to treat the first errors
in the LaTeX log output first, then retry and work my way down the log
till there are no more errors.
First error: 'Libertine Font' was not found; a bit further, the same
with 'Lato'.
Download the fonts, unzip, select all *.ttf* files and right-click
install.
Try again and the PDF opens ... in the Edge browser??? And the Edge
browser attached itself to the taskbar??? Again??? I need to tackle
that some time.
Well, it works. Take the document and send it to those you wanted to
impress.

View file

@@ -0,0 +1,289 @@
+++
title = "Setting Up Blogging with Emacs"
[taxonomies]
tags = [ "emacs" ]
categories = [ "apps" ]
+++
I'd like to blog more notes on stuff I do and it would be nice to have
a smooth workflow in my editor of choice.
It is too late to explain a lot, but all these things were proudly found
elsewhere. See the references list at the end of this post.
# Creating the blog project
To deploy to github as a personal blog you have to create a repo in the
form **\<username\>.github.io**. Since I name my projects the same as
the repos the name was a quick choice.
The structure is as follows:
``` text
<root> -+- blog -+- posts -+- <blog posts>
| +- org-template -+- <templates>
| +- css -+- <css files>
+- public -+- <built website>
+- .github -+- workflow --- main.yml <github actions>
+- publish.el
+- Makefile <local develop actions>
```
# Configuring org-publish
This is what the **publish.el** file is for.
Prepare some snippets for the HTML pages.
First off, a link to the CSS:
``` elisp
(setq website-html-head "<link rel=\"stylesheet\" href=\"css/site.css\"
type=\"text/css\"/>")
```
Let's also add a navigation menu at the top of each page:
``` elisp
(setq website-html-preamble
"<div class=\"nav\">
<ul>
<li><a href=\"/\">Home</a></li>
<li><a href=\"https://github.com/ptillemans\">GitHub</a></li>
</ul>
</div>")
```
And a footer :
``` elisp
(setq website-html-postamble "<div class=\"footer\"> Copyright 2020 %a (%v
HTML).<br> Last updated %C.<br> Built with %c. </div>")
```
And now we can all tie it together by creating the
**org-publish-project-alist**:
``` elisp
(setq org-publish-project-alist
`(("posts"
;; configure project structure
:base-directory "blog/posts/"
:base-extension "org"
:publishing-directory "public/"
:recursive t
:publishing-function org-html-publish-to-html
;; configure index creation
:auto-sitemap t
:sitemap-title "Blog Index"
:sitemap-filename "index.org"
:sitemap-style tree
:sitemap-file-entry-format "%d - %t"
:sitemap-sort-files anti-chronologically
:html-doctype "html5"
:html-html5-fancy t
:html-head ,website-html-head
:html-preamble ,website-html-preamble
:html-postamble ,website-html-postamble
:author "Peter Tillemans"
:email "pti@snamellit.com"
:with-creator t)
("blog-static"
:base-directory "blog/posts/"
:base-extension "png\\|jpg\\|gif\\|pdf\\|mp3\\|ogg\\|swf"
:publishing-directory "public_html/"
:recursive t
:publishing-function org-publish-attachment)
("css"
:base-directory "blog/css/"
:base-extension "css"
:publishing-directory "public/css"
:publishing-function org-publish-attachment
:recursive t)
("all" :components ("posts" "css" "blog-static"))))
```
# Make local testing easy
The commands to build the blog are not hard, but they are hard to
remember and hard to type.
Let's make a makefile to help:
``` makefile
.PHONY: all publish publish_no_init

all: publish

publish: publish.el
	@echo "Publishing... with current Emacs configurations."
	emacs --batch --load publish.el --funcall org-publish-all

publish_no_init:
	@echo "Publishing... with --no-init"
	emacs --batch --no-init --load publish.el --funcall org-publish-all

clean:
	@echo "Cleaning up..."
	@rm -rvf *.elc
	@rm -rvf public
	@rm -rvf ~/.org-timestamps/*

serve: publish
	@echo "Serving site"
	python -m http.server --directory public
```
For local testing just do:
``` shell
$ make clean serve
```
If the only change is new content then not cleaning is much faster.
# Deploy to Github Pages
A slightly modified version of the initial workflow will do the
publishing:
``` yaml
# This is a basic workflow to help you get started with Actions
name: CI
# Controls when the action will run. Triggers the workflow on push or pull request
# events but only for the master branch
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
# This workflow contains a single job called "build"
build:
# The type of runner that the job will run on
runs-on: ubuntu-latest
# Steps represent a sequence of tasks that will be executed as part of the job
steps:
- uses: actions/checkout@master
with:
fetch-depth: 1
- name: build
uses: docker://iquiw/alpine-emacs
if: github.event.deleted == false
with:
args: emacs --batch --load publish.el --funcall org-publish-all
- name: deploy
uses: peaceiris/actions-gh-pages@v1.1.0
if: success()
env:
GITHUB_TOKEN: ${{ secrets.PERSONAL_ACCESS_TOKEN }}
PUBLISH_BRANCH: gh-pages
PUBLISH_DIR: ./public
```
Note that you need to put a secret **PERSONAL_ACCESS_TOKEN** with an
access token that has basic push access to the repo, to push the built
site to the gh-pages branch.
For the emacs call, I just copied the command from the **Makefile**.
After a push, the site is usually up by the time I check, say in about
a minute.
# Setting up a Capture Template
This proved to be the hardest part to get working.
I am using **Doom Emacs** so I wrap everything in
**with-eval-after-load**.
The challenge was that the title is needed to create the slug for the
filename and then again as title for the post. So my ugly solution is to
stuff it in a variable and get the variable back in the template.
``` elisp
(with-eval-after-load 'org-capture
(defvar snamellit/blog-title "test-title")
  (setq snamellit-blog-template "#+title: %(progn snamellit/blog-title)
#+date: %t
#+author: Peter Tillemans
#+email: pti@snamellit.com
%?")
(defun snamellit/capture-blog-post-file ()
(let* ((title (read-from-minibuffer "Post Title: "))
(slug (replace-regexp-in-string "[^a-z0-9]+" "-" (downcase title))))
(setq snamellit/blog-title title)
(format "~/Projects/ptillemans.github.io/blog/posts/%s-%s.org"
(format-time-string "%Y-%m-%d" (current-time))
slug)))
(add-to-list 'org-capture-templates
'("b" "Blog Post" plain
(file snamellit/capture-blog-post-file)
(file "~/.doom.d/tpl-blog-post.org"))))
```
The **tpl-blog-post.org** template file :
``` org
#+title: %(progn snamellit/blog-title)
#+date: %<%Y-%m-%d>
%?
```
It is very minimal and I\'d like to keep it that way.
# In use
To create a blog post
- SPC-X b will create the post
- Give a title for the post
- A template file is created (unfortunately in plain text)
- Enter the idea, hook and save with C-c C-c
- Open the org file with SPC-f r (open recent file)
- Flesh out the post using org-mode goodness
- save, commit and push to git
After the push, the GitHub action will bring it live.
# References
The following links were useful in setting this up:
- [How to blog with Emacs Org
mode](https://opensource.com/article/20/3/blog-emacs)
- [GitHub Pages
documentation](https://docs.github.com/en/free-pro-team@latest/github/working-with-github-pages/about-github-pages)
- [The Org Mode Manual](https://orgmode.org/org.html)
- [The Meg in Progress post on building a static blog with
org-mode.](https://meganrenae21.github.io/Meg-in-Progress/posts/blogging-with-org-mode.html)
- [Website with org-mode](https://thenybble.de/projects/orgsite.html)
- [Richard Kallos' post on site generation with Org
Mode](https://rkallos.com/blog/2017/01/02/static-site-generation-with-org-mode/)
- [Blogging with Org
mode](https://www.brautaset.org/articles/2017/blogging-with-org-mode.html)
- [Org Capture Tricks from
Storax](https://storax.github.io/blog/2016/05/02/org-capture-tricks/)

View file

@@ -0,0 +1,89 @@
+++
title = "Access NAS Drive using Windows 10 PIN login"
[taxonomies]
tags = ["nas"]
categories = ["os"]
+++
Each time I log in using the PIN of my Microsoft account, my mounted
folders on the NAS are not mounted. After login I get a notification
that not all mounted drives could be connected. When double-clicking on
the failed mount I get a silly error regarding duplicate sessions or
something, and I can access it after logging in.
This is irritating, but it also breaks automations like auto-importing
notes and documents into Evernote from the NAS folder.
# Attempt 1
This seems to be a known issue with PIN (and by extension face
recognition (?)) login.
On Windows answers I found a
[workaround](https://answers.microsoft.com/en-us/windows/forum/all/unable-to-access-nas-drive-when-logged-in-using/3587cf33-7ed9-403f-ac7c-d4158969412d)
:
- Open the **Credential Manager**
- Select **Windows Credentials**
- **Add a Windows Credential** (it will actually modify the existing
  one)
  - network address: `\\<your NAS>`
  - User name: `<your NAS>\pti`
  - Password: your password on the NAS
From now on your drive should be mounted after rebooting.
Unfortunately, it is not.
Removing and adding it again does not help either.
# Attempt 2 : remap network drives on login
In 2018 [another
workaround](https://support.microsoft.com/en-us/help/4471218/mapped-network-drive-may-fail-to-reconnect-in-windows-10-version-1809#:~:text=Workaround%201%3A%20Create%20a%20startup%20item&text=If%20the%20device%20has%20not,t%20automatically%20reconnect%20network%20drives.&text=A%20log%20file%20(StartupLog.,to%20open%20the%20mapped%20drives.)
was published.
This relies on a PowerShell script, launched by a command script, that
walks over unavailable mapped drives and maps them again.
It uses two scripts:
## %ProgramData%\Microsoft\Windows\Start Menu\Programs\StartUp\MapDrives.cmd
A startup script to kick off remapping after login:
``` shell
PowerShell -Command "Set-ExecutionPolicy -Scope CurrentUser Unrestricted" >> "%TEMP%\StartupLog.txt" 2>&1
PowerShell -File "%SystemDrive%\Scripts\MapDrives.ps1" >> "%TEMP%\StartupLog.txt" 2>&1
```
## %SystemDrive%\Scripts\MapDrives.ps1
``` powershell
$i=3
while($True){
$error.clear()
$MappedDrives = Get-SmbMapping |where -property Status -Value Unavailable -EQ | select LocalPath,RemotePath
foreach( $MappedDrive in $MappedDrives)
{
try {
New-SmbMapping -LocalPath $MappedDrive.LocalPath -RemotePath $MappedDrive.RemotePath -Persistent $True
} catch {
Write-Host "There was an error mapping $MappedDrive.RemotePath to $MappedDrive.LocalPath"
}
}
$i = $i - 1
if($error.Count -eq 0 -Or $i -eq 0) {break}
Start-Sleep -Seconds 30
}
```
## Evaluation
After rebooting and logging in, I still get the error that not all
drives could be mounted; however, by the time I can check in the
explorer the volume is mounted and ready to be used.
Not very elegant, as the notification still feels terribly janky, but
at least it no longer interferes with my workflows.

View file

@@ -0,0 +1,211 @@
+++
title = "Running Emacs with wsl2-xrdp"
[taxonomies]
tags = [ "wsl", "emacs" ]
categories = ["os", "apps" ]
+++
# Why
I have been using Emacs in WSL2 with an X server running in Windows,
and that works fine as long as the computer does not go to sleep. When
the computer goes to sleep, the X connection is cut and the emacs
process crashes. I like to just have my emacs session available so I
can continue where I was last time, and I like my computer to go to
sleep when I do not use it, because Global Warming. So I want to:
- start emacs in WSL2 in GUI mode
- be able to reconnect after sleep
- have copy-paste work transparently
# Plan
- use **xrdp**, as remote desktop is built into Windows
- configure xrdp to start up in WSL2
- run emacs in a remote desktop session
# Installation
## Install xrdp
[Arch wiki page for xrdp](https://wiki.archlinux.org/index.php/xrdp)
``` shell
$ yay -S xrdp xorgxrdp-git
```
We do not have systemd in WSL2, so I start the daemons manually with a
small script */usr/local/bin/start-xrdp*:
``` shell
#!/bin/bash
sudo /usr/sbin/xrdp
sudo /usr/sbin/xrdp-sesman
```
We can now start it with **start-xrdp** from the bash command line or
using **wsl -u root /usr/local/bin/start-xrdp**. If not running as root
(or with recently authenticated sudo access) it will ask for your Linux
password to allow running with **sudo**.
Trying to connect lets me log in, but after a timeout it shows a dialog
box telling me Xorg did not want to start. This is confirmed in the
**/var/log/xrdp-sesman.log** file.
The root cause is that I cannot read properly, because if I could, I
would have read to add *allowed_users=anybody* to the
**/etc/X11/Xwrapper.config** file to allow **Xorg** to be started by
regular users like me instead of only **root**.
Once that is there I get a nice black screen after login.
Note: each time WSL2 restarts it gets a random IP address, so I created
a small script to dig the actual IP address out of the output of
**ip address**:
``` shell
#!/bin/bash
ip address show dev eth0 | grep "inet " | sed -e 's/.*inet \([^/]*\).*/\1/'
```
Which I gave the original name of **/usr/local/bin/ip-address** (do not
forget to `chmod +x /usr/local/bin/ip-address` to make it executable) so
I can easily call it from powershell with `wsl ip-address`.
## Installing DBUS replacement
DBUS is the de facto GNU/Linux desktop communication bus which glues
all kinds of GUI apps together. I do not know if it is directly used by
Emacs or any of the extensions I use, but it reduces the number of
errors and warnings.
``` shell
$ yay -S dbus-x11
```
This allows xfce and other programs to feel happy and display the
desktop with an emacs window. Now just maximize the window and the goal
is reached.
## Automatic start of Emacs only
In order to start emacs maximized in a single remote desktop window we
only need to start it as the only program in the X session. This of
course also means there is no chrome on the X window, nor the ability
to launch other programs outside of Emacs. This is exactly how I like
it.
``` shell
#file:~/.xinitrc
emacs -mm
```
If you would rather have a full desktop environment, see below.
## Installing Xfce4 (Optional)
I'd rather have something more lightweight as a window manager, but I
have experience with Xfce4 on Arch and I also like something which just
works.
I only intend to run emacs in the window however having something a bit
more capable which works can help me debug the environment. (I have some
font things to sort out too...)
``` shell
$ sudo pacman -S xfce4
```
and then start it from **~/.xinitrc**:
``` shell
emacs &
startxfce4
```
If you have skipped the **DBUS** setup above, this hangs while the
**\~/.xorgxrdp.log** file is filling up with errors complaining about
missing connection to dbus.
# Using this from Windows
## Starting from the command-line
We can start remote desktop session using
>_ mstsc /v:$(wsl ip-address) /h:2560 /w:1600
This works; however, we now get a prompt to accept the certificate and
we still need to log in. We can make this smoother.
## Accepting the certificate
You can accept the certificate and let remote desktop add it to your
certificate stores. This solves this interruption.
However, this still happens each time the ip address changes.
## Automatic login
Start the **Remote Desktop** GUI from search or from the start menu.
Fill in the IP address returned by `wsl ip-address` and your username.
Enable the flag to store your password. Log in and save the
configuration as e.g. **emacs.rdp**.
We can now start emacs using
>_ cmdkey /generic:$(wsl ip-address) /user:<username> /pass:<password>
>_ mstsc /v:$(wsl ip-address)
We can assemble this into a small script **wsl-emacs.ps1** somewhere on
your path:
``` powershell
wsl -u root /usr/local/bin/start-xrdp
$address = $(wsl ip-address)
$userName = "pti"
$userPwd = "shht!Secret"
cmdkey /generic:$address /user:$userName /pass:$userPwd
mstsc /v:$address
```
This allows us to start **wsl-emacs** from powershell or as a startup
application with a shortcut. It ensures the **xrdp** daemons are
running (they are idempotent, so the script can be run multiple times),
then the credentials are created so they can be picked up by remote
desktop.
To add a shortcut to the start menu:
- Type Win-R and open *%AppData%\Microsoft\Windows\Start Menu\Programs*
- create a new shortcut
- set as target *powershell.exe "& '\<path-of-script\>\wsl-emacs.ps1'"*
Note the weird **"&** on the command-line.
## A note on security
Since you can run any command from the Windows command line as root,
the currently logged-in person has full access to anything in the WSL
Linux machines. As such, no big hole is added by putting your Linux
password somewhere securely in your account files, such as the startup
script.
This does not mean you should not have secure passwords, as your Linux
box can expose its ports (not by default, but just assume they are
exposed) and allow e.g. ssh access. Since I assume a lot of WSL2 hosts
will be used fast and loose as development boxes, there is a good
chance that sooner or later a port is opened for reasons.
So I would not worry too much that your Linux password is exposed in
the emacs startup script, as long as it is hard enough and not used
anywhere else. If you'd like to get it from some secure vault on your
PC or from your infrastructure, go for it.
tldr;
- use a strong password for your WSL box
- do not reuse an existing password
- secure your startup script so it is only readable by you.

View file

@ -2,6 +2,9 @@
title: 'Adding audit information to entities in the Seam framework.'
date: 2008-11-11T13:15:51.000Z
draft: false
taxonomies:
tags: ["jvm"]
categories: ["programming"]
---
Since we write a lot of stuff for use in the production side of the business we need to comply with rules which allow effective control that the processes and procedures have been followed. In the automotive industry people take these measures seriously.
@ -2,6 +2,9 @@
title: 'Building debian packages in a cleanroom'
date: 2011-07-14T16:19:00.000Z
draft: false
taxonomies:
categories: [ "os" ]
tags: [ "debian" ]
---
### Overview and Goals
@ -2,6 +2,29 @@
title: 'Deliverables and Activities Ahah!! Moment'
date: 2008-07-15T01:32:39.000Z
draft: false
taxonomies:
tags: [ "bfo"]
categories: [ "project"]
---
Something I already knew became suddenly clear today : the product or
product pieces in the WBS and the relation to the activities. Although
I already knew for a long time that it is good practice to make the
WBS deliverable oriented instead of activity oriented it remained
always a gradient where activities blended seamlessly in
deliverables.
The key was that I used a mental trick derived from 'Getting Things
Done', which says that you must write your activities action oriented,
with a verb, active voice, i.e. do something. I was rephrasing the
activities this way (more bang per spreadsheet cell). 
Now I applied this reasoning to the WBS work packages, but I rewrote
it as things, or part of things. Again the clarity improved
considerably and wordiness got down. And then : klabammm.... flash of
light : activities were clearly separated from the WBS work packages,
the grey area between them was gone!!!.
I am quite sure if I read my PM books again I will find this trick in
every single one, but I had to "invent" it myself before I understood
and felt it, instead of just knowing it.
@ -2,6 +2,9 @@
title: 'Disable authentication for global proxy settings on Ubuntu'
date: 2011-07-22T14:33:00.000Z
draft: false
taxonomies:
tags: [ "proxy", "ubuntu"]
categories: [ "os"]
---
2011-07-22 Fri 16:33 [Figure out proxy settings Linux](snamellit.html#ID-c965ad9d-522b-4dfe-9574-b8d2a78c83a3) Ubuntu has a Network Proxy chooser which allows you to select a location (a la MacOSX). This works well enough, except that the UI is a bit counter-intuitive (in my humble opinion), which causes me to regularly nuke some predefined setting inadvertently. This is not a big deal though. However, for update manager (and several other tools) to pick up the new proxy settings you need to push the settings down to the system level. This takes typing your password 2 times. Now, this IS a big deal. When I go back and forth between work and home I have to change this at least 2 times per day. It also irks me that a detail setting like the proxy is not auto-detected and I need to log in to change this 'system' setting. My laptop is essentially a single user system and I do not see switching the proxy as a serious security issue, even with 3 kids running around the home.

To come back to auto-detection: while this works fine at work, it fails to figure out that at home there is a direct connection to the Internet. I can probably fix this by replacing my aging wireless router with my Time Capsule as the Internet gateway router, but I prefer to have the Time Capsule close to my desk. In any case, the **Network proxy** shows the authentication dialog box 2 times. A particularly nice feature (is this new in Natty?) is that the dialog shows for which DBUS setting access is being asked. The first dialog asks access to **com.ubuntu.systemservice.setProxy**. This response is configured in the file **/usr/share/polkit-1/actions/com.ubuntu.systemservice.policy**. This is a very readable XML file which contains a section for the **setProxy** action. I feel no reservation in allowing unchecked access to **setProxy**. Although this might make a man-in-the-middle attack easier, someone with the sophistication to pull this off does not need to doctor my PC to do it.
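The post describes the policy file but stops before showing the change itself. The standard polkit fix is to relax the action's `allow_active` default to `yes`. A hedged sketch of what the relevant section might look like after the edit (the element layout follows the polkit policy format; the exact action id and description texts on your system may differ):

```xml
<action id="com.ubuntu.systemservice.setproxy">
  <description>Set current global proxy</description>
  <message>System policy prevents setting proxy</message>
  <defaults>
    <allow_inactive>no</allow_inactive>
    <!-- was auth_admin; "yes" skips the password prompt for the active session -->
    <allow_active>yes</allow_active>
  </defaults>
</action>
```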
@ -2,6 +2,11 @@
title: 'Eclipse Ganymede crashes on 64-bit Ubuntu Hardy Herron'
date: 2008-07-12T06:56:04.000Z
draft: false
taxonomies:
tags: [ "jvm"]
categories: [ "programming"]
---
In Java 6 there is a bug in the 64-bit Linux version of the JVM which causes Eclipse to crash when opening projects with some additional plugins installed. Both the OpenJDK and the Sun versions are affected. For more info see:
@ -2,6 +2,11 @@
title: 'Enable real idletime for Org Mode on Ubuntu'
date: 2011-07-04T09:11:00.000Z
draft: false
taxonomies:
tags: [ "emacs" ]
categories: [ "apps" ]
---
Org Mode can use the idle time to correct time tracking entries. On the Mac this works on the idle time of the computer; on other platforms it uses the idle time of emacs. Hence if you do a significant task in another program, it will count as idle for org-mode. There is a little program delivered with the org-mode sources to estimate the "real" idle time based on the information used by screensavers. Unfortunately it comes as source code and needs to be compiled first. Everything is actually well documented, just scattered. Let's see where that little program is located
@ -2,6 +2,11 @@
title: 'Fix for Sonar choking on ''result returns more than one elements'''
date: 2011-06-23T12:54:05.000Z
draft: false
taxonomies:
tags: [ "CI", "QA" ]
categories: [ "programming"]
---
Recently our Sonar installation on our Hudson CI choked again, this time with an error I have not seen before. It was just before the release of an important milestone for the team, so not being able to publish a new version on the test server could not have come at a worse time.
@ -1,30 +0,0 @@
---
title: 'Gallery'
date: 2021-11-28T12:57:12.000Z
draft: false
---
Discover. Create. Experience.
=============================
* ![Image description](http://www.snamellit.com/wp-content/uploads/2021/11/image-1.jpg)
* ![Image description](http://www.snamellit.com/wp-content/uploads/2021/11/image-2.jpg)
* ![Image description](http://www.snamellit.com/wp-content/uploads/2021/11/image-3.jpg)
* ![Image description](http://www.snamellit.com/wp-content/uploads/2021/11/image-4.jpg)
* ![Image description](http://www.snamellit.com/wp-content/uploads/2021/11/image-5.jpg)
* ![Image description](http://www.snamellit.com/wp-content/uploads/2021/11/image-6.jpg)
* ![Image description](http://www.snamellit.com/wp-content/uploads/2021/11/image-7.jpg)
* ![Image description](http://www.snamellit.com/wp-content/uploads/2021/11/image-8.jpg)
### Let me design your home
[Book a Consultation](http://www.snamellit.com/contact/)
@ -2,6 +2,12 @@
title: 'Java is a First Class Citizen on Ubuntu Hardy Linux'
date: 2008-08-08T18:07:52.000Z
draft: false
taxonomies:
tags: [ "jvm" ]
categories: [ "programming", "os"]
---
Lately I hear a lot of Java bashing from a very vocal part of the Linux community. I do a lot of Java, I use Linux on my main laptop, and I like this just fine. I use freemind (a Java mindmapper) daily and openoffice (which is Java enabled). I have been doing this for the past years, since Windows XP ate my C-drive for the N-th time (and now it did the same on my gaming rig at home, grrr... ).
@ -2,6 +2,11 @@
title: 'Learnings from the PMI Benelux Day.'
date: 2008-09-28T00:29:48.000Z
draft: false
taxonomies:
tags: [ "PMI"]
categories: [ "project"]
---
I went to the PMI Benelux Day today. The theme today was _A symphony of Knowledge,_ a theme which ran through the plenary session topics. I chose to see the use of earned value techniques applied to the Galileo project, presented by Francois Picaut. It was interesting to see how this technique was applied in a quite straightforward manner by just entering the AC provided by the accountants and the EV provided by the products completed at roughly biweekly milestones. All other performance numbers were calculated from these numbers. One learning was that in the case of subcontracted work under a FFP contract, the EV = AC. Of course once you think it over this is evident, but this quarter took a while to drop.

Some additional metrics were defined, like the ES (Earned Schedule, or the time when the current EV should have been reached) and 'To Complete Performance Indexes' in their cost and scope variations. Apparently these metrics should show the effectiveness of the project management when plotted over time. He applied EVT on FFP projects with good success as part of project assurance. One anecdote was the case when the project manager presented a 'project on track' report while the EV calculations showed the project would end with a 1.000.000 EUR loss. This triggered a discussion about the variances, which triggered corrective actions. As such this proved the value of the method to confirm the results from bottom-up or other estimates. In the area of risk management, Daniel vander Borcht presented a session about why the ABCD methodology works where so many other risk management approaches fail. ABCD stands for Assumption Based Communication Dynamics, and a key part is the central role of assumptions in this model. Next to the classic issue register and risk register, an assumption register is introduced. I need to research this some more. Jean Diederich presented the Test Monkeys and Banana Software talk. He made the case that thinking of testing still occurs way too late in the project. He proposes to involve testers at the earliest opportunity to help start the acceptance test design, release test design, ... in parallel to development rather than afterwards. This is the same story I heard from Suzanne Robertson a couple of weeks ago in the context of requirements gathering. This confirms again my conviction that a good requirements process will make life in the project considerably easier. A last presentation was by Hedda Pahlson-Muller regarding KPOs, or knowledge process outsourcing. This was a very instructive talk which clarified this form of outsourcing.

Roughly, they provide analysts or other knowledge workers in the company the possibility to have an external team do the legwork and provide collected data for further analysis by the company itself. Due to this business model they face a glass ceiling: they cannot deliver analysis or recommendations. For this they are looking for PM/consultants to use their experience to analyse and interpret the data for the companies which ask for this. A nice, interesting day; bought a couple of books, talked PM with other people. Time well spent.
@ -2,6 +2,11 @@
title: 'Logging Notification Messages in Ubuntu'
date: 2013-07-22T11:48:16.000Z
draft: false
taxonomies:
tags: [ "apps"]
categories: [ "os"]
---
Ubuntu contains a nice notification system to inform the user about noteworthy events. However, when the message disappears, by default it is gone. Often I am busy with something else when the notification pops up and I pay little notice. Then somewhere in my brain something gets triggered by a word, and by the time I focus on the message to **really** read it, the message disappears. Some of these appear after booting or logging in, so it is not trivial to redisplay them. So I really need a system for logging notification messages, for reading what I missed. Ideally there would be a scroll-back buffer in the notification feature. Maybe there is, but I didn't find it.
@ -2,6 +2,11 @@
title: 'Making dotFiles visible on the Mac'
date: 2011-07-20T20:56:00.000Z
draft: false
taxonomies:
tags: [ "mac" ]
categories: [ "os", "apps" ]
---
Dotfiles, can't live with 'em, can't live without 'em. "Dot files" is the term for folders and files starting with a '.', so they do not show up when using plain **ls**. The Mac has a cool keycode to toggle visibility of dotfiles in the **File Open/Save** dialog, but this does not work in the Finder for one reason or another. In practice this meant I had to deal with dotFiles and dotDirectories. I found on the net some incantation to force the setting for the Finder to show/hide the dotfiles. Upon restarting the Finder, the windows will reopen with the updated setting. I found some snippets of sh script on the net (but I forgot where and cannot immediately retrieve it), and I immediately dumped them in my **~/bin** folder. ~/bin/hide-dotfiles :
@ -2,6 +2,39 @@
title: 'OpenProj Locks up with Blank Grey Windows'
date: 2008-06-11T11:20:02.000Z
draft: false
taxonomies:
tags: [ "proxy"]
categories: [ "project", "apps"]
---
The symptom is that after starting sometimes there is some blurp
dialog and afterwards the _**Tip of the Day**_ dialog appears. This
stays grey and the application accepts no more events. You need to
kill it to get out of there.
This happens at the place which is protected by a firewall and has no
transparent proxy. At home it worked fine albeit on my Macbook and not
on my Ubuntu laptop.
The reason is that there is some phone home
functionality built in and with wireshark I could see the application
trying to connect to a webserver. Probably to get the tips of the
day. Behind the firewall this did not work and the application is just
hanging there.
I suspect that sooner or later it will time out, but I
am not that patient. Since this is a regular occurrence with Java
applications, I also immediately knew that I had to tell it where the
proxy can be found. In the file **~/.openproj/run.conf** replace
the line:
JAVA_OPTS="-Xms128m -Xmx768m"
with
JAVA_OPTS="-Dhttp.proxyHost=proxy -Dhttp.proxyPort=3128 -Xms128m -Xmx768m"
This directs the Java runtime library to use the proxy
**http://proxy:3128/**. And voila! ... OpenProj starts immediately
and in its full glory.
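Because **run.conf** is read as a plain shell fragment by the launcher, the edit can be checked without starting OpenProj at all; a small sketch (using a temp copy of the file):

```shell
# Write a run.conf with the proxy-enabled JAVA_OPTS (temp copy for illustration;
# the real file lives at ~/.openproj/run.conf).
cat > /tmp/run.conf <<'EOF'
JAVA_OPTS="-Dhttp.proxyHost=proxy -Dhttp.proxyPort=3128 -Xms128m -Xmx768m"
EOF
# Source it the way the launcher does and show what java would receive.
. /tmp/run.conf
echo "$JAVA_OPTS"
```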
@ -2,9 +2,41 @@
title: 'Organizing Windows using the Keyboard with Compiz Grid'
date: 2011-06-30T11:44:00.000Z
draft: false
taxonomies:
tags: [ "ubuntu" ]
categories: [ "os"]
---
The Mac has a great utility called Divvy to easily map windows to
locations on the screen using the keyboard. Fiddling with the mouse to
get multiple windows in the right location is a productivity killer
and a pain in the neck (or shoulder, or elbow, or …)
Ubuntu (and of course any other Linux distro with compiz) has a
similar feature built in as a plugin for compiz. Type Alt-F2 and enter
**ccsm** + Return in the command prompt to launch the CompizConfig
Settings manager.
Select the **Window Management** from the left menu and enable the
**Grid** plugin. Click on it so you can look at the key bindings in
the **Bindings** tab. If you are on a desktop or a big laptop with a
separate numeric keyboard you are set. As you can see the locations
for the windows are by default mapped in a logical fashion like the
arrow keys on the numeric keypad. However my laptop does not have a
separate numeric keyboard and enabling it before typing the key code
is a pain. Remapping them is easy by clicking on the button with the
key code. A window appears with a **Grab key code** button. Click it
and type the new keycode you want to assign it to.
If there is a conflict, you will get a window explaining the conflict
and asking how to resolve it. My first attempt was to remap the laptop
numeric keys using the super keys. This conflicted with the unity
launcher, since the top row 7-8-9 maps to the apps in the launcher. To
avoid conflicts I now use Control-Super with the keys around the j-key
(which is the home key for the right hand). Also autokey (a text macro
expander) is mapped to Super-K.
* Control-Super-j : 100% (maximize)
* Control-Super-h : 50% left
@ -16,4 +48,8 @@ The Mac has a great utility called Divvy to easily map windows to loacation on t
* Control-Super-n : 25% bottom-left
* Control-Super-, : 25% bottom-right
If Control-Super-j is used to maximize a window, clicking on one of
the other keys will first restore it to the original size and position
and only map it to its place on the second click. I consider this a
feature, but you are free to interpret it as a bug. The result: this is
now a super practical way to divide my windows on my screen.
@ -2,18 +2,32 @@
title: 'Proxy Support for Grails'
date: 2011-06-28T11:59:43.000Z
draft: false
taxonomies:
tags: [ "jvm"]
categories: [ "programming"]
---
Not a big thing, actually for me it is. I am always struggling with
proxy settings. It requires a lot of different incantations to be
done, every program deals with it differently, support for platform
settings is flaky, ... Grails deals with this in a way which
pleasantly surprised me:
> grails add-proxy <name> --host=<hostname> --port=<portno> e.g. grails add-proxy client --host=proxy --port=3128
allows you to create a setting for a proxy and bind it to a name. It
also supports username/password. Switching to the setting involves
only
> grails set-proxy client
to enable the proxy setting, and
> grails clear-proxy
when I get back in a transparent environment. (For completeness, there
is a **remove-proxy** command which is useful to remove those
passwords after the need has passed). I particularly impressed  that
this was done in a simple and straightforward without the need fo
brain gymnastics trying to remember which arcane curse needs to be put
at what location in which file. Nice.
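Putting the commands together, a full round trip looks something like
this (the `--username`/`--password` flags are quoted from memory, so
double-check them against `grails help add-proxy`):

```shell
# define a proxy setting named "client"
# (username/password flags are from memory -- treat them as an assumption)
grails add-proxy client --host=proxy --port=3128 --username=me --password=secret

# activate it before working behind the proxy
grails set-proxy client

# deactivate it again in a transparent environment
grails clear-proxy

# and drop the stored credentials once they are no longer needed
grails remove-proxy client
```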

@@ -2,14 +2,24 @@
title: 'Uploading documents to Plone with WebDAV'
date: 2012-03-29T22:03:41.000Z
draft: false
taxonomies:
tags: [ "cms" ]
categories: [ "apps"]
---
Preparing **Plone** to start the WebDAV service and setting the
permissions to allow users to make use of it is only half the battle;
actually using it, especially from automated systems like build
servers, is another struggle.
Using the _Cadaver_ WebDAV client
---------------------------------
Although WebDAV is currently well integrated in modern desktop
environments, a CLI alternative is useful for automation, like
_Jenkins_ build scripts.
### Automatic Login
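Cadaver can pick up credentials from `~/.netrc`, which avoids the
interactive password prompt in scripts. A minimal sketch (the hostname
and account below are placeholders, not my real setup):

```
# ~/.netrc -- keep the permissions tight (chmod 600)
machine plone.example.com
login builduser
password s3cret
```

With this in place, `cadaver http://plone.example.com/site/folder`
logs in without prompting, which is exactly what a Jenkins job needs.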

@@ -1,113 +0,0 @@
---
title: 'Word Problems for the 5th Grade'
date: 2011-06-19T19:24:26.000Z
draft: false
---
Maths Word Problems for the 5th Grade
=====================================
While preparing for the maths tests at the end of the 5th grade,
Hendrik ran short of word problems, so he already knew the answers by
heart and no longer cared about how they were obtained. I therefore
made up a number of extra ones that were very recognisable to him. He
thought that was fantastic! At Hendrik's insistence I have blogged
these questions. I found that a splendid idea, because now I am sure I
can find them again when Emma and Lotte need them. And in the meantime
others can enjoy them too.
Problem 1
-----------
Piet runs 800m in 2m44.76s. Jan took 15.34s longer.
What was Jan's time?
Formula : .........................................................
Answer : ........................................................
...................................................................
Problem 2
-----------
We drive from Antwerp to the Kaunertal (840km) in 8 hours.
a) What was the average speed?
Formula : .........................................................
Answer : ........................................................
...................................................................
b) We stopped for an hour along the way to eat. What was the average
speed while we were actually driving?
Formula : .........................................................
Answer : ........................................................
...................................................................
Problem 3
-----------
Hendrik runs 3300m in 12min. What was his average speed in km/h?
Formula : .........................................................
Answer : ........................................................
...................................................................
Problem 4
-----------
A sports car drives 240km on a circuit in 1h.
A sailfish swims 60km in the sea in 1h.
The speeds of the car and the sailfish are in the ratio ..... to .....
Problem 5
-----------
On Mondays Grandpa goes cycling with the pensioners.
He leaves at 13h45 and rides a 60km loop. They took a half-hour break
in Postel to drink a Trappist.
If the pensioners ride 20km/h on average, what time is Grandpa back?
Formula : .........................................................
Answer : ........................................................
...................................................................
Problem 6
-----------
Find the probability:
We put all the chess pieces in a bag and shake it until they are
all mixed up. Lotte is blindfolded and takes one piece out of
the bag.
What is the probability that it is:
```
|--------------------+---------------+----+--------------|
|                    |               |    |              |
| a white piece      | ............. | in | ............ |
|                    |               |    |              |
| a black pawn       | ............. | in | ............ |
|                    |               |    |              |
| a knight           | ............. | in | ............ |
|                    |               |    |              |
| a queen            | ............. | in | ............ |
|                    |               |    |              |
| the white king     | ............. | in | ............ |
|--------------------+---------------+----+--------------|
```
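For checking homework afterwards, the answers can be verified with a
short Python sketch (my own addition, not part of the original
worksheet; the numbers come straight from the problem statements):

```python
from fractions import Fraction

# Problem 1: 2m44.76s plus 15.34s
jan_time = 2 * 60 + 44.76 + 15.34      # 180.1 s, i.e. 3m0.10s

# Problem 2: Antwerp -> Kaunertal, 840 km in 8 hours
avg_speed = 840 / 8                    # km/h overall
driving_speed = 840 / (8 - 1)          # km/h excluding the 1 h lunch stop

# Problem 3: Hendrik runs 3300 m in 12 minutes
hendrik_kmh = 3300 * 60 / (12 * 1000)  # convert m/min to km/h

# Problem 4: ratio of the car's speed to the sailfish's
ratio = Fraction(240, 60)              # 4 to 1

# Problem 5: 60 km at 20 km/h plus a half-hour break, leaving at 13h45
hours_away = 60 / 20 + 0.5             # Grandpa is back 3.5 h later, at 17h15

# Problem 6: 32 chess pieces in the bag
p_white = Fraction(16, 32)             # a white piece
p_black_pawn = Fraction(8, 32)         # a black pawn
p_knight = Fraction(4, 32)             # one of the 4 knights
p_queen = Fraction(2, 32)              # one of the 2 queens
p_white_king = Fraction(1, 32)         # the single white king
```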

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

Binary file not shown.

@@ -1 +0,0 @@
function switchLang(n){document.getElementById("switch-lang-panel").classList.toggle("hidden")}document.addEventListener("DOMContentLoaded",function(){document.getElementById("switch-lang")?.addEventListener("click",switchLang)});

@@ -1 +0,0 @@
function switchTheme(){"dark"==([...document.documentElement.classList].includes("dark")?"dark":"light")?(localStorage.theme="light",document.documentElement.classList.remove("dark"),document.getElementById("light").classList.add("hidden"),document.getElementById("dark").classList.remove("hidden"),document.getElementById("syntax_highlight").href="/syntax-light.css"):(localStorage.theme="dark",document.documentElement.classList.add("dark"),document.getElementById("dark").classList.add("hidden"),document.getElementById("light").classList.remove("hidden"),document.getElementById("syntax_highlight").href="/syntax-dark.css")}function toggleSidebar(){let e=document.getElementById("sidebar");[...e.classList].includes("translate-x-0")?(document.body.style.removeProperty("overflow"),e.classList.remove("translate-x-0"),e.classList.add("-translate-x-full")):(document.body.style.setProperty("overflow","hidden"),e.classList.remove("-translate-x-full"),e.classList.add("translate-x-0"))}function toggleMobileMenu(){let e=document.querySelector("#mobile-menu div.nav-links");[...e.classList].includes("h-screen")?(document.body.classList.remove("overflow-hidden","relative"),document.documentElement.classList.remove("overscroll-none"),e.classList.remove("h-screen"),e.classList.add("h-0")):(document.body.classList.add("overflow-hidden","relative"),document.documentElement.classList.add("overscroll-none"),e.classList.remove("h-0"),e.classList.add("h-screen"))}document.addEventListener("DOMContentLoaded",function(){var e=document.querySelectorAll(".nav-links a");let t=window.location.href.replace(/\/$/,"");e=[...e].filter(e=>e.href===t||e.href===window.location.href);if(0!==e.length)for(var d of e)d.className="bg-gray-900 text-white px-3 py-2 rounded-md text-sm font-medium";"dark"===localStorage.theme||!("theme"in localStorage)&&window.matchMedia("(prefers-color-scheme: dark)").matches?(document.documentElement.classList.add("dark"),document.getElementById("dark").classList.add("hidden"),document.getElementById("syntax_highlight").href="/syntax-dark.css"):(document.documentElement.classList.remove("dark"),document.getElementById("light").classList.add("hidden"),document.getElementById("syntax_highlight").href="/syntax-light.css"),document.getElementById("switch-theme")?.addEventListener("click",switchTheme),document.getElementById("toggle-sidebar")?.addEventListener("click",toggleSidebar),document.getElementById("toggle-mobile-menu")?.addEventListener("click",toggleMobileMenu)});

@@ -1 +0,0 @@
function getActiveTocElement(e){return[...e].find(e=>e.getBoundingClientRect().y<=0)}function findCorrespondingTocTitle(n){return[...document.querySelectorAll("#toc li a")].find(e=>e.href.substring(e.href.indexOf("#"))==="#"+n.id)}document.addEventListener("DOMContentLoaded",function(){if(null!==document.getElementById("toc")){var e=document.querySelectorAll("#toc li a");let n=[];[...e].forEach(e=>{n.push(e.href.substring(e.href.indexOf("#")))});const i=document.querySelectorAll(n.join(","));let t=[...i].reverse();e=getActiveTocElement(t)||i[0];findCorrespondingTocTitle(e).classList.add("bg-blue-700");var o=e;window.addEventListener("scroll",()=>{var e=getActiveTocElement(t)||i[0];e!==o&&(findCorrespondingTocTitle(o).classList.remove("bg-blue-700"),findCorrespondingTocTitle(e).classList.add("bg-blue-700"),o=e)})}});

@@ -1,6 +0,0 @@
function toggleSearchModal(){const e=document.getElementById("search-modal");e.classList.toggle("opacity-0"),e.classList.toggle("pointer-events-none"),document.body.classList.toggle("search-active"),[...document.body.classList].includes("search-active")&&(document.getElementById("search-input").value="",document.getElementById("search-input").focus())}function formatResultItem(e){return console.log(e),htmlToElement(`<li class="flex hover:bg-gray-200 dark:hover:bg-gray-600 text-black dark:text-gray-200 p-2 rounded-lg border border-black dark:border-gray-200 bg-gray-200 dark:bg-gray-500 rounded-lg hover:shadow-xl mb-2">
<a href="${e.doc.path}">
<span class="text-xl text-bold">${e.doc.title}</span>
<span class="text-lg">${e.doc.description}</span>
</a>
</li>`)}function htmlToElement(e){let t=document.createElement("template");return e=e.trim(),t.innerHTML=e,t.content.firstChild}document.addEventListener("DOMContentLoaded",function(){let e=document.getElementById("search");e.addEventListener("click",function(e){e.preventDefault(),toggleSearchModal()});const t=document.querySelector(".modal-overlay");t.addEventListener("click",toggleSearchModal);let n=document.querySelectorAll(".modal-close");for(var o=0;o<n.length;o++)n[o].addEventListener("click",toggleSearchModal);document.onkeydown=function(e){let t=!1,n=!1;"key"in(e=e||window.event)?(t="Escape"===e.key||"Esc"===e.key,n="k"===e.key&&!0===e.metaKey):(n=75===e.keyCode&&e.metaKey,t=27===e.keyCode),n&&e.preventDefault(),(t&&document.body.classList.contains("search-active")||n)&&toggleSearchModal()};let l=elasticlunr.Index.load(window.searchIndex),a={bool:"AND",fields:{title:{boost:2},body:{boost:1}}},c,d,r=document.getElementById("search-input");document.getElementById("search-results");r.addEventListener("keyup",function(e){if([...document.body.classList].includes("search-active")&&3<r.value.trim().length&&(c=r.value.trim(),d=l.search(c,a),Array.isArray(d)&&0<d.length)){let e=document.getElementById("results-list");e.replaceChildren();for(o=0;o<d.length;o++){var t=formatResultItem(d[o]);e.appendChild(t)}}})});

File diff suppressed because one or more lines are too long