I’ve been building a C testing framework for work and heard about Snow on Lobsters, so I’m planning to peruse its features for inspiration. The one I’m building isn’t as macro-heavy/macro-driven, but I think there are a number of advantages to leveraging macros, so I want to see what I can add.
You should have a look at greatest, which has worked out great for me in the past. I don’t do a lot of C, but dropped my own little testing setup for greatest, and haven’t looked back.
I’ll check it out, thanks for the link. At a glance, my framework does look similar.
Probably worth mentioning: I am sort of targeting this at folks that develop software using the traditional full-cycle SDLC and have to live through that cycle many, many times. As a result, I also have a goal to formally support requirements engineering. Basically, what that means is that as a test engineer writes a test for a developer to implement against, they can optionally map it to either a requirement (by ID), a type of requirement (functional, non-functional, performance, etc.), or a set of requirements (multiple IDs). On a very large project with many moving parts, support for this in a test tool can be invaluable.
The nice side benefit of this is that if you’re using a tool like Git, you can scroll through major points in the Git history and clearly see the progression of development not just by what tests are passing, but also by how many of the requirements solicited from the customer/stakeholder are satisfied. Eventually, I’ll support generating metrics from the tests in a common business/professional format (such as an Excel spreadsheet, so managers can create visualizations and whatnot).
I think it’ll be useful for developers because they don’t just have to point at a results table and say “I’m passing all the tests”, they can point at a results table and say “I’m passing all the tests and here’s also proof that the tests passing fully cover all the initial requirements laid out, therefore the product you asked for is built” (and of course if they don’t like what they got, they can go talk to the requirements engineer :P )
Hi, greatest author here. :)

Something that might be useful in your situation: with greatest, if you have formal requirements IDs, you could use them as tags in the test function names, and then run tests based on those – you can use the -t command line switch in the default test runner to run tests based on test name / substring match. (Similarly, -x skips matching tests, so, for example, you could skip tests with __slow in the name.) If you name tests, say, TEST handle_EEPROM_calibration_read_failure__ENG_471(void) { /* ... */ }, then ./tests -t ENG_471 would run that. (Running tests by name also helps with quick feedback loops during development.)
I did some automotive embedded work several years ago. We had a whole requirement traceability system that involved scraping requirement IDs out of comment headers for tests, which eventually fed into coverage reports.
Oh wow, that’s pretty cool. That tagging system can certainly be useful for more than just the requirement IDs, but yeah, that would work. Being able to filter tests by the tags is also really neat; I hadn’t thought of that as a feature.
I’d be curious to see what someone could come up with if the test framework didn’t use the C preprocessor and used something else instead. Might be a fun exercise. But then again, maybe I’m just not liking the preprocessor lately.
What would it look like to drive tests, for C programs, in say Lua? It seems like a wonderful idea, but I’m not sure if the boilerplate stuff can be automated in such a way to make it feasible…
I’m not sure either, but it still might be an interesting exercise (or mini research project). Maybe I should be the one to look into it since I’m the one that spoke up. ;)
I’m building this kind of greenhouse automation system to monitor my plants at home and maintain appropriate conditions throughout the winter (by turning on/off some of the electric heaters, ventilation, and light, and sending email alerts when things get too far out of range). It’s all PHP, which is not sexy - but it works - plus a few Python or Bash scripts running on the Raspberry Pi machines that are scattered in the rooms.
My goal is to eventually use machine learning to give advice on when to water the plants (and eventually do it automatically). For now, I manually record the watering times until I have enough data to use. So far, it seems like dirt colour (the system also takes regular pictures of the plants) is a rather good indicator, but not good enough.
The clean, server-side parts of the code are here.
Awesome project! I’ve done this at an industrial scale for multi-million-dollar greenhouses at my previous job. I initiated and developed with our team a system then called Cortex, now called HelioCORE ( http://info.heliospectra.com/heliocore ). It was a modular system with an easily extendable frontend and backend; all our own features were built as independent modules in the same way, so that extensions would be first-class citizens. We built it in Python 3 with a React frontend, but you could pretty much build the modules in anything that speaks HTTP; the core module would broker the connection to hardware using multicasting, UDP, broadcasting, PLC, Modbus, etc. through a simple HTTP/WebSocket JSON API. We also used some quite sophisticated algorithms which learned to recognize, e.g., at what times the sun would pass by beams in the structure.
Was great fun and very exciting to work with something so tangible.
Thanks! My system is flexible enough that I was able to add (in the private part of it) things like my weight (from a connected scale), the times I leave for/come back from work, the presence of my bike in front of the house, or intrusion alerts - stuff like that. Not very useful, but interesting to do.
For the important data though (temperature/light/humidity mostly), I’m struggling to find adequate sensors. Good parts of my space are far from electric outlets, so I tried to go with battery-powered BLE sensors, but most of those I tried aren’t as precise, long-lived, and easy to configure as I’d like. But I guess real professional greenhouses are already wired, so this wouldn’t be an issue for you?
Yeah, they usually have sensors speaking through some Modbus or PLC system, which we interfaced with through an industrial PC. We also built some sensors ourselves, using grade-A sensor hardware and Arduinos for communication.
Spent the weekend getting things hacked together for a first foray into a “smart, connected” home via Apple’s HomeKit. Hooked up a DHT11 to a Raspberry Pi for temperature/humidity (via this server), and also got control of some 433 MHz-controlled sockets (via this server). This is all wired into the HomeKit system via homebridge running in a SmartOS zone.
It mostly seems to work. A couple of friends have pointed out Node-RED, which looks to remove some of the wiring-up work I’ve been doing with HTTP/Python so far. Going to check that out this week, I think.
Work on the blood pressure “calculator” iOS app stalled in favour of the HomeKit stuff. I’ve figured out the behaviour I want from the table rows being tapped (& I also finally understand the highlighted/selected state of UITableViewCells). Next steps are wiring up the number picker (keyboard) to adjust values in a row, and adding UI to show the averaged result.
As usual, continuing on to the next set of debugging books to review. I just published (Lobste.rs story) a review that contained one huge book that took longer than I thought to digest. (Short story: it was a mediocre book, and explaining why was trickier than I thought.) I’m not sure if the next batch will be a couple of heavyweight texts (including Zeller’s “Why Programs Fail”), or concentrate on a collection of stuff from the 70s and 80s, or maybe a collection of chapters on debugging from programming books. I’m thinking I should just get the heavyweights out of the way, because with the old books I need to learn some COBOL and re-learn some BASIC.
In the process of replacing an old, huge, heavy laptop with no battery (the one I carry around to do my personal stuff when I’m on the move) with a hand-me-down EeePC. My brother had no need for it anymore, so I asked him to send it my way. I appreciate the form factor and the lightness. It’s not powerful, but I can run a tiny Linux on top of it, and it’s gonna work great for what I want to do with it anyway.
For work, I’m going to be running more load tests using apib. Probably fix some Node.js code along the way as well.
I just hired two more people onto my team today; we’re now at full planned strength. While I’ve been doing recruiting and interviews for a long time, these past few months were my first involved in the whole process as a hiring manager, from initial contact through orientation. It’s been a difficult˚ but rewarding process, full of plot twists and learning a lot about people. I’m pretty excited to see what my team is going to produce!
˚ difficult because hiring is hard, not because of the company or other standard things that inhibit what feels like it should be an effortless process
Congratulations on getting through the first challenge of your career pivot. There’s been lots of info on hiring that’s often a bit contradictory. Were there any articles or other resources you found helpful for how you did it?
Unfortunately, I didn’t save much of what I read. I can say that a lot of my process I came up with myself, borrowing ideas I’ve heard here and there throughout the last few years. It started with a few tenets, which may get long as I think about it:
Know what you want but reevaluate after each hire when hiring a team.
My latest hires became viable when I realized that my initial pass at job descriptions for their positions didn’t reflect the skills of the team hired in the meantime. With the help of my team, we revisited the job descriptions and changed expectations. Not lowered, not heightened. Changed, because the team had changed.
Job description skill lists should be short and should convey an expectation that engineers at all experience levels are expected to learn and teach others.
Establish one or two tech skillsets that are desired for the position, that are non-negotiable. Those are things that someone ideal for the position must know day one.
Establish a group of skillsets or technologies that are nice-to-haves. That is, if you know one or two of these things, I want to talk to you!
Establish a group of interesting technologies. That is, if you know some of these words, let’s talk about them on a screening call in addition to some of the other more pertinent things.
When marketing positions, personal outreach is incredibly important.
Ask friends who might not be interested themselves whether they know anyone who might be. All of my hires came from this or similar referrals.
AngelList was not a useful resource for me. I received probably triple digits’ worth of interest notifications, and I think I contacted maybe 3 people. So many clearly hadn’t even read the posting.
Represent yourself well and consistently.
I had the privilege of working with an extremely professional and competent internal recruiter. I learned a lot from her. We helped each other remain consistent.
Be quirky but not self-important, pretentious, or self-deprecating. Do not compromise your authority or authenticity.
Build on your networks constantly. I’ve been networking for years. That network is what helped me execute this, hiring for skills that are hard to find in my area.
Design interview processes such that if a candidate progresses to an in-person interview, you know they can code.
Don’t waste precious face-to-face time assessing a candidate’s ability to write code. Whiteboarding is out entirely.
Front-load a tech assessment as a short project if you cannot find evidence of their competency through social coding profiles, recorded presentations, internships, etc.
When I say short, I mean 30-60 minutes for a new programmer to complete and something that veterans could bang out in an email response. I’d rather have the latter than a zip file containing a documented project complete with 100% unit test coverage, although such would be pretty impressive in its own right.
I hear of tech tests that are weekend projects. That’s unreasonable and does not select for the kind of employee I think I want.
Make it clear that intent is what matters, not syntax or compilation. I’ll take a solution for the tech test or the in-person exercise that doesn’t compile if it shows me enough of what a candidate is thinking to assess that what they’re thinking is correct or at least on the right track.
Expect the best, prepare for the worst: I would rather inadvertently hire someone who misrepresented themselves and have to fire them than bog down myself, my team, and my candidates with litmus tests.
Assess two things in an in-person interview:
Can this person explain themselves?
My interviews are designed to encourage a candidate to geek out about themselves and their passions.
My interviews get technical but it’s not a quiz. I like people who “talk shop” well and who can tell a story. Great communicators are often great engineers.
Can I work with this person long-term?
If I’m stuck in a room with this person for a 2-week sprint, will we both emerge at the end and still like each other?
I do intend to write a blog post after another hiring manager within my company tries this strategy and succeeds. I’m working on formalizing these ideas internally as policy first.
Great write-up. Appreciate it. Among other things, I like how you do the coding tests in a way that doesn’t waste too much of the candidate’s time. I also thought the “Be quirky” line was interesting: people rarely talk about that stuff in tech hiring write-ups. What was the reasoning behind those do’s and don’ts?
Quirkiness is showing a weird side: a touch of class and uniqueness that the potential applicant likely hasn’t encountered in another job posting. For me, I was able to summarize our very complex mission statement, then summarize the summary into a single sentence, and be really upfront with eye-catching buzzwords but explain what we’re doing with those buzzwords later.
A contrived example of a mission statement shortening:
Highfalutin mission statement: Colincom is enabling life-changing insight into customers’ desires for self-improving their culinary skills to make an impact in non-alcoholic social gathering situations.
Distilled: Colincom helps customers make better non-alcoholic drinks.
A self-important job description overemphasizes the novelty or impact of the company, its product, or its team(s), or a combination thereof. Avoid words like “life-changing” and “visionary”, and anything else that would be used to introduce a speaker at a Nobel prize dinner or TED talk. That highfalutin vision statement could be in the JD but really should be summarized to show that the company actually can do an elevator pitch.
A pretentious job description overemphasizes perks that don’t really matter, such as a particular location (“a newly-outfitted office in $hipster_district”) or things available only to employees that aren’t considered benefits, such as free booze or work trips. Benefits like free lunch or mass transit passes belong in a bullet list, not in the second paragraph of the description of the job.
A self-deprecating job description is pretty hard to write, but would say things like “we are moving to a better location soon” or “we will soon have an HR department”. The self-deprecation line is more a verbal thing once you’ve established contact with a candidate: every company has problems, but how a candidate perceives those problems as they encounter them sets a tone for their time with the company. Never shittalk another team: “they didn’t do this right” is bad, “there was a miscommunication about how to proceed” is better. It’s spin, honestly. Things can be on fire and you can point out the fire, but exude the confidence that it’s under control even if it isn’t. If it’s not, and it affects candidates, why the hell are you interviewing candidates before that fire is put out?
Avoiding a compromise of authenticity is largely keeping messaging consistent. Update JDs across postings, ensure that external recruiters are using your messaging and not diluting it, use some templates across all JDs so there is consistency company-wide.
Avoiding a compromise of authority is a more confident way of conveying that telling a candidate you don’t know something is OK, but you must take it upon yourself to find out and tell the candidate an answer, even if that means putting them on the phone with someone who has the authority to answer questions. I’m a good example of this: I do not talk about health benefits to candidates. I say we offer them and if they have questions, I’ll get them on a call with our HR team even if that call is five minutes.
I received a Tiva C Series LaunchPad MCU from a friend and am currently running through some exercises to get comfortable working in an embedded environment.
At work, transitioning back to a project I left a couple of weeks ago, so a lot of review of things I still remember from before.
As much as I want this week’s agenda to be to share all the ideas I’ve had for the product over the last couple months, I know that’s not what someone should be doing in their first week. Or month. Or three.
This week I’m going to find my bearings. I’m going to feel out the culture, acquaint myself with people, and fit in. I’m going to try to understand more of the fundamentals of their product, development process, culture, and industry. A lot of new information is about to come at me and I expect to be a bit overwhelmed and exhausted, but that’s the game.
I’m excited and nervous and all of the usual emotions. Wish me luck!
I’ve been working on a follow-up to a discussion I had on how GNU’s implementation of yes had higher throughput than any other implementation thanks to its buffering of two pages of "yes\n". (Reddit post on r/unix, lobsters)
I’m interested now in benchmarking the speed of a virtual terminal; hopefully it’ll be ready by the end of this week!
I wrote and submitted some patches 1-2 weeks ago to the Rust TensorFlow bindings to make tensors, graphs, and ops Send + Sync. In the latter two cases this was trivial, but for tensors it required a bit of work, since tensors of types where C and Rust do not have the same representation are lazily unpacked. I didn’t want to replace Cell (used for interior mutability) with RwLock, because it pollutes the API with lock guards. So I opted for separating the representations for types where the C/Rust types do and don’t match, so that tensors are at least Send + Sync for types where the representations are the same.
Since the patches were accepted, I am now implementing simple servers for two (TensorFlow-using) natural language processing tools (after some colleagues requested that I make them available in that way ;)).
Besides that, since it’s exam week I am writing an exam for Wednesday and there’s a lot of correction work to do after that.
Semi-work/semi-home:
I have been packaging one application (a treebank search tool) as a Flatpak. Until now I had been rolling Ubuntu and Arch packages, and building the Flatpak was far less work than I expected; certainly far less work than the Ubuntu packages. Also, the application seems to work well with portals, since most file opening/saving goes through QFileDialog. I guess I am also benefitting from having rewritten some sandbox-unfriendly code when sandboxing the macOS build.
I’m trying Alex and Happy (and Haskell) for the first time in a side project; trying to make a really minimalistic query language for CSVs. I’ve gotten up to generating some small parse trees from strings.
We released the first alpha last Friday. It all went well, except that it was supposed to be done on Thursday.
This sprint will be all about refactoring, fixing pressing bugs, adding pressing features (light stuff).
Personal:
Game development and more game development! I’ve been spending around an hour or so doing game dev. Some day I’ll have a game. I’m going down the incredibly inefficient route of writing my own game engine. I’m doing this with Java+LWJGL. I don’t know either of those, so this should continue to be fun for several months lol
Also I’m reading The Pragmatic Programmer, a couple of pages a day or so, just in the occasional downtime.
I’ve been working my way through the Programming Phoenix book, with the intent of finishing the book and working on a small hobby app. After that, I’m going to dive into either React Native or Swift to build a mobile frontend for my hobby app.
All in the hopes of either providing myself the skills and exposure for a new role, or at the very least, having a product out in the wild that I can be passionate towards. :)
Putting the final touches on UTF-8 input and output support for The Last Outpost MUD. We’re looking for more players, so when it’s time to take a break from whatever you’ve been working on this week, come over and explore some dungeons with us – now with umlauts!
Also, just for fun, I added a ‘baudrate’ command into the game. Choose among your favorite modem rates (56k bps or lower), and you can try playing our multi-user text adventure game just like it was when it first went online back in 1991, minus the occasional burst of line noise and your mom yelling at you to get off the phone.
I finally doped out an issue in my wiki/notebook software that was keeping me from being able to simply type “make run” to get it up and going. I’ve also added to and edited that wiki some, though not quite as much as I was doing in the first few days after setting it up.
Following up on last week’s post: I’ve been brainstorming based on the advice you had given. I’m still trying to think of something either I can build or something I can just run on my Digital Ocean VPS (aside from ZNC).
For fun, I’ve been building a web interface for Funhaus’ Google Trends Show, and as usual I’m being massively inconvenienced by CORS. This week’s contribution will probably be a server-side component to sidestep it.
I’m not sure what I’d share in my learning experience, to be honest. Most of what I’ve learned is tricks that are succinctly shown in the challenge solutions.
Hurriedly preparing a talk on mruby for the local Ruby meetup this week, using How To Prepare A Talk from Deconstructconf.
I’m not giving it nearly as much time as the article recommends, though, because I spent the month getting my home office in order and then discovering that the mruby build system still has a lot of rough edges (which are ending up as part of the talk).
Evaluating different graph databases for use with user data at work. Not entirely sure they’ll be a good fit over our usual combination of RDBMS/Elasticsearch/Cassandra, especially given the REST-like way some of the queries are structured. Planning to test out Neo4j, AWS Neptune, and JanusGraph vs. plain old Cassandra; anyone have experience running these or others in production?
At home I’ve started on a project to enable using a MIDI keyboard to control playback of sound effects and ambient music over Discord for my regular Dungeon World roleplaying sessions. Managed to get a horrific thing that’d play any number of YouTube videos into Discord voice chat at the same time going in about an hour, so I’m excited about the next steps (reorg code, then put in functionality to queue/layer/etc). Writing code on Windows is still not entirely ergonomic, but I’ve been pretty impressed so far with rustup’s ease-of-use and how far the mingw/msys ecosystem has come since the late ’00s.
It allows me to easily create binary-based formats and write tail/head-like unix programs. I use this for a couple of personal services; it saves me tons of space and is very efficient. I write data every second. Using tools like gzip, fswatch, parallel, and others, I can compress my data and manipulate it in parallel with ease.
a WebGL-based menu system for someone’s website (it’s Flash time, all over again!)
a phone app to help kids track and manage/monitor their medical condition
a Zombie shooter/puzzle game in VR
Home:
finishing up my old-school retro arcade shooter game (coding is done, just setting up for online sales. I’m aiming to sell the latest version, with the previous version under GPL.)
writing a browser extension to detect and flag up sponsored content in news articles
Step-by-step I’m rewriting vdirsyncer in Rust. Right now I’m trying to (unsuccessfully) debug a segfault that happens only in the legacy Ubuntu Trusty environment of Travis (switching to Ubuntu Precise helps), but not in an equivalent local VM: https://github.com/pimutils/vdirsyncer/pull/698
Apart from that:
I’ve just returned from FOSDEM and am trying to figure out what I could work on next
A bit of studying. One exam still left for this semester
Work-wise, I’m making a custom sort routine which is locale-aware for C strings… The wonders of working on an embedded device without a full C standard library.
At least I’ve gotten to brush up my C++ for the test code - and props to CppUTest for being by far the best C++ unit testing framework I’ve tried so far.
Work: Thinking about implementing an XSS Sanitizer and exposing it to all web pages, a bit like DOMPurify. The road to writing web standards and IDL is…bumpy.
(Well in fact, my focus should be on code reviews and meeting preparations this week :))
Fun: Trying to build a music player for my toddler that is toddler-friendly and doesn’t require any kind of reading. Idea: NFC cards with colorful stickers that allow the selection of songs/albums/playlists. Based on a Raspberry Pi and an RC522 reader.
I started porting Chroma to macOS. The makefile has two targets - a curses build and a graphical build that uses SDL 1.2. The latter fails at runtime with an error message that didn’t yield any useful search results. This weekend I’ll try compiling the SDL version for Windows. If this succeeds, I’ll port Chroma from SDL 1.2 to SDL 2.0 and try building it for macOS again.
Hosted a hackathon at our office (a “small” castle) over the weekend, which was great fun, and now I’m evaluating the success and also onboarding a new hire for backend+devops. Will write an article for our company blog about the event with some highlights this week.
Other than that, I’m looking for a good platform for a smaller firm to deploy Python services on our servers. I’m leaning towards Kubernetes, hoping it’s not too big for a ≈15-person startup. We also have to figure out how to handle file storage if we go the container route.
If anyone has experience with this, I’d greatly appreciate some tips and pointers.
Same as last week: Rustwell, a Rust REST front end to GNOME Shotwell.
Progress is slow, as this is what I do just after I put my kids to bed and before I go to bed myself, so progress is dad-brain paced.
The reason for it is that I don’t just want a full catalog of all my photos and videos with duplicates detected; I want to know exactly where I have duplicates of my pictures, and the ability to apply a consistent policy about them.
And that policy should enable me to say “oh, that picture of my whiteboard at work should not have been backed onto Flickr because of company policy, let’s delete,” or “that picture was of a document containing PII, so it should not exist anywhere”, with Rustwell knowing what to do.
20 minutes of work going into it every night, and now that it builds and runs, the coding-without-fear thing is really kicking in.
Working on the grind of focusing and writing my own code in C and assembly. I honestly don’t know how to focus on the programming and less on the meta stuff.
Hopefully finishing setting up a basic Packer → Terraform setup for my own website + side-projects so I can easily deploy things. I realised that was one of the blockers for me getting side-projects up and out there.
Yesterday my order for various bits and pieces for my new RGBW lighting system arrived. Soldered some basic stuff onto the PCB, and I might get around to finishing the Arduino code for it today. Also load testing: I have no idea how much power the LEDs pull in reality, only some quick napkin math. Might burn a wire or three.
Otherwise, not much planned. I was tinkering with a small data protocol on a notepad; it might be fun to reinvent most of TCP and write a basic userspace lib for it.
Work: well, it’s work, and this week is fixing merge conflicts for stuff. So exciting.
Home: Been debugging building GHC 8.2.2 on armhf for Alpine Linux on and off for the past few months. Will keep working on that, but I discovered I wasn’t the only one with the issue, so I have a ticket open on that front.
Also porting GHC to aarch64 and i586, but encountering other fun issues unrelated to GHC - the cross-compilation stuff in Alpine Linux is broken for seemingly just me somehow.
Actually, this sounds like something @silentbicycle has probably already tried. Might be worth checking in with him first. :)
My Mono patchset is likely getting merged upstream. So far I can run things like ASP.NET and old IRC bots of mine, but on weird IBM midrange systems.
I’m building this kind of greenhouse automation system to monitor my plants at home and maintain appropriate conditions throughout the winter (by turning on/off some of the electric heaters, ventilation, light, sending email alerts when things get too much out of range). It’s all PHP, which is not sexy - but it works - and a few python or bash scripts running on the Raspberry Pi machines that are scattered in the rooms.
My goal is to eventually use machine learning to give advice on when to water the plants (and eventually do it automatically). For now, I manually record the watering times until I have enough data to use. For now, it seems like dirt colour (the system also takes regular pictures of the plants) is a rather good indicator, but not good enough.
The clean, server-side parts of the code are here.
Awesome project! I’ve done this at an industrial scale for multi-million-dollar greenhouses at my previous job. I initiated and developed with our team a system then called Cortex, now called HelioCORE ( http://info.heliospectra.com/heliocore ). It was a modular system with an easily extendable frontend and backend; all our own features were built as independent modules in the same way, so that extensions would be first-class citizens. We built it in Python3 with a React frontend, but you could pretty much build the modules in anything that speaks HTTP; the core module would broker the connection to hardware using multicast, UDP, broadcast, PLC, Modbus, etc. through a simple HTTP/websocket JSON API. We also used some quite sophisticated algorithms which learned, for example, at what times the sun would pass behind beams in the structure.
Was great fun and very exciting to work with something so tangible.
Thanks! My system is flexible enough that I was able to add (in the private part of it) things like my weight (from a connected scale) the times I leave/come back from work, presence of my bike in front of the house, or getting intrusion alerts, stuff like that, not very useful but interesting to do.
For the important data though, temperature/light/humidity mostly, I’m struggling to find adequate sensors. A good part of my space is far from electric outlets, so I tried to go with battery-powered BLE sensors, but most of those I tried aren’t as precise, long-lived, and easy to configure as I’d like. But I guess real professional greenhouses are already wired, so this wouldn’t be an issue for you?
Yeah, they usually have sensors speaking through some Modbus or PLC system, which we interfaced with through an industrial PC. We also built some sensors ourselves, using grade-A sensor hardware and Arduinos for communication.
Spent the weekend getting things hacked together for a first foray into a “smart, connected” home via Apple’s Homekit. Hooked up a DHT11 to a Raspberry Pi for temperature/humidity (via this server), and also got control of some 433mhz-controlled sockets (via this server). This is all wired into the Homekit system via homebridge running in a SmartOS Zone.
It mostly seems to work. A couple of friends have pointed out nodered which looks to remove some of the wiring up work I’ve been doing with http/python so far. Going to check that out this week I think.
Work on the blood pressure “calculator” iOS app stalled in favour of the homekit stuff. I’ve figured out the behaviour I want from the table rows being tapped (and I also finally understand the highlighted/selected state of `UITableViewCell`s). Next steps are wiring up the number picker (keyboard) to adjust values in a row, and adding UI to show the averaged result.

As usual, continuing on to the next set of debugging books to review. I just published (Lobste.rs story) a review that contained one huge book that took longer than I thought to digest. (Short story: it was a mediocre book and explaining why was trickier than I thought.) I’m not sure if the next batch will be a couple of heavyweight texts (including Zeller’s “Why Programs Fail”), or concentrate on a collection of stuff from the 70s and 80s, or maybe a collection of chapters on debugging from programming books. I’m thinking I should just get the heavyweights out of the way, because with the old books I need to learn some COBOL and re-learn some Basic.
Working on open source project nuster, fixed a bug, released a new version.
Work:
Perso:
In the process of replacing an old huge heavy laptop with no battery that I carry around to do my personal stuff when I’m on the move with a hand-me-down EeePC. My brother had no need for it anymore, so I asked him to send it my way. I appreciate the form factor and the lightness. It’s not powerful, but I can run a tiny linux on top of it and it’s gonna work great for what I want to do with it anyway.
For work, I’m going to be running more load tests using apib. Probably fix some nodejs code along the way as well.
I just hired two more people to my team today, now at full planned strength. While I’ve been doing recruiting and interviews for a long time, these past few months were my first time involved in the whole process as a hiring manager, from initial contact through orientation. It’s been a difficult˚ but rewarding process, full of plot twists and learning a lot about people. I’m pretty excited to see what my team is going to produce!
˚ difficult because hiring is hard, not because of company or other standard things that inhibit what feels like it should be an effortless process
Congratulations on getting through the first challenge of your career pivot. There’s been lots of info on hiring that’s often a bit contradictory. Were there any articles or other resources you found helpful for how you did it?
Unfortunately, I didn’t save much of what I read. I can say that a lot of my process I came up with myself, borrowing ideas I’ve heard here and there throughout the last few years. It started with a few tenets, which may get long as I think about it:
I do intend to write a blog post after another hiring manager within my company tries this strategy and succeeds. I’m working on formalizing these ideas internally as policy first.
Great write-up. Appreciate it. Among other things, I like how you do the coding tests in a way that doesn’t waste too much of the candidate’s time. I also thought the “Be quirky” line was interesting: people rarely talk about that stuff in tech hiring write-ups. What was the reasoning behind those do’s and don’ts?
Quirkiness is showing a weird side, showing a touch of class and uniqueness that the potential applicant likely hasn’t encountered in another job posting. For me, I was able to summarize our very complex mission statement and then summarize the summary into a single sentence and be really upfront with eyecatching buzzwords but explain what we’re doing with those buzzwords later.
A contrived example of a mission statement shortening:
Highfalutin mission statement: Colincom is enabling life-changing insight into customers’ desires for self-improving their culinary skills to make an impact in non-alcoholic social gathering situations.
Distilled: Colincom helps customers make better non-alcoholic drinks.
A self-important job description overemphasizes the novelty or impact of the company, its product, or its team(s), or a combination thereof. Avoid words like “life-changing” and “visionary”, and anything else that would be used to introduce a speaker at a Nobel prize dinner or TED talk. That highfalutin vision statement could be in the JD but really should be summarized to show that the company actually can do an elevator pitch.
A pretentious job description overemphasizes perks that don’t really matter, such as a particular location (“a newly-outfitted office in $hipster_district”) or things available only to employees that aren’t considered benefits, such as free booze or work trips. Benefits like free lunch or mass transit passes belong in a bullet list, not in the second paragraph of the description of the job.
A self-deprecating job description is pretty hard to write, but would say things like “we are moving to a better location soon” or “we will soon have an HR department”. That self-deprecation line is more a verbal thing once you’ve established contact with a candidate: every company has problems, but how a candidate perceives those problems as they encounter them sets a tone for their time with the company. Never shittalk another team: “they didn’t do this right” is bad, “there was a miscommunication about how to proceed” is better. It’s spin, honestly. Things can be on fire and you can point out the fire, but exude the confidence that it’s under control even if it isn’t. If it’s not, and it affects candidates, why the hell are you interviewing candidates until that fire is put out?
Avoiding a compromise of authenticity is largely keeping messaging consistent. Update JDs across postings, ensure that external recruiters are using your messaging and not diluting it, use some templates across all JDs so there is consistency company-wide.
Avoiding a compromise of authority is a more confident way of conveying that telling a candidate you don’t know something is OK, but you must take it upon yourself to find out and tell the candidate an answer, even if that means putting them on the phone with someone who has the authority to answer questions. I’m a good example of this: I do not talk about health benefits to candidates. I say we offer them and if they have questions, I’ll get them on a call with our HR team even if that call is five minutes.
Appreciate the detailed reply! That all makes a lot of sense. :)
I received a Tiva C Series Launchpad MCU from a friend and am currently running through some exercises to get comfortable working in an embedded environment
At work, transitioning back to a project I left a couple weeks ago, so a lot of review of things I still remember from before
Today is the first day of my new job!
As much as I want this week’s agenda to be to share all the ideas I’ve had for the product over the last couple months, I know that’s not what someone should be doing in their first week. Or month. Or three.
This week I’m going to find my bearings. I’m going to feel out the culture, acquaint myself with people, and fit in. I’m going to try to understand more of the fundamentals of their product, development process, culture, and industry. A lot of new information is about to come at me and I expect to be a bit overwhelmed and exhausted, but that’s the game.
I’m excited and nervous and all of the usual emotions. Wish me luck!
Good luck, and congratulations.
I’ve been working on a follow-up to a discussion I had on how GNU’s implementation of `yes` had a higher throughput than any other implementation from their use of buffering two pages of `"yes\n"`. (Reddit post on r/unix, lobsters)

I’m interested now in benching the speed of a virtual terminal; hopefully it’ll be ready by the end of this week!
Work:
I wrote and submitted some patches 1-2 weeks ago to the Rust Tensorflow bindings to make tensors, graphs, and ops `Send + Sync`. In the latter two cases, this was trivial, but for tensors it required a bit of work, since tensors of types where C and Rust do not have the same representation are lazily unpacked. I didn’t want to replace `Cell` for interior mutability by `RwLock`, because it pollutes the API with lock guards. So, I opted for separating the representation for types where the C/Rust types do/don’t have the same representation, so that tensors are at least `Send + Sync` for types where the representations match.

Since the patches were accepted, I am now implementing simple servers for two (Tensorflow-using) natural language processing tools (after some colleagues requested that I make them available in that way ;)).
Besides that, since it’s exam week I am writing an exam for Wednesday and there’s a lot of correction work to do after that.
Semi-work/semi-home:
I have been packaging one application (a treebank search tool) as a Flatpak. Building the Flatpak was far less work than I expected; thus far I had been rolling Ubuntu and Arch packages, and the Flatpak took less effort than even the Ubuntu packages. Also, the application seems to work well with portals, since most file opening/saving goes through `QFileDialog`. I guess I am also benefitting from having rewritten some sandbox-unfriendly code when sandboxing the macOS build.

I’m trying Alex and Happy (and Haskell) for the first time in a side project: trying to make a really minimalistic query language for CSVs. I’ve gotten up to generating some small parse trees from strings.
https://github.com/emsal1863/csvql/ (I’m sort of embarrassed)
Work:
Personal:
Game development and more game development! Been spending around an hour or so a day doing game dev. Some day I’ll have a game. I’m going through the incredibly inefficient route of writing my own game engine. I’m doing this with Java+LWJGL. I don’t know either of those, so this should continue to be fun for several months lol
Also I’m reading The Pragmatic Programmer, a couple pages a day or so, just in the occasional downtime.
I’ve been working my way through the Programming Phoenix book, with the intent of finishing the book and working on a small hobby app. After that, I’m going to dive into either React Native or Swift to build a mobile frontend for my hobby app.
All in the hopes of either providing myself the skills and exposure for a new role, or at the very least, having a product out in the wild that I can be passionate towards. :)
Putting the final touches on UTF-8 input and output support for The Last Outpost MUD. We’re looking for more players, so when it’s time to take a break from whatever you’ve been working on this week, come over and explore some dungeons with us – now with umlauts!
Also, just for fun, I added a ‘baudrate’ command into the game. Choose from among your favorite modem rates (56k bps or lower), and you can try playing our multi-user text adventure game just like it was when it first went online back in 1991, minus the occasional burst of line noise and your mom yelling at you to get off the phone.
I finally doped out an issue in my wiki/notebook software that was keeping me from being able to simply `make run` to get it up and going. I’ve also added and edited that wiki some, though not quite as much as I was doing in the first few days after setting it up.

Based on last week’s post, I’ve been brainstorming around the advice you had given. I’m still trying to think of something either I can build or something I can just run on my Digital Ocean VPS (aside from ZNC).
For fun, I’ve been building a web interface for Funhaus’ Google Trends Show, and as usual I’m being massively inconvenienced by CORS. This week’s contribution will probably be a server-side component to sidestep it.
Working on a Kafka Streams, Kotlin and Spring Boot service to standardise our async payment events.
Work:
Personal:
Awesome! I’m also reading that Haskell book from Chris, and I’ve started CIS194.
Thanks for H99, that should be a nice complement if I want more exercises.
Are you sharing your learning experience anywhere?
I’m not sure what I’d share in my learning experience, to be honest. Most of what I’ve learned is tricks that are succinctly shown in the challenge solutions.
Hurriedly preparing a talk on mruby for the local Ruby meetup this week, using How To Prepare A Talk from Deconstructconf.
I’m not giving it nearly as much time as the article recommends, though, because I spent the month getting my home office in order and then discovering that the mruby build system still has a lot of rough edges (which are ending up as part of the talk).
Hello! These look fun!
Evaluating different graph databases for use with user data at work. Not entirely sure they’ll be a good fit over our usual combination of RDBMS/Elasticsearch/Cassandra, especially given the REST-like way some of the queries are structured. Planning to test out Neo4j, AWS Neptune, and JanusGraph vs. plain old Cassandra; anyone have experience running these or others in production?
At home I’ve started on a project to enable using a midi keyboard to control playback of sound effects and ambient music over Discord for my regular Dungeon World roleplaying sessions. Managed to get a horrific thing that’d play any number of youtube videos into Discord voice chat at the same time going in about an hour, so I’m excited about the next steps (reorg code, then put in functionality to queue/layer/etc). Writing code on Windows is still not entirely ergonomic, but I’ve been pretty impressed so far with rustup’s ease-of-use and how far the mingw/msys ecosystem has come since the late-00’s.
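Queueing and layering sounds mostly comes down to summing the PCM streams sample by sample and clamping to the sample range. The project above is in Rust; this C fragment just sketches the idea for signed 16-bit PCM:

```c
#include <stddef.h>
#include <stdint.h>

/* Mixing two audio streams: sum the samples in a wider type and clip
 * to the 16-bit range so loud overlaps saturate instead of wrapping. */

static int16_t mix_sample(int16_t a, int16_t b) {
    int32_t sum = (int32_t)a + (int32_t)b; /* widen to avoid overflow */
    if (sum > INT16_MAX) sum = INT16_MAX;  /* clip, don't wrap */
    if (sum < INT16_MIN) sum = INT16_MIN;
    return (int16_t)sum;
}

static void mix_buffers(const int16_t *a, const int16_t *b,
                        int16_t *out, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = mix_sample(a[i], b[i]);
}
```

Hard clipping is audible if streams are routinely near full scale, so a real mixer would usually also attenuate each layer before summing.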
skullrump: https://docs.rs/skullrump/0.1.0/skullrump/
It allows me to easily create binary-based formats and write tail/head unix-like programs. I use this for a couple of personal services; it saves me tons of space and is very efficient. I write data every second. Using tools like gzip, fswatch, parallel, and others, I can compress my data and manipulate it in parallel with ease.
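The thing that makes head/tail cheap on a binary log is fixed-size records: the last n entries are a single seek from the end of the file. skullrump itself is a Rust crate; this C sketch (the struct and names are made up) just shows the underlying technique:

```c
#include <stdint.h>
#include <stdio.h>

/* Fixed-size binary records make "tail -n" a single seek. This is a
 * hypothetical illustration of the technique, not skullrump's API. */

struct sample {
    int64_t unix_ts;
    double  value;
};

static int write_sample(FILE *f, const struct sample *s) {
    return fwrite(s, sizeof *s, 1, f) == 1 ? 0 : -1;
}

/* Read the last n records into out; returns how many were read. */
static size_t tail_samples(FILE *f, struct sample *out, size_t n) {
    fseek(f, 0, SEEK_END);
    long end = ftell(f);
    size_t have = (size_t)(end / (long)sizeof(struct sample));
    if (n > have)
        n = have; /* fewer records than requested */
    fseek(f, end - (long)(n * sizeof(struct sample)), SEEK_SET);
    return fread(out, sizeof(struct sample), n, f);
}
```

One caveat with writing raw structs like this: the on-disk layout depends on the compiler's padding and the machine's endianness, so a serious format would pack fields explicitly.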
Work:
Home:
Step-by-step I’m rewriting vdirsyncer in Rust. Right now I’m trying to (unsuccessfully) debug a segfault that happens only in the legacy Ubuntu Trusty environment of Travis (switching to Ubuntu Precise helps), but not in an equivalent local VM: https://github.com/pimutils/vdirsyncer/pull/698
Apart from that:
I’ve been working on a DND assistant that tracks character sheets, spells, items, etc.
https://dnd.fn.lc/
It’s built in react native and backed by firestore. It’s been pretty fun trying to make something cross platform like that.
I’m playing around with porting the RxSwing library (https://github.com/ReactiveX/RxSwing) to work on RxJava 2.
Work wise I’m making a custom sort routine which is locale-aware for C strings… The wonders of working on an embedded device without a full C standard library.
At least I’ve gotten to brush up my C++ for the test code, and props to CppUTest for being by far the best C++ unit testing framework I’ve tried so far.
Work: Thinking about implementing an XSS Sanitizer and exposing it to all web pages, a bit like DOMPurify. The road to writing web standards and IDL is…bumpy. (Well in fact, my focus should be on code reviews and meeting preparations this week :))
Fun: Trying to build a music player for my toddler that is toddler friendly and doesn’t require any kind of reading. Idea: NFC cards with colorful stickers that allow the selection of songs / albums / playlist. Based on a raspberry pi and an rc522
I started porting Chroma to MacOS. The makefile has two targets - a curses build and a graphical build that uses SDL 1.2. The latter fails at runtime with an error message that didn’t yield any useful search results. This weekend I’ll try compiling the SDL version for Windows. If this succeeds, I’ll port Chroma from SDL 1.2 to SDL 2.0 and try building it for MacOS again.
Hosted a hackathon at our office (a “small” castle) over the weekend, which was great fun, and now I’m evaluating its success and also onboarding a new hire for backend+devops. Will write an article for our company blog about the event with some highlights this week.
Other than that, I’m looking for a good platform for a smaller firm to deploy Python services on our servers. Leaning towards Kubernetes, hoping it’s not too big for a ≈15 person startup. We also have to figure out how to handle file storage if we go the container route.
If anyone has experience with this, I’d appreciate any tips and pointers.
Same as last week: Rustwell, a Rust REST front end to Gnome Shotwell.
Progress is slow as this is what I do just after I put my kids to bed and before I go to bed myself, so progress is dad-brain paced.
The reason for it is that I don’t just want a full catalog of all my photos and videos with duplicates detected; I want to know exactly where I have duplicates of my pictures, and the ability to have a consistent policy about them.
And that policy should enable me to say “oh, that picture of my whiteboard at work should not have been backed onto Flickr because of company policy, let’s delete,” or “that picture was of a document containing PII, so it should not exist anywhere”, with Rustwell knowing what to do.
20 minutes of work going into it every night, and now that it builds and runs, the coding-without-fear thing is really kicking in.
I’m working on some tonality models for this music composition app that’s like an IDE for music. You should sign up here https://docs.google.com/forms/d/1-aQzVbkbGwv2BMQsvuoneOUPgyrc6HRl-DjVwHZxKvo if you wanna be notified when I launch it.
Started rewriting the static site generator that powers usesthis.com, because it grew without any planning and its current design is frustrating me.
Working on the grind of focusing and writing my own code in C and assembly. I honestly don’t know how to focus on the programming and less on the meta stuff.
Hopefully finishing setting up a basic Packer → Terraform setup for my own website + side-projects so I can easily deploy things. I realised that was one of the blockers for me getting side-projects up and out there.
Yesterday my order of various bits and pieces for my new RGBW lighting system arrived. Soldered some basic stuff onto the PCB, and I might get around to finishing the Arduino code for it today. Also load testing: I have no idea how much power the LEDs pull in reality, only some quick napkin math. Might burn a wire or three.
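For what it's worth, the usual napkin math: an SK6812-style RGBW pixel is commonly quoted at roughly 80 mA with all four channels at full brightness. That per-pixel figure is an assumption here (check your strip's datasheet), but the arithmetic is just:

```c
/* Napkin math for LED supply sizing. The ~80 mA/pixel worst case is
 * an assumed SK6812-style figure, not from any datasheet quoted in
 * this thread; substitute your strip's real number. */

static double strip_amps(int pixels, double ma_per_pixel) {
    return pixels * ma_per_pixel / 1000.0; /* total draw in amps */
}
```

So a 60-pixel strip at 80 mA worst case wants a supply (and wiring) good for about 5 A, which is exactly the "might burn a wire" territory if the run uses thin hookup wire.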
Otherwise, not much planned, I was tinkering with a small data protocol on a notepad, might be fun to reinvent most of TCP and write a basic userspace lib for it.
Work: well, it’s work, and this week is fixing merge conflicts for stuff, so exciting
Home: Been debugging building ghc 8.2.2 on armhf for alpine linux on/off the past few months. Will keep working on that, but discovered I wasn’t the only one with the issue. So have a ticket open on that front.
Also porting ghc to aarch64 and i586, but encountering other fun issues unrelated to ghc, aka the cross compilation stuff in alpine linux is broken for seemingly just me somehow.