The common approach I have used in the past for locating a tag on a given 2D shape is the centroid. For convex parts this works very well. However, when the shape is not convex, the centroid may fall outside of the shape surface.
When tags are intended to identify shapes, it is a problem if a label falls outside of its shape, even more so when multiple shapes are packed together, as the user may not be able to tell which label belongs to which part.
One way to fix that is to make sure the tag location is always inside the part, and for that purpose I have iterated through four different algorithms, trying to find the best result.
If the centroid is within the shape area, it is used as-is. When it is outside (concave shape), a horizontal sweep is done in 10% increments, at the centroid height, looking for a spot within the shape area. If none is found, the same approach is repeated with a vertical sweep at the centroid width. It appears as a black box in the video.
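For the first algorithm, here is a minimal Python sketch. It is my own reconstruction, not the original code: the containment test is plain ray casting, and the sweep steps 10% of the bounding-box size per increment.

```python
def centroid(poly):
    # polygon centroid via the shoelace formula
    a = cx = cy = 0.0
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return cx / (6 * a), cy / (6 * a)

def contains(poly, x, y):
    # plain ray-casting point-in-polygon test
    inside = False
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        if (y0 > y) != (y1 > y) and x < x0 + (y - y0) * (x1 - x0) / (y1 - y0):
            inside = not inside
    return inside

def tag_location(poly):
    cx, cy = centroid(poly)
    if contains(poly, cx, cy):
        return cx, cy                       # convex enough: centroid is fine
    xs = [p[0] for p in poly]
    ys = [p[1] for p in poly]
    w, h = max(xs) - min(xs), max(ys) - min(ys)
    for k in range(1, 11):                  # horizontal sweep, 10% increments
        for s in (-1, 1):
            if contains(poly, cx + s * k * 0.1 * w, cy):
                return cx + s * k * 0.1 * w, cy
    for k in range(1, 11):                  # then a vertical sweep
        for s in (-1, 1):
            if contains(poly, cx, cy + s * k * 0.1 * h):
                return cx, cy + s * k * 0.1 * h
    return cx, cy                           # give up: fall back to the centroid
```

For an L-shaped part, whose centroid lands in the concave notch, the horizontal sweep finds a spot inside the thinner arm on the first step.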
At the centroid height, a horizontal line is traced and the shape is explored for the longest intersection with this line. The middle point of that intersection is then used for a similar sweep, this time done vertically. The tag location is the middle point of the longest vertical intersection. It appears in blue in the video.
Similar to algorithm 2, but adding a second horizontal sweep to get a better-centered result. It appears as a pink box in the video.
This one follows a topological approach, looking for the point that is furthest from the shape perimeter. To do so, the shape is painted as a bitmap and a morphological erosion is applied repeatedly until the last pixels are removed from the image. The location of the last pixel is the desired tag location. It appears in red in the video.
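A toy version of this idea in Python (my own reconstruction: real code would use an image library, I erode with a 4-neighbour kernel, and the scan order used to remember the "last pixel" is an arbitrary choice):

```python
def last_pixel(mask):
    # repeatedly erode the bitmap; report the last pixel standing
    h, w = len(mask), len(mask[0])
    last = None
    while any(any(row) for row in mask):
        for y in range(h):                      # remember a surviving pixel
            for x in range(w):
                if mask[y][x]:
                    last = (x, y)
        # one erosion pass: a pixel survives only if its 4 neighbours are set
        mask = [[mask[y][x]
                 and 0 < y < h - 1 and 0 < x < w - 1
                 and mask[y - 1][x] and mask[y + 1][x]
                 and mask[y][x - 1] and mask[y][x + 1]
                 for x in range(w)] for y in range(h)]
    return last
```

Each erosion pass peels one pixel off the perimeter, so the last survivor is approximately the point deepest inside the shape.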
Usually the black box hides the centroid, which appears as a small circle, but in a few cases the centroid can be seen because the black box has moved away from it.
If you have another way of solving the problem, please let me know in the comments below.
This one is actually similar to number 4: instead of using a bitmap, I use the vector representation of the perimeter as a polygon. Then I perform negative polygon buffer operations repeatedly (keeping the larger block if the polygon splits) until the polygon area reaches a certain threshold. I then use the centroid of the remaining polygon as the location for the label. It turns out much more efficient than its cousin, Algorithm 4 (provided you have a decent polygon offset implementation).
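For illustration, a sketch of this idea using the Shapely library. The step size, the area threshold and the exact handling of a split polygon are my assumptions, not the original code:

```python
from shapely.geometry import MultiPolygon, Polygon

def label_point(poly, step=0.2, min_area=0.05):
    # shrink with negative buffers until the area drops below a
    # fraction of the original (or the polygon vanishes)
    shrunk = poly
    while True:
        nxt = shrunk.buffer(-step)
        if isinstance(nxt, MultiPolygon):
            # if the shrink splits the shape, keep the larger block
            nxt = max(nxt.geoms, key=lambda g: g.area)
        if nxt.is_empty or nxt.area < min_area * poly.area:
            break
        shrunk = nxt
    c = shrunk.centroid
    return c.x, c.y
```

For a plus-shaped part, for example, the repeated inward offsets leave a small central blob whose centroid sits squarely inside the shape.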
Sometimes I needed to check how heat was distributed on a surface. A cool but expensive way is to use a thermographic camera. I do not have one at hand.
But an ongoing project uses thermochromic ink, an ink that becomes transparent once a temperature threshold is reached: it goes from a certain color to no color at all. So if you paint a piece of cloth and place it on a given surface, you can estimate the temperature at each point.
The following pictures show the heating process of a certain aluminium heated bed. My sample cloth was not large enough to cover the whole bed, but you get the idea.
Heat sources start to show as whiter areas.
Now heat spreads a bit more.
Reaching the temperature threshold at many points.
For best results, a glass plate on top would make sure the cloth makes even contact with the whole surface (the top-left corner was not making good contact, which explains the apparently colder temperature).
A while ago I decided to implement the drop-cutter algorithm as part of an ongoing software project. I found Anders Wallin's website and his Opencamlib software very interesting, but the project I was working on was Java-based.
Once I had a working implementation, I realized that while most of the output made sense, there were a few odd points that were clearly wrong.
In a nutshell, the drop-cutter algorithm works by simulating a tool being dropped until it touches the 3D model whose tool-path we are trying to obtain. Using such a tool-path on a CNC machine equipped with the same tool will render a geometrically accurate copy of the model.
The algorithm checks, for each XY tool location, the highest Z-axis value that causes a contact point between the tool tip and the object's 3D model. We use a triangular mesh for our models (STL files). Three types of checks are performed:
Whether the tool tip touches a triangle's vertex.
Whether the tool tip touches a triangle's edge.
Whether the tool tip touches a triangle's facet.
The contact point closer to the top (zmax) is selected for each XY coordinate tested.
If we plot the obtained 3D coordinates of these points, we should see that the tool tip never penetrates the model mesh but stays tangent to it.
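As a flavour of what these checks look like, here is my own sketch (not Opencamlib or the project's code) of the vertex check for a ball-nose cutter; the edge and facet checks follow the same pattern, and the final Z for each XY location is the maximum over all checks:

```python
from math import hypot, sqrt

def ball_vertex_drop(px, py, vx, vy, vz, r):
    # highest tip Z at which a ball-nose cutter of radius r centred
    # at (px, py) touches the vertex (vx, vy, vz); None if out of reach
    d = hypot(vx - px, vy - py)      # horizontal distance axis -> vertex
    if d > r:
        return None                  # vertex outside the tool's shadow
    # the ball centre sits r above the tip and r away from the vertex
    return vz + sqrt(r * r - d * d) - r
```

Directly over the vertex the tip rests exactly on it; as the axis moves sideways the tip drops below the vertex height, until the vertex leaves the tool's circular shadow.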
So once you get your data and some of the points seem to be too low, you wonder what may be wrong. If your calculations are right most of the time, is it some rounding error or what?
After banging my head against the wall for a little while, I realized some of the ideas of my implementation were wrong.
To determine the contact point of the tool with a triangle, I first find the intersection between the tool axis and the plane the triangle lies on. If there is no intersection, we can safely ignore that triangle. If there is, then, depending on the geometry of the tool tip, I calculate the tool height that corresponds to that contact. Then I project that contact back onto the triangle along the inverted triangle normal: is the tool-triangle contact point inside the triangle? If not, the calculated tool height is ignored.
Unfortunately, I discovered that at times the contact point falls outside of the triangle face by a short margin, due to the face-to-tool-axis orientation, and if the point is ignored then the resulting tool path will gouge that triangle with the tool. The solution I am using so far is to relax that last check a bit while I figure out a more rigorous test.
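Roughly, the facet test in Python, for a ball-nose cutter (a reconstruction under my own naming, not the project's code; the eps parameter plays the role of the relaxed margin):

```python
import numpy as np

def ball_facet_drop(px, py, tri, r, eps=0.0):
    # tip Z at which a ball-nose cutter of radius r at (px, py) touches
    # the plane of triangle tri; None when the contact point falls
    # outside the triangle (by more than the eps margin)
    a, b, c = (np.asarray(v, dtype=float) for v in tri)
    n = np.cross(b - a, c - a)
    n = n / np.linalg.norm(n)
    if n[2] < 0:
        n = -n                       # make the normal point up
    if n[2] < 1e-9:
        return None                  # vertical facet: no drop contact
    # ball centre lies on the plane offset by r along the normal:
    # n . (centre - a) = r, with centre = (px, py, zc)
    zc = (r + n @ a - n[0] * px - n[1] * py) / n[2]
    p = np.array([px, py, zc]) - r * n   # contact point, back along the normal
    # barycentric inside-triangle test
    v0, v1, v2 = b - a, c - a, p - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    den = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / den
    w = (d00 * d21 - d01 * d20) / den
    if v < -eps or w < -eps or v + w > 1 + eps:
        return None                  # contact outside the triangle
    return zc - r                    # tip is r below the ball centre
```

A small positive eps accepts contact points slightly outside the triangle, which is the pragmatic fix against gouging until the edge test catches those cases properly.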
I am not entirely happy with the way I do the edge test. The basic idea is that I calculate the minimum distance from the edge to the tool axis on a 2D top projection. If the distance is larger than the tool radius, there is no contact; if it is lower, there is. What I reckon is incorrect is to assume the contact point lies on the vertical through the point of perpendicular distance on the 2D top projection.
I was lucky that my friend Rafa was attacking the same problem with a more thorough approach, which led him to a better solution and made me aware of what I was doing wrong. My edge test was poor: more than a proper test, it was a wild hack. It is true that for an edge to touch the tool it has to be within the tool's vertical projection, but that is only a necessary condition. Because my edge test failed sometimes, I blamed the facet test, but I was wrong: the facet test was fine, and with my hack I was making it catch some of the edge contacts the edge test failed to detect.
Long story short, we are looking for the tangent point of the edge and the tool tip. If we model both as 3D equations, we look for the configuration with a unique solution (tangent) as opposed to those with two solutions (secant). Things get a bit messy but solvable. With the edge test done properly, I can safely remove the hack from my facet test and all is good.
Since I got a beta version of the Hephestos 2 from BQ before its launch, I have been using that printer more and more. After the initial annoyance about doing things a certain way (like heating the extruder before homing an axis), I have got used to these details and do not care anymore.
And with a few exceptions where a part's bottom failed to stick to the bed (nothing a bit of hairspray could not fix), the printer has been delivering consistently good prints. The Z-axis became a bit noisy on long moves, but I have no other complaints.
However, all this time I have been using PLA or Filaflex on a cold bed. There is no provision for a heated-bed add-on, so I had a look around for a stand-alone temperature controller. I found a simple PCB unit with a display that controls a relay for a heating load of up to 20 A. I am not sure how long that relay will last, but for less than $5 I am going to give it a try.
As for the bed itself, I like aluminium beds with power resistors epoxied to the bottom. In this case, care is needed to take advantage of the holes in the bed-holder parts, so that space can be used by the resistors without losing more than a few millimetres of print height.
Just for testing, I fixed the bed with Kapton tape. I was not sure whether to use the same clamp mechanism as the standard bed, so I reckon I will use metal bulldog clips on the sides. Since this new bed is metallic too, it works fine with the inductive probe used for automatic bed tramming.
The only other change needed was to adjust the bed offset for the new bed before starting to print. PLA printing at 50 °C then worked without any trouble.
And so did the first sample print I did in ABS. But I had to kill that one after a few layers because the bed was wobbling back and forth, as only a bit of Kapton tape was fixing it to the bed holder. The next day I fixed the bed properly to the carriage.
Of course, for this bed, equipped with four 25 W power resistors and dissipating around 120 W, an additional power supply is needed. I used a 12 V 300 W power supply I had around; 12 V powers the temperature controller and I am using 12 V for the heating element too.
No electrical or logical connection to the Hephestos 2 electronics is needed. Of course, that also means that neither the printer nor the host software has a way to switch the bed on or off or to adjust its temperature. All of this has to be done manually by the user.
I am working on an Art project that requires some radio-reception capability on a Raspberry Pi. In the past I have used some interesting websites that feature an SDR device whose reception is available online. But given the local nature of the data I need to handle this time, I have to use a local receiver.
One suitable and very inexpensive device is the DVB-T USB dongle originally intended for watching digital TV on a computer. These dongles can be had for less than $10 on eBay. The good thing is that the chipsets employed are supported on Linux, and there is a bunch of useful software that can use them as a Software Defined Radio (SDR).
What is SDR? Well, basically the dongle can act as a multipurpose radio scanner for many different uses: recording spectrum usage, amateur radio reception, or just listening to FM radio or airplane ADS-B transponders. For that latter purpose there is a cool program called dump1090 that will receive and decode the transponder messages of the airplanes within reach of your receiver.
After watching a video of a new pen plotter made by Evil Mad Scientist we wanted to have a similar device.
And having a 3D printer at hand plus some CAD software like Onshape or Fusion 360 it was a good exercise to design the whole thing.
As usual, the process was not completely straightforward: initially it was more about copying the model we saw, but as things came together some new ideas were explored. The first mock-up was based entirely on laser-cut parts (some of them glued together to make them thicker, as the crappy laser I have access to is low-power and really depth-limited). Why laser-cut? Well, because it was faster (or so it was supposed to be, but don't get me started on that).
Once the first model was put together, several ideas popped up. First, the motors were in the way of the carriage motion and reduced the carriage travel along the smooth rods a bit. Second, the motors required another part that could be fused with the machine feet and rod supports. Third, the initial belt path created non-parallel belt runs that would cause poor accuracy and variable belt tension, so the central carriage needed to be revised.
Eventually the model became more and more made of printed parts, and once published there have been more ideas pouring in from readers, like an easier-to-orient pen holder that has already replaced the original one.
My initial approach was to imitate the design and tools of AxiDraw, but then I learned they use a PIC-based board that I do not have around and that would take a while to get. I had Arduinos lying around instead, so it was settled that my plotter would be operated by an Arduino. A CNC shield a friend gave me (thanks, Ernes) could hold a couple of stepper drivers to control the machine.
A logical choice was to use the GRBL firmware, but a few details needed to be solved: this contraption is not a regular Cartesian design but uses a single-belt configuration called H-bot. From the math point of view, H-bot and CoreXY work the same, so I was happy to learn that the latest versions of GRBL do in fact support CoreXY. That was one thing solved. The next one was that I needed to control a servo for pen-up and pen-down movements. For that, I learned that robottini's version of GRBL could do it too. So another need was met and the firmware was settled. You can use mine. The servo is controlled by the M3 and M5 commands.
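The CoreXY / H-bot transform GRBL applies is just a sum and a difference of the two motor moves; in Python terms (illustrative only):

```python
def motor_moves(dx, dy):
    # CoreXY / H-bot forward transform: carriage move -> belt moves
    return dx + dy, dx - dy

def carriage_move(da, db):
    # inverse transform: belt moves -> carriage move
    return (da + db) / 2, (da - db) / 2
```

A pure X move drives both motors in the same direction and a pure Y move drives them in opposite directions, which is why GRBL's CoreXY mode works unchanged for an H-bot.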
So my drawing machine will receive drawing commands as g-code, but how is that g-code created? I looked around, and what was designed for AxiDraw is an Inkscape plugin that creates code suitable for the board they sell, which is nothing like the g-code mine uses, so I had to use something else.
I learned about several projects for outputting g-code for laser cutters from Inkscape. I settled on one plugin that seemed very powerful, not only cutting but doing raster images too, though intended for a laser cutter. The good thing was that its output was g-code, so I hacked my way through adapting it to draw with a pen. After some struggle, I managed to get a stable result.
The problems I faced were that pen-up and pen-down commands take time, so I needed to add an extra delay for the drawing to come out right. Where the original plugin controlled the laser output power, I just needed to lower the pen so lines would be drawn. It took a while, but now it is working nicely.
If you wonder why there is a 608 bearing on the pen carriage that is not present in the CAD files: it adds a bit more weight so the ballpoint pen draws a more consistent line.
Once the g-code files are obtained, in my case using the Inkscape plugin, another tool is needed to send them to the drawing machine. I am using a Java-based program called Universal Gcode Sender that does the job brilliantly and includes a preview and a live view of the print too.
That makes the whole workflow based on open-source software that can run on whatever operating system you are using.
Some of you asked me why the 4xiDraw name: well, AxiDraw is a registered trademark and FreeDraw was already taken too.
While the Wire library lets you get I2C working right off the bat with an Arduino, there are times when it does not cut it. For some people this happens because they need a repeated-start condition or to receive a large number of bytes, tasks that seemed not to be possible with the Wire library. But my problem this time is a bit odd: I had to overcome the requirement that each I2C device on a bus have a different address.
It turns out I am using a magnetic encoder chip that responds to a fixed address that cannot be changed. Because I want to access at least two such encoders from one Arduino board, I find myself in the unlikely situation of needing two different I2C buses, one per device. The I2C interface was designed to do exactly the opposite: to allow several devices to communicate over the same bus (provided each one has a different address, of course).
A second detail I was interested in was speeding up the communication, as each request takes almost one millisecond (to read a 16-bit number). That may not seem much, but as I wanted to keep a constant frequency of 1 kHz in my main loop, that reading time was definitely too long. While using the Wire library, I learned that setting the TWBR register to 1 gives the fastest communication, about 170 microseconds per read, so that problem was taken care of. But I still needed a second I2C bus.
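On a 16 MHz AVR, the TWI clock follows the datasheet formula SCL = F_CPU / (16 + 2 · TWBR · prescaler), which is easy to check (Python, illustrative only):

```python
def twi_scl_hz(f_cpu, twbr, prescaler=1):
    # ATmega TWI clock formula: SCL = F_CPU / (16 + 2 * TWBR * prescaler)
    return f_cpu / (16 + 2 * twbr * prescaler)

default_hz = twi_scl_hz(16_000_000, 72)  # Wire's standard 100 kHz setting
fast_hz = twi_scl_hz(16_000_000, 1)      # about 889 kHz
```

Wire's default TWBR of 72 gives the standard 100 kHz bus; dropping TWBR to 1 pushes it to roughly 889 kHz, which explains the much shorter read time (assuming the sensor tolerates that clock).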
There are several soft I2C libraries for Arduino, but the one that worked for me is SoftI2CMaster, which is available as a simple C library or as a C++ class wrapping it all nicely. I settled on the former, which hopefully gives me an edge in communication time. It all would have been very simple if somebody had told me I had to shift the address value left by one bit, but because I did not know that, I wasted a couple of hours until I figured it out.
Once set up properly, the library allowed me to read one angle value from the AS5600 sensor in 164 microseconds.
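A minimal sketch of such a read might look like this (my own reconstruction, not the original code: the pin mapping and the choice of register are assumptions; note the address shifted left one bit to make room for the read/write flag):

```cpp
// Sketch only: A4/A5 pin mapping on an Uno and the 12-bit ANGLE
// register (0x0E) are assumptions, not the original code.
#define SDA_PORT PORTC
#define SDA_PIN 4          // A4
#define SCL_PORT PORTC
#define SCL_PIN 5          // A5
#include <SoftI2CMaster.h>

const uint8_t AS5600_ADDR = 0x36;  // fixed 7-bit address of the encoder
const uint8_t ANGLE_REG   = 0x0E;  // angle register, two bytes

uint16_t readAngle() {
  // address shifted left one bit, with the R/W bit appended
  i2c_start((AS5600_ADDR << 1) | I2C_WRITE);
  i2c_write(ANGLE_REG);
  i2c_rep_start((AS5600_ADDR << 1) | I2C_READ);
  uint16_t angle = (uint16_t)i2c_read(false) << 8;  // high byte, ACK
  angle |= i2c_read(true);                          // low byte, NAK ends read
  i2c_stop();
  return angle & 0x0FFF;  // 12-bit result
}

void setup() {
  Serial.begin(115200);
  i2c_init();
}

void loop() {
  Serial.println(readAngle());
}
```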
So now, even if I read two values, one from each sensor, it takes me less than 400 microseconds, leaving room for additional processing while still keeping my 1 kHz loop time. Now I only need to figure out the same for other platforms like the ESP8266, Nucleo STM32 and MKR1000.