I use the calibration card to measure and cut the filament at the 10 mark on the card, i.e. 100 mm plus the dead space before the 0. Since I measure the extruded distance with the same offset, the dead space cancels out and can be ignored.
Make sure the hot end has reached the correct temperature for the filament you are calibrating with, then issue the command to feed the 100 mm of filament. Once that has finished, measure the amount of filament left.
I had 23 mm left this time. Then it is time to do the calculation. I find it easiest to open the developer tools in the browser and just enter the expressions from the article in the console there.
Then use the Luban console to enter the newly calculated value and store it permanently. If you want to understand M-commands like M92, this page is good: https://snapmaker.github.io/Documentation/gcode/M092
Finally, make a test run with the same command that feeds 100 mm of filament. Mine fed a few mm of extra filament.
This time I ignored the difference, but you can re-calibrate if you want: use your calculated S value as the new E in the calculation and enter how much filament was actually fed. Repeat until you are satisfied.
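The calculation from the article can also be sketched in Python. The idea is that steps/mm must grow by the same ratio as the filament under-fed; the current steps/mm value below is just a hypothetical example, read your own with M92 (or M503) first:

```python
def new_esteps(current_esteps: float, commanded_mm: float, remaining_mm: float) -> float:
    """Scale the extruder steps/mm by how much filament was actually consumed.

    If you cut the filament at commanded_mm, feed commanded_mm, and
    remaining_mm is left over, only (commanded_mm - remaining_mm) was
    actually extruded, so steps/mm must grow by that ratio.
    """
    actual_mm = commanded_mm - remaining_mm
    return current_esteps * commanded_mm / actual_mm

# Hypothetical current value (yours will differ); 23 mm left of 100 mm fed
print(round(new_esteps(212.21, 100, 23), 2))
```

Store the result with `M92 E<value>` followed by `M500` in the Luban console.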
I found the OpenMQTTGateway project and got really interested, since it opens up a lot of interesting integrations with Home Assistant. I bought a LilyGo board to start experimenting and got it to read 433 MHz devices like Nexa. All this is possible thanks to the RTL_433_ESP library.
I also wanted to be able to write to the devices, since I have several power plugs that use Nexa. That is not possible with just the LilyGo and its onboard radio module, but with an STX882 it should be possible to transmit. So I bought both an STX882 and an SRX882. This page describes how to connect them to an ESP32 or NodeMCU.
I have not yet tried to make the full setup and connect everything. I will update this post once I have.
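Once the transmitter is wired up, sending a code should roughly mean publishing to OpenMQTTGateway's RF command topic. A minimal sketch: the topic layout assumes OMG's default base topic, and the Nexa code is a made-up placeholder (you would capture the real one from the messages OMG publishes when you press the original remote):

```python
import json

# Hypothetical code captured from the original Nexa remote
NEXA_ON_CODE = 1315156

def mqtt_to_433(value: int) -> tuple[str, str]:
    """Build the (topic, payload) pair for OpenMQTTGateway's RF command
    topic. The topic path is an assumption based on OMG's default
    base topic; adjust to match your gateway's configuration."""
    topic = "home/OpenMQTTGateway/commands/MQTTto433"
    return topic, json.dumps({"value": value})

topic, payload = mqtt_to_433(NEXA_ON_CODE)
print(topic, payload)
```

The pair could then be published with any MQTT client, e.g. `mosquitto_pub -t <topic> -m <payload>`.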
I found Timbo’s tips on how to make a track from photos, but it did not work straight out of the box for me. The problem was twofold: first I needed to change the curly ” quotes to plain ' quotes, and the gpsdatetime tag also did not exist, but changing it to DateTimeOriginal made it work: exiftool -r -if '$DateTimeOriginal' -fileOrder DateTimeOriginal -p ./gpx.fmt ./*.jpg > track.gpx
It also required me to modify the gpx.fmt file.
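For reference, the track-point part of the gpx.fmt that ships with ExifTool looks roughly like this after the change (reproduced from memory, so treat the exact helper syntax as an assumption; the fix is swapping $gpsdatetime for $DateTimeOriginal in the time line):

```
#[BODY]<trkpt lat="$gpslatitude#" lon="$gpslongitude#">
#[BODY]  <ele>$gpsaltitude#</ele>
#[BODY]  <time>${DateTimeOriginal#;DateFmt("%Y-%m-%dT%H:%M:%SZ")}</time>
#[BODY]</trkpt>
```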
There is a circuit diagram on the Maytech page which, together with the pinout picture on the MakerX page, makes it easy to connect the remote receiver and the VESC.
VESC App settings:
If you are using firmware version 5.03 (VESC Tool 3.1) or above, turn off the phase filter; see here for more details.
Set App to Use to UART – it gives a much quicker response than the Maytech-recommended UART and PPM.
Use the Maytech-recommended Control Type: Current No Reverse, under VESC Remote.
For a quicker response on the remote, change the Input Deadband to something like 3% under VESC Remote.
VESC Motor Settings:
Set Motor Current Max to 140A
Do not set Motor Current Max Brake to 0A but to some negative number like -1A.
Set Absolute Maximum Current to 180A
Here are some screenshots to better explain the settings.
I started flying FPV in October 2020 but have since blown a couple of ESCs. And each time I set up the new ones I ran into problems, so this time I am writing down some notes so that does not happen again…
The most common problem I run into is that I turn on bidirectional DShot in Betaflight, but then the RPM_FILTER arming-disabled flag turns on!! And I cannot figure out why??
The answer and solution is quite simple: the default ESC firmware does not support bidirectional DShot! This article describes how to get it. The ESCs I have need to use BLHeli_S firmware. On my Nazgul5 V2 HD the ESCs have the G-H-30 configuration, and currently I am using version 16.73, so I downloaded the G_H_30_REV16_73.HEX firmware and used BLHeli Configurator to flash it. Other versions can be downloaded from https://github.com/JazzMaverick/BLHeli/tree/JazzMaverick-patch-1/BLHeli_S%20SiLabs
I wanted to have a live video stream from my Snapmaker 2.0, since it now has a spot in a closet. After a couple of tries to use a live video stream from the XiaoFang camera, I realized it would not be a good solution: quite a lot of tweaking to get it to work, and it would require quite some resources from the Raspberry Pi and thus might affect prints. But I still wanted to use that camera since I already had it.
After looking through the OctoPrint forum I saw that some people had used a special firmware on Wyze cameras to turn them into plain webcams and connect them directly to the Raspberry Pi. The Wyze camera and the XiaoFang are the same camera, so would that firmware just work? I found an article indicating it should, so I gave it a try. I went to the firmware page and followed the instructions, built a USB-A to USB-A cable and connected the camera to my PC. Wow, it is a webcam! I connected it to the Raspberry Pi and entered the default settings for the camera. Voila!
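On the OctoPi image those default camera settings live in /boot/octopi.txt. A minimal sketch, assuming the stock OctoPi options (the resolution and framerate values are just examples to tune for your camera):

```
# /boot/octopi.txt – select the USB webcam explicitly
camera="usb"

# mjpg-streamer options passed to the USB camera (example values)
camera_usb_options="-r 1280x720 -f 10"
```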
Last summer I got a Xiaomi XiaoFang 1S camera from Magnus. I tried to get it working then but never succeeded, but today I got some inspiration and managed to get it working and streaming video over RTSP! It was not so hard: just go to the GitHub project Xiaomi-Dafang-Hacks and follow the instructions, but know that mine was a T20L, so start with those specific instructions.
One thing that might have made a difference: this time I did it on my Windows machine and not my Mac, and I really made sure the SD card was formatted as a FAT32 drive.
I also found this firmware that looks interesting: OpenMiko.
I had a bit of a struggle with Let’s Encrypt’s certbot today. Somehow it got 403 Forbidden each time it tried to authenticate the domain for my server, while all the other websites worked without problems.
I created a simple text file in /.well-known/acme-challenge to see if I could access it. It worked externally, but then I realized I got 403 Forbidden when I tried to access it from the server itself!
After lots of digging and testing I checked the Apache general error.log and saw these lines:
[Sun Jan 20 21:52:32.754574 2019] [authz_core:error] [pid 2238] [client 2001:4b98:dc2:47:216:3eff:fe9b:cec6] AH01630: client denied by server configuration: /var/www/
Seeing that IPv6 address gave a clue, and soon I found out it was because my webserver was not set up to listen on IPv6. Adding the IPv6 address to the VirtualHost config solved it:
<VirtualHost 185.26.124.99:80 [2001:4b98:dc2:47:216:3eff:fe9b:cec6]:80>
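In context, the whole vhost block ends up looking roughly like this (ServerName and DocumentRoot are placeholders, not my actual config):

```apacheconf
# Accept requests on both the IPv4 and the IPv6 address, port 80
<VirtualHost 185.26.124.99:80 [2001:4b98:dc2:47:216:3eff:fe9b:cec6]:80>
    # Placeholder values below
    ServerName example.com
    DocumentRoot /var/www/example
</VirtualHost>
```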
I borrowed an Amazon Echo to experiment a bit with what you can do with it and see how good the voice interface is. Someone had mentioned you can use it to play music from Spotify, and I quickly found where to set that up, but the option was greyed out!! It could not be selected. Why?
After some googling I found that it is because the Amazon account is registered in the wrong region. But how do I change it, and what is the right region?
This article gave enough clues to figure out where to change it. The easiest way, though, is to first add a new US address (it turned out the US is the region you must have selected to get the Spotify option). But which address should you fill in? Random addresses gives you a few random ones, but Amazon would not accept them. So I took one of the results, pasted it into Google Maps and then clicked on another part of the street to get a valid address.
The next step was to do as described in the article: click where it says Hello, Karl-Petter (your name) and then choose Your Content and Devices. The article says it is called Manage Your Content and Devices, but nowadays it is named Your Content and Devices. Then just follow the instructions and choose your new US address. You then have to wait a while before your Echo realizes you have changed region. After that, just follow this guide.