Thanks to the recent v4.4.1 BuddyBoard firmware, the HTTP file API works as desired: you can easily upload files to a USB stick attached to the printer. To perform bulk updates of your printer farm, it's much easier to write a simple bash script which deploys the print jobs:
```bash
#!/usr/bin/env bash
# note: no "set -e" here - curl errors are handled manually below

# printer settings
PRINTER_HOST="192.168.1.123"
API_KEY="ToEn8eDlR7kWIiUpVPJg"
FILENAME=myfile.gcode

# capture command stdout - http status code will be written to stdout
# progress bar on stderr
# http response (json) stored in /tmp/.upload-response
CURL_HTTP_STATUS=$(curl \
    --header "X-Api-Key: ${API_KEY}" \
    -F "file=@${FILENAME}" \
    -F "path=" \
    -X POST \
    -o /tmp/.upload-response \
    --write-out "%{http_code}" \
    http://${PRINTER_HOST}/api/files/local)

# get result
CURL_EXITCODE=$?
CURL_RESPONSE=$(cat /tmp/.upload-response)

# success ?
if [ ${CURL_EXITCODE} -ne 0 ] || [ "${CURL_HTTP_STATUS}" -ne "201" ]; then
    echo "error: upload failed (${CURL_HTTP_STATUS})"
else
    echo "upload succeeded"
fi
```
Uploading multiple files and checksums via HTTP can be achieved with cURL and a few lines of bash scripting. This might replace scp in most cases.
```bash
# array of files (and checksums) provided as cURL options
UPLOAD_FILES=()

# get all files within the myUploadDir dir and calculate checksums
while read -r FILE
do
    # get sha256 checksum
    CHECKSUM=$(sha256sum "${FILE}" | awk '{print $1}')
    echo "$FILE"
    echo "$CHECKSUM"

    # extract filename
    FILENAME=$(basename "${FILE}")

    # append file and checksum to curl upload args
    UPLOAD_FILES+=("-F" "file=@${FILE}")
    UPLOAD_FILES+=("-F" "${FILENAME}=${CHECKSUM}")

# get all files within myUploadDir
done <<< "$(find myUploadDir/* -type f | sort)"

# upload
curl \
    -X PUT -H "Content-Type: multipart/form-data" \
    "${UPLOAD_FILES[@]}" \
    https://httpbin.org/put
```
With the release of the new AMD EPYC based cloud servers (CPX), Hetzner has applied some changes to their virtualization platform (QEMU). The network interface names have changed due to the modern virtio_net network adapter (device ID 0x1041) and its different PCIe bus addresses. All Hetzner standard images now use the net.ifnames=0 setting to enforce the kernel […]
In case you're using self-signed X.509 certificates you may see this error message within the traefik logs – the solution is quite easy: the first certificate of your combined PEM file (ca+intermediate+server) has to be the server certificate!
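As a quick sketch of the fix (the filenames below are placeholders – adjust them to your setup), the combined file just has to be concatenated with the server certificate first:

```shell
# build the combined pem file for traefik
# IMPORTANT: the server certificate has to come first,
# followed by the intermediate and the root ca certificate
cat server.crt intermediate.crt ca.crt > combined.pem
```

traefik will then pick the first certificate of combined.pem as the server certificate.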
Using ejs as template engine within the express.js default configuration can be very annoying: you have to pass a dedicated variable set to each response.render() call. But a lot of tasks require some kind of global variables in your templates, e.g. page title, resources and much more.
The most reliable solution is a custom template renderer which invokes ejs in the way you want.
```javascript
const _ejs = require('ejs');

// example: global config
const _config = require('../config.json');

// custom ejs render function
module.exports = function render(filename, payload={}, cb){
    // some default page vars
    payload.page = payload.page || {};
    payload.page.slogan = payload.page.slogan || _config.slogan;
    payload.page.title = payload.page.title || _config.title;
    payload.page.brandname = payload.page.brandname || _config.name;

    // resources
    payload.resources = payload.resources || {};

    // render file
    // you can also pass some ejs lowlevel options
    _ejs.renderFile(filename, payload, {}, cb);
};
```
```javascript
const _express = require('express');
const _webapp = _express();
const _path = require('path');
const _tplengine = require('./my-template-engine');

// set the view engine to ejs
_webapp.set('views', _path.join(__dirname, '../views'));
_webapp.engine('ejs', _tplengine);
_webapp.set('view engine', 'ejs');

// your controller
_webapp.get('/', function(req, res){
    // render the view using additional variables
    res.render('myview', {
        x: 1,
        y: 2
    });
});
```
marked is one of the most popular markdown parsers written in JavaScript. It's quite easy to integrate EnlighterJS within: just pass a custom renderer as option which wraps code blocks in EnlighterJS-ready markup.
File: markdown.js
```javascript
const _marked = require('marked');
const _renderer = new _marked.Renderer();

// escape html specialchars
function escHtml(s){
    return s.replace(/&/g, '&amp;')
            .replace(/"/g, '&quot;')
            .replace(/</g, '&lt;')
            .replace(/>/g, '&gt;');
}

// EnlighterJS Codeblocks
_renderer.code = function(code, lang){
    return `<pre data-enlighter-language="${lang}">${escHtml(code)}</pre>`;
};

const _options = {
    // gfm style line breaks
    breaks: true,

    // custom renderer
    renderer: _renderer
};

// promise proxy
function render(content){
    return new Promise(function(resolve, reject){
        // async rendering
        _marked(content, _options, function(e, html){
            if (e){
                reject(e);
            }else{
                resolve(html);
            }
        });
    });
}

module.exports = {
    render: render
};
```
```javascript
const _markdown = require('./markdown');

// fetch markdown based content
const rawCode = getMarkdownContent(..);

// render content
const html = await _markdown.render(rawCode);
```
Comparing the contents of two directories in a binary-safe way is a commonly used feature, especially for data synchronization tasks. You can easily implement a simple compare algorithm by generating the sha256 checksum of each file – this is not a high-performance solution but it even works on large files!
```javascript
const _fs = require('fs-magic');

// compare directory contents based on sha256 hash tables
async function compareDirectories(dir1, dir2){
    // fetch file lists
    const [files1, dirs1] = await _fs.scandir(dir1, true, true);
    const [files2, dirs2] = await _fs.scandir(dir2, true, true);

    // num files, directories equal ?
    if (files1.length != files2.length){
        throw new Error('The directories contain a different number of files ' + files1.length + '/' + files2.length);
    }
    if (dirs1.length != dirs2.length){
        throw new Error('The directories contain a different number of subdirectories ' + dirs1.length + '/' + dirs2.length);
    }

    // generate file checksums
    const hashes1 = await Promise.all(files1.map(f => _fs.sha256file(f)));
    const hashes2 = await Promise.all(files2.map(f => _fs.sha256file(f)));

    // convert arrays to objects filename=>hash
    const lookup = {};
    for (let i = 0; i < hashes2.length; i++){
        // normalized filenames
        const f2 = files2[i].substr(dir2.length);

        // assign
        lookup[f2] = hashes2[i];
    }

    // compare dir1 to dir2
    for (let i = 0; i < hashes1.length; i++){
        // normalized filenames
        const f1 = files1[i].substr(dir1.length);

        // exists ?
        if (!lookup[f1]){
            throw new Error('File <' + files1[i] + '> does not exist in <' + dir2 + '>');
        }

        // hash valid ?
        if (lookup[f1] !== hashes1[i]){
            throw new Error('File checksum of <' + files1[i] + '> does not match <' + files2[i] + '>');
        }
    }

    return true;
}

await compareDirectories('/tmp/data0', '/tmp/data1');
```
Sometimes you may need a special version of Node.js, or a recent version within a foreign build environment. But in modern container-based infrastructure it is not possible to use apt to install custom packages which are not whitelisted. As a workaround, you can download pre-built binaries via wget into your build directory and add the bin/ dir to your PATH. This allows you to use any pre-built third party software without installation.
```yaml
os: linux
language: perl
perl:
  - "5.24"
  - "5.14"

# skip perl (cpanm) dependency management
# install nodejs into home folder
install:
  # fetch latest nodejs archive
  - wget https://nodejs.org/dist/v8.8.1/node-v8.8.1-linux-x64.tar.gz -O /tmp/nodejs.tgz

  # unzip
  - tar -xzf /tmp/nodejs.tgz

  # add nodejs binaries to path - this has to be done here!
  - export PATH=$PWD/node-v8.8.1-linux-x64/bin:$PATH

  # show node version
  - node -v
  - npm -v

  # install node dependencies
  - npm install

script:
  # syntax check
  - perl -Mstrict -Mdiagnostics -cw rsnapshot

  # run javascript based tests
  - npm test
```
In case your projects make use of external databases like MySQL/MariaDB, you need to set up your continuous integration tests with dedicated testcases including application-specific database structures. This requires some initial steps to load the database dump before starting the tests. Thanks to travis-ci.org you don't need to do this kind of stuff within your application – just use the test configuration!
First of all, we add MySQL Server as a service within our .travis.yml file. This initializes a dedicated database instance for testing. Additionally we hook into the before_install action to initialize our database structure. In this example all SQL commands are loaded from an external file located in our test directory.
```yaml
language: node_js
node_js:
  - "7"
  - "7.6"
  - "8"

services:
  - mysql

before_install:
  - mysql -u root --password="" < test/travis.sql
```
Our test database structure is defined within a dedicated SQL file in test/travis.sql. It contains all necessary commands to add a new user, create the demo database, create the demo tables and finally add some test data.
```sql
# Create Testuser
CREATE USER 'dev'@'localhost' IDENTIFIED BY 'dev';
GRANT SELECT,INSERT,UPDATE,DELETE,CREATE,DROP ON *.* TO 'dev'@'localhost';

# Create DB
CREATE DATABASE IF NOT EXISTS `demo` DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci;
USE `demo`;

# Create Table
CREATE TABLE IF NOT EXISTS `users` (
  `user_id` int(11) NOT NULL,
  `created_on` timestamp NULL DEFAULT NULL,
  `modified_on` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  `username` varchar(50) DEFAULT NULL,
  `salt` varchar(20) DEFAULT NULL,
  `password` varchar(50) DEFAULT NULL,
  `email` varchar(150) DEFAULT NULL,
  `firstname` varchar(50) DEFAULT NULL,
  `lastname` varchar(50) DEFAULT NULL,
  `dob` date DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

ALTER TABLE `users` ADD PRIMARY KEY (`user_id`);
ALTER TABLE `users` MODIFY `user_id` int(11) NOT NULL AUTO_INCREMENT;

# Add Data
```
In most cases, every web application requires some kind of request logging. Especially package downloads will be counted for statistical purposes. When using expressjs, static content is served by the middleware module serve-static.
To count the successful requests handled by this module, you can hook into the setHeaders callback which is invoked each time a file is ready to be delivered (file exists, file is accessible).
```javascript
// utility
const _path = require('path');

// expressjs
const _express = require('express');
let _webapp = _express();

// your statistic module
const _downloadStats = require('./download-counter');

// serve static package files
_webapp.use('/downloads', _express.static(_path.join(__dirname, 'downloads'), {
    // setHeaders is only called on success (stat available/file found)
    setHeaders: function(res, path, stat){
        // count request: full-path, file-stats, client-ip
        _downloadStats(path, stat, res.req.connection.remoteAddress);
    }
}));
```