Capture Raspberry Pi Camera output, convert and stream video, decode and play in browser.
The repository is available on GitHub.
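The repository has all the details, but as a rough idea of the server side, here is a minimal sketch (not the repository’s actual code) that captures H.264 with raspivid and relays each chunk to connected browsers over WebSockets using the ws package:

```js
// Minimal sketch of one possible server-side pipeline (not the repo's actual code):
// capture raw H.264 from the camera with raspivid and relay it over WebSockets.
const { spawn } = require('child_process')
const WebSocket = require('ws')

const wss = new WebSocket.Server({ port: 8080 })

// -t 0: capture forever, -o -: write the stream to stdout
const camera = spawn('raspivid', ['-t', '0', '-w', '640', '-h', '480', '-fps', '25', '-o', '-'])

camera.stdout.on('data', (chunk) => {
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) client.send(chunk)
  })
})
```

The browser side then needs a decoder to consume the stream; the repository covers the actual conversion and playback steps.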
Good morning everyone,
Reddit is an American social news aggregation, web content rating, and discussion website.
And it exposes a cool API to engineer your own front-end.
So here comes my take on it: http://reddit.webmaestro.fr.
It basically fetches JSON data, parses image and video sources, embedded players and templated content, and turns them into optimized React components. The design is minimalistic and built entirely on Semantic UI.
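For instance, any Reddit listing is available as JSON by appending .json to its URL. A simplified sketch of the kind of fetch involved (the real parsing of players and templates is more involved):

```js
// Fetch a page of posts from a subreddit listing and keep only a few fields.
const fetchPosts = async (subreddit = 'all', after = '') => {
  const response = await fetch(`https://www.reddit.com/r/${subreddit}.json?after=${after}`)
  const { data } = await response.json()
  return data.children.map(({ data: post }) => ({
    title: post.title,
    url: post.url,
    thumbnail: post.thumbnail,
  }))
}
```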
With iframes, video players and heavy GIFs mixed into an infinite scroll, the main idea was to mount and unmount components depending on their visibility. And it was just too easy with react-lazyload.
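Roughly, every post gets wrapped in a LazyLoad component so it is only rendered once it approaches the viewport (simplified sketch, with Post standing in for the real component):

```jsx
import React from 'react'
import LazyLoad from 'react-lazyload'

// Placeholder component standing in for the real post renderer.
const Post = ({ post }) => <article>{post.title}</article>

// Each post only mounts when it gets close to the viewport,
// which keeps the infinite scroll light.
const Feed = ({ posts }) => (
  <div>
    {posts.map((post) => (
      <LazyLoad key={post.id} height={300} offset={200} placeholder={<div>Loading…</div>}>
        <Post post={post} />
      </LazyLoad>
    ))}
  </div>
)
```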
I had some fun setting the volume of videos in relation to their position in the viewport. It works pretty well, try it out!
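The idea is roughly this (a simplified sketch, not my exact code): the closer a video’s center is to the center of the viewport, the louder it plays.

```js
// Map the distance between a video's center and the viewport's center to a volume.
const updateVolume = (video) => {
  const rect = video.getBoundingClientRect()
  const videoCenter = rect.top + rect.height / 2
  const viewportCenter = window.innerHeight / 2
  const distance = Math.abs(videoCenter - viewportCenter)
  video.volume = Math.max(0, 1 - distance / viewportCenter)
}

window.addEventListener('scroll', () => {
  document.querySelectorAll('video').forEach(updateVolume)
})
```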
I delivered an Angular JavaScript front-end for an Access-Control System with Facial Recognition, refactored the API and the entire Python back-end on the IoT device, built the Desktop setup application and even contributed to the Hardware development!
A little while ago, I wrote a setup script and a web application for a wireless Raspberry Pi bridge.
This package wirelessly connects a Raspberry Pi to available WiFi networks and bridges the connection to an access point.
You will need a Raspberry Pi 3 and an extra WiFi adapter.
More details on the GitHub repository.
Here is an Electron application built on React and Redux.
To try it with your browser, visit nutflex.webmaestro.fr. You can also download one of these Desktop distributions: OSX, Linux or Windows.
It uses the Movie Database API to fetch information about movies and TV shows.
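A simplified sketch of the kind of request it makes against TMDB’s v3 API (you need your own API key):

```js
const API_KEY = 'your-tmdb-api-key' // placeholder, get one from themoviedb.org

// Search for movies matching a query and return TMDB's result objects
// (title, overview, poster_path, etc.).
const searchMovies = async (query) => {
  const url = `https://api.themoviedb.org/3/search/movie?api_key=${API_KEY}&query=${encodeURIComponent(query)}`
  const response = await fetch(url)
  const { results } = await response.json()
  return results
}
```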
I should share my experiments on this blog more often, and maybe not only the 3D JavaScript game development that I’m just having fun with. But anyway, here is the latest one: a third-person planet exploration thing.
It’s not exactly “work in progress” since I don’t really plan on improving any of it from here, so just consider this work bare and unfinished. You can try it out here: battle-royale.webmaestro.fr.
The 3D models used, such as the trees and stones, are from Poly by Google.
The planet is generated when the page loads. Noise is applied to the length of the sphere’s vertices to create the terrain, and “biomes” (materials and 3D models) are set according to elevation and latitude.
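The gist of the terrain generation, as a simplified sketch (assuming a 3D noise function such as one from the simplex-noise package is passed in):

```js
import * as THREE from 'three'

// Displace every vertex of a sphere along its normal using 3D noise.
const generatePlanet = (radius, noise3D) => {
  const geometry = new THREE.SphereGeometry(radius, 128, 128)
  const position = geometry.attributes.position
  const vertex = new THREE.Vector3()
  for (let i = 0; i < position.count; i++) {
    vertex.fromBufferAttribute(position, i)
    // Sample noise on the unit sphere, then scale the vertex length with it
    const n = noise3D(vertex.x / radius, vertex.y / radius, vertex.z / radius)
    vertex.setLength(radius * (1 + 0.1 * n))
    position.setXYZ(i, vertex.x, vertex.y, vertex.z)
  }
  geometry.computeVertexNormals()
  return geometry
}
```

The elevation (vertex length) and latitude then decide which material and which 3D models each area gets.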
Controls are the classic W, A, S, D and mouse. I had to adapt the “third-person” logic to rotate around the planet rather than translate through space.
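In essence (a simplified sketch, with forward assumed to be a unit vector tangent to the surface, derived from the keys and mouse):

```js
import * as THREE from 'three'

// Move by rotating the player's position around the planet's center
// instead of translating it in a straight line.
const movePlayer = (player, forward, speed, delta) => {
  const up = player.position.clone().normalize()
  // Axis perpendicular to both the surface normal and the walking direction
  const axis = new THREE.Vector3().crossVectors(up, forward).normalize()
  // Angular distance covered this frame (arc length / radius)
  const angle = (speed * delta) / player.position.length()
  player.position.applyAxisAngle(axis, angle)
  // Keep the up vector aligned with the new surface normal (used by lookAt)
  player.up.copy(player.position).normalize()
}
```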
There is a day and night cycle that depends on where the player is positioned on the globe. The sun and the moon cast light from opposite sides while revolving around the planet.
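Something like this, roughly (simplified sketch):

```js
import * as THREE from 'three'

// The sun and the moon are two directional lights orbiting on opposite sides,
// so the local time of day depends on where you stand on the globe.
const sun = new THREE.DirectionalLight(0xffffff, 1)
const moon = new THREE.DirectionalLight(0x8899ff, 0.2)

const updateCycle = (elapsed, dayLength = 120) => {
  const angle = (elapsed / dayLength) * Math.PI * 2
  sun.position.set(Math.cos(angle), 0, Math.sin(angle)).multiplyScalar(100)
  moon.position.copy(sun.position).negate()
}
```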
The water is… just ugly. There is no collision detection. And the character is a simple cone.
Oh, and there is no server to make it a multi-player shooter, even though that was the inspiration. The idea came when a friend showed me the very entertaining Fortnite. We thought it would be fun to turn this “Battle Royale” island into a planet. Instead of a “storm” shrinking toward the gathered players, we could simply reduce the radius of the spherical terrain… That was the concept.
Maybe I could post details about the code if anyone is interested. In the meantime, I have other things to focus on!