I just made a poor man's TikTok app.

Artiya · Published in BOOTLEGSOFT · 6 min read · Mar 10, 2021

Motivation

I have been trying to build a video platform since university, back when YouTube was the next big thing. I called it “MyTube” (my YouTube copycat) and submitted it as a programming course project. One of the reasons it never went out into the world is that running a video site is too expensive: the cost of hosting, storage, and bandwidth is not something a broke student can afford. Time went by, and people now consume even more video content from platforms like YouTube, Facebook, and TikTok. I have learned a lot, and internet technology has advanced in a decentralized direction. I like watching cute girls dance in short videos too. I'm still broke, so what if I could build a video platform without spending any money?

Tech Stack

A few months ago, I researched how video streaming like TikTok's works, but for a specific user segment. I found that we can use the HLS (HTTP Live Streaming) technique: encode the video into small segment files and create playlists for different quality levels. That makes video hosting possible without a special streaming server; a normal HTTP server serving the small files is enough. Storage, and a CDN to make streaming faster, were still the missing parts. I had successfully implemented this on Google Cloud using Cloud CDN with a backend bucket. It is not a complete solution, though, because it still needs an encoding service to turn the uploaded videos into HLS. A Compute Engine instance with a GPU attached is a viable option, but it comes with a heavy price. Another option is a managed service like Cloudflare Stream, but that would cost me a kidney per month for the starter package. My employer at the time gladly purchased the service with taxpayer money. Now, though, I have to find a way to achieve this without spending any money.

HLS file structure
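To make the structure concrete, here is a hedged sketch of what the playlists look like: a master playlist pointing at two quality levels, and one variant playlist listing its segments. The file names, bitrates, and segment lengths are made up for illustration, not taken from the app.

```
#EXTM3U
# Master playlist: one entry per quality level
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=480x854
480p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=720x1280
720p/index.m3u8
```

```
#EXTM3U
# Variant playlist (720p/index.m3u8): an ordered list of short video segments
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.0,
seg_000.ts
#EXTINF:4.0,
seg_001.ts
#EXTINF:4.0,
seg_002.ts
#EXT-X-ENDLIST
```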

Encoding — I love how fast smartphone processors have become. I felt the phone could be used as the main video encoder, especially since videos on this platform will be really short, around ~15s. I tested FFmpeg encoding on my not-so-fast Xiaomi Mi 9 and an iPhone SE 2020, and the result was around 0.3x, meaning a 10s video takes ~33s to encode to HLS format. Not so bad, right? Encoding video on a smartphone is doable. I found an FFmpeg HLS encoding shell script in a GitHub gist and converted it into React Native code so it runs inside the mobile app.
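For reference, a minimal sketch of what that conversion can look like, assuming react-native-ffmpeg's RNFFmpeg.execute API; the paths, bitrate, and HLS flags here are illustrative, the real command came from the gist mentioned above.

```typescript
import { RNFFmpeg } from 'react-native-ffmpeg';

// Encode a short clip into one HLS rendition (index.m3u8 + .ts segments).
// Paths and encoder settings are placeholders, not the app's exact values.
async function encodeToHls(inputPath: string, outDir: string): Promise<boolean> {
  const cmd =
    `-i ${inputPath} ` +
    // Assumes an ffmpeg build that ships the x264 and AAC encoders.
    `-c:v libx264 -preset veryfast -b:v 2000k -c:a aac ` +
    `-hls_time 4 -hls_list_size 0 -hls_playlist_type vod ` +
    `-hls_segment_filename ${outDir}/seg_%03d.ts ` +
    `${outDir}/index.m3u8`;

  const returnCode = await RNFFmpeg.execute(cmd);
  return returnCode === 0; // 0 means ffmpeg finished successfully
}
```

To get several quality levels like in the playlist sketch above, the same command can simply run once per rendition, and the master playlist is a small text file the app writes itself.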

Storage and CDN — The most important part of the platform, and the one that would normally cost an arm and a leg to serve. I use IPFS (the distributed web) to store and distribute the files in a peer-to-peer way. It is just like BitTorrent, except you can access the data through an HTTP gateway, which makes it usable for serving static web content like HTML, images, and video. The small HLS files are a great fit for IPFS gateways. I built something like IPFS for my senior project, using a DHT (Distributed Hash Table), more specifically Kademlia, plus p2p transport mixed with NAT traversal. IPFS is developed by Protocol Labs, the company that created Filecoin and raised $250M in an ICO. Filecoin adds an incentive layer on top of IPFS. No, thank you, I would love to just use IPFS for free. I still need to run a node to host the original content, but remember, I have no money. The best option is the free IPFS service from Infura, which lets me upload files directly to their IPFS node cluster. So the storage is now free, courtesy of a crypto money company.
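The upload itself is just an HTTP call; a hedged sketch, assuming Infura's public IPFS API endpoint (https://ipfs.infura.io:5001/api/v0/add) and React Native's FormData file support, with function and field names of my own choosing:

```typescript
// Upload one local file (a playlist or a segment) to IPFS through Infura
// and return its content hash (CID).
async function uploadToIpfs(fileUri: string, fileName: string): Promise<string> {
  const form = new FormData();
  // React Native's FormData accepts a { uri, name, type } descriptor for local files.
  form.append('file', {
    uri: fileUri,
    name: fileName,
    type: 'application/octet-stream',
  } as any);

  const res = await fetch('https://ipfs.infura.io:5001/api/v0/add', {
    method: 'POST',
    body: form,
  });
  const json = await res.json(); // response shape: { Name, Hash, Size }
  return json.Hash;
}
```

One detail the sketch skips: the playlist references its segments by relative paths, so in practice the whole HLS output directory has to end up under a single IPFS hash (or the playlist has to be rewritten to point at per-segment gateway URLs) before the index hash is worth storing.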

The best CDN I can think of is Cloudflare, whose video streaming CDN service costs a kidney, but somehow Cloudflare's IPFS gateway CDN is free, with one limitation: “video streaming is not allowed”. If I served whole MP4 files it would not be possible to use the Cloudflare IPFS gateway, but I only serve small HLS segment files, not one big video file, so it works just fine. Using the default IPFS gateway (https://ipfs.io/ipfs) is still an option, but Cloudflare's gateway is faster. Now we have the best free CDN. Thanks, Cloudflare, please don't take my kidney, take my heart instead.

Cloudflare IPFS Gateway limitation.
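Turning a stored hash into a streamable URL is then just string formatting; a tiny sketch, assuming the hash points at the directory that holds index.m3u8 (cloudflare-ipfs.com is Cloudflare's public gateway, ipfs.io the default fallback):

```typescript
const CLOUDFLARE_GATEWAY = 'https://cloudflare-ipfs.com/ipfs';
const DEFAULT_GATEWAY = 'https://ipfs.io/ipfs';

// Build the URL of the HLS index playlist for a video stored on IPFS.
// `dirHash` is assumed to be the CID of the directory containing index.m3u8.
function playlistUrl(dirHash: string, useFallback = false): string {
  const gateway = useFallback ? DEFAULT_GATEWAY : CLOUDFLARE_GATEWAY;
  return `${gateway}/${dirHash}/index.m3u8`;
}
```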

Front-end App — Apart from encoding video on the client, the app needs the cheapest possible way to build the whole user experience: smooth scrolling through a TikTok-like video timeline, shooting, and uploading videos. React Native is currently my first choice for mobile apps, and the Expo environment is really good, with plenty of libraries and documentation. The most important libraries are expo-av for the video player and expo-camera for recording video. It is not perfect: expo-av cannot adapt the bitrate fast enough before a short video ends, so I had to select the bitrate manually in code. expo-camera on Android cannot use video stabilization, and the workaround is to let users pick videos recorded with the phone's own camera app. The performance of the React Native app is not the best; scrolling through videos, you can sometimes spot an fps drop. react-native-ffmpeg also forces the app to detach from the managed Expo environment. Finally, it is necessary to unload video views before they eat all of the app's memory.
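Here is a rough sketch of that unloading part, assuming expo-av's Video ref API (playAsync/unloadAsync) and a plain FlatList; the component and field names are mine, not the app's:

```tsx
import React, { useRef } from 'react';
import { Dimensions, FlatList } from 'react-native';
import { Video } from 'expo-av';

type FeedItem = { id: string; playlistUrl: string };

export function VideoFeed({ items }: { items: FeedItem[] }) {
  // Keep a handle on every mounted player so off-screen ones can be unloaded.
  const players = useRef<Map<string, Video>>(new Map());

  const onViewableItemsChanged = useRef(({ viewableItems }: any) => {
    const visibleIds = new Set(viewableItems.map((v: any) => v.item.id));
    players.current.forEach((player, id) => {
      if (visibleIds.has(id)) {
        // (Re)start the visible video; a real app would reload it here
        // if it had been unloaded earlier while scrolled away.
        player.playAsync().catch(() => {});
      } else {
        // Free the decoder and buffers of everything off-screen.
        player.unloadAsync().catch(() => {});
      }
    });
  }).current;

  return (
    <FlatList
      data={items}
      keyExtractor={(item) => item.id}
      pagingEnabled
      onViewableItemsChanged={onViewableItemsChanged}
      viewabilityConfig={{ itemVisiblePercentThreshold: 80 }}
      renderItem={({ item }) => (
        <Video
          ref={(ref) => ref && players.current.set(item.id, ref)}
          source={{ uri: item.playlistUrl }}
          style={{ height: Dimensions.get('window').height }}
          shouldPlay
          isLooping
        />
      )}
    />
  );
}
```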

Timeline feed — I don't want to build a back-end for the app, and there is no such thing as a free backend service, so everything has to happen in the front-end app. Firebase Cloud Firestore is a good choice to start with: it is free at the beginning, and if many more users start using the app I will either go bankrupt or have to sell the user data for personalized ads. For now, nothing to worry about. I use Firestore to store the IPFS hash of the HLS index playlist and the hash of a cover image snapped from the first frame of the video (to make the video feel like it loads really fast; it still loads really fast anyway). I also use Firebase Anonymous Authentication to get a stable user ID and some level of data security, and it makes it easy to add proper sign-in later and evolve this into a social platform with following, likes, and comments. For now, all users can do is watch videos and upload new ones. The timeline feed is straightforward: sort the uploaded videos by upload timestamp and show only the total view count. Not even a video title or caption is implemented yet; it is just a pure video app. I would love to try on-device ML to personalize the feed, but maybe another time.
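A minimal sketch of that feed logic, assuming the v8-style Firebase JS SDK that was current at the time; the collection and field names (videos, playlistHash, coverHash, createdAt, views) are my own placeholders:

```typescript
import firebase from 'firebase/app';
import 'firebase/auth';
import 'firebase/firestore';

// Anonymous sign-in gives every install a stable user ID with no sign-up flow.
async function ensureUser(): Promise<string> {
  const cred = await firebase.auth().signInAnonymously();
  return cred.user!.uid;
}

// Publishing a video only writes the IPFS hashes and a little metadata;
// the heavy bytes stay on IPFS.
async function publishVideo(playlistHash: string, coverHash: string, uid: string) {
  await firebase.firestore().collection('videos').add({
    playlistHash,
    coverHash,
    uploader: uid,
    views: 0,
    createdAt: firebase.firestore.FieldValue.serverTimestamp(),
  });
}

// The timeline feed is just "newest first".
async function fetchFeed(pageSize = 20) {
  const snap = await firebase
    .firestore()
    .collection('videos')
    .orderBy('createdAt', 'desc')
    .limit(pageSize)
    .get();
  return snap.docs.map((doc) => ({ id: doc.id, ...doc.data() }));
}
```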

HNOOM App Architecture

The Result

I named the app HNOOM. It's a made-up word. The HLS encoding on the phone is not particularly fast, but it's usable. Scrolling through the video list is fine, not the best, and you can see some fps drops when scrolling quickly. The content loads really fast, thanks to Cloudflare. Sometimes the video player does not work on iOS, and I had to fall back to another player library, react-native-video. The expo-av player on Android does not adapt to the available bandwidth. The underlying native HLS players behind the React Native libraries are really buggy, and their memory management is not a good fit for this app's use case. The overall experience is fine, but not good enough for a production-ready app built with React Native. At least the concept of making a short-video app without breaking the bank is proven. I really think a native app would be the better choice for real-world use. The next thing is finding a cute girl to dance on the app :P

Source code:

https://github.com/artiya4u/hnoom
