Compare commits
7 Commits
submit-web ... v1.1.0

| Author | SHA1 | Date |
|---|---|---|
|  | 9b4c0b24ef |  |
|  | c418634515 |  |
|  | 87ac0cecb7 |  |
|  | 1d49689171 |  |
|  | fdd41397bf |  |
|  | 7e5c92dd63 |  |
|  | 83f2b76cbc |  |

51  README.md
@@ -8,53 +8,6 @@ A light weight personal music streaming platform.

[toc]

## TODO

- Restructure the project to prepare for multi-person collaboration

### Front-end changes

- Fix the page CSS overflow issue
- Display the execution time of each operation

The site needs at least 10 pages (currently 5); the following pages are planned:

- File detail page, for editing a single file's metadata
- File comment page, for commenting on a file
- Latest activity page, showing recently played tracks and recent comments
- Login/registration page, replacing the current token logic
- FfmpegConfigs configuration page
- Feedback review page

### Back-end changes

- Return the execution time of each operation
- Fix the bug where a transcode that is incomplete in Prepare mode is still recorded by tmpfs as a successful transcode
- Change FfmpegConfigs from the current dictionary format to a list format
- Add a single-threaded database lock for sqlite3 (see the sketch after this list)
- Add foreign key constraints
- Make the Update feature automatically detect and skip duplicate items, adding only new ones
- Change token verification to nickname + password; the administrator uses the reserved keyword `admin` as the nickname
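A minimal sketch of the planned single-threaded lock for sqlite3, assuming it is implemented as a `sync.Mutex` inside the existing `Database` type and taken around every statement execution; the `mu` field and the wrapped method are illustrations, not code that exists in the repository.

```go
package database

import (
	"database/sql"
	"sync"
)

// Stmt is abridged here; the real struct in database.go holds every prepared statement.
type Stmt struct {
	insertFile *sql.Stmt
}

// Sketch only: serialize all sqlite3 access behind a single mutex.
// The mu field is an assumption and does not exist in the current code.
type Database struct {
	sqlConn *sql.DB
	stmt    *Stmt
	mu      sync.Mutex // guards every statement execution
}

// InsertFile shows the intended pattern: lock, execute, unlock.
func (database *Database) InsertFile(folderId int64, filename string, filesize int64) error {
	database.mu.Lock()
	defer database.mu.Unlock()
	_, err := database.stmt.insertFile.Exec(folderId, filename, filesize)
	return err
}
```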
The schema needs 8 entities and 6 relationships; it currently has 3 entities and 1 relationship.

Currently present:

- files: file table, 50,000 rows
- folders: folder table, 3,000 rows
- feedbacks: feedback message table

Planned additions (a schema sketch follows this list):

- users: user table
- comments: comment table
- playbacks: playback history table
- likes: like history table
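A minimal sketch of what two of the planned tables might look like, written in the same style as the `init*TableQuery` constants in database.go and using the foreign key constraints mentioned above; every column name here is an assumption for illustration only.

```go
package database

// Sketch only: possible schema for the planned users and comments tables.
// Column names are assumptions, not part of the repository.
var initUsersTableQuery = `CREATE TABLE IF NOT EXISTS users (
	id INTEGER PRIMARY KEY,
	nickname TEXT NOT NULL UNIQUE,
	password_hash TEXT NOT NULL
);`

var initCommentsTableQuery = `CREATE TABLE IF NOT EXISTS comments (
	id INTEGER PRIMARY KEY,
	user_id INTEGER NOT NULL,
	file_id INTEGER NOT NULL,
	time INTEGER NOT NULL,
	comment TEXT NOT NULL,
	FOREIGN KEY (user_id) REFERENCES users (id),
	FOREIGN KEY (file_id) REFERENCES files (id)
);`
```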
## Compilation & Build

## How to build

### Build the back-end server
@@ -411,10 +364,10 @@ Anonymous API can be called by anonymous.

Currently only a few APIs are used in the front-end.

- `/#/share/39`
- `/#/files/39/share`

Share a specific file.

- `/#/search-folders/2614`
- `/#/folders/2614`

Show the files in a specific folder.
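For context, a minimal sketch of calling one of the token-free back-end endpoints behind these pages (`/api/v1/search_files`, registered in `NewAPI`); the base URL is an assumption, since the listen address comes from the `addr` field of the API config.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Body fields match SearchFilesRequest: filename, limit, offset.
	reqBody, _ := json.Marshal(map[string]interface{}{
		"filename": "mozart", // substring to search for (example value)
		"limit":    10,       // CheckLimit accepts 1 to 10
		"offset":   0,
	})
	// The host and port are assumptions; the server listens on the configured "addr".
	resp, err := http.Post("http://localhost:8080/api/v1/search_files",
		"application/json", bytes.NewReader(reqBody))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var result map[string]interface{}
	json.NewDecoder(resp.Body).Decode(&result)
	fmt.Println(result["files"])
}
```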
@@ -1,36 +0,0 @@

# DBMS Group Project Problem Description

- Group 1

The build-out of Internet infrastructure has made network speeds grow quickly. With fast Internet access, people are gradually migrating data and services of all kinds to the cloud. NetEase Cloud Music, Spotify, and Apple Music are examples; we call them streaming media platforms. On a streaming media platform, users purchase the digital rights to music and then play that music online on the platform.

Generally speaking, users cannot buy music that is not available on the platform. Users cannot download the digital file of the music (they purchase the right to play, not the right to copy). Users cannot upload their own music to the platform.

However, even in the era of digital copyright, there are still many advantages to owning the original music files: no need to install a dedicated player; free copying to other devices (without violating copyright); no risk of music becoming unavailable on the platform; and no play records or other private data tracked by the platform.

Some people don't like streaming platforms. They like to collect music (downloads or CDs) and save it on their computers. But as the collection grows (over 70,000 songs totaling 800 GB), managing the files becomes very difficult. It is hard to find where the songs they want to listen to are saved, and lossless music files are large and difficult to play online.

Since no such "self-hosted music streaming platform" software is available, we decided to develop a project based on database knowledge to help people who have collected a lot of music enjoy it simply.

We will handle various related types of data in our database, including song name, album name, file size, update date, rating, comment, user information, and so on. These data are highly relational, so a relational database is a good choice.

The features of the project we designed are as follows:

- Open. Independent front-end (GUI) and back-end (server program) that communicate over an API.
- Easy to use. Minimal dependencies, allowing users to configure it quickly and simply.
- Lightweight. The program is small and quick to install.
- High performance. Only do what should be done; no features that would lead to poor performance.
- Cross-platform. The project can run on computers and mobile phones; on Linux, Windows, and macOS; and on x86 and ARM processor architectures.
- Extensibility. Can work with cloud OSS (Object Storage Service), a reverse proxy, or other external software.

Our project has the following functions:

- Index files. Index local files into the database.
- Search. Search for music by name/album/tag/comment, sorted by rating or other columns.
- Play. Play music online, play music randomly, and play music at a low bit rate on a bad network.
- User management. Users can register and log in.
- Comment. Users can like or comment on the music.
- Management. The administrator can upload music and update or delete the database.
- Share. Generate a link to share the music with others.

After research and discussion, in order to meet the above requirements, we decided to use the Golang programming language for the back end, SQLite as the database, and Vue for the front-end GUI.
@@ -1,750 +0,0 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"encoding/json"
|
||||
"errors"
|
||||
"io"
|
||||
"log"
|
||||
"msw-open-music/internal/pkg/database"
|
||||
"msw-open-music/internal/pkg/tmpfs"
|
||||
"net/http"
|
||||
"os"
|
||||
"os/exec"
|
||||
"strconv"
|
||||
"strings"
|
||||
"time"
|
||||
)
|
||||
|
||||
type API struct {
|
||||
Db *database.Database
|
||||
Server http.Server
|
||||
token string
|
||||
APIConfig APIConfig
|
||||
Tmpfs *tmpfs.Tmpfs
|
||||
}
|
||||
|
||||
type FfmpegConfigList struct {
|
||||
FfmpegConfigList []FfmpegConfig `json:"ffmpeg_config_list"`
|
||||
}
|
||||
|
||||
type AddFfmpegConfigRequest struct {
|
||||
Token string `json:"token"`
|
||||
Name string `json:"name"`
|
||||
FfmpegConfig FfmpegConfig `json:"ffmpeg_config"`
|
||||
}
|
||||
|
||||
type FfmpegConfig struct {
|
||||
Name string `json:"name"`
|
||||
Args string `json:"args"`
|
||||
}
|
||||
|
||||
type Status struct {
|
||||
Status string `json:"status,omitempty"`
|
||||
}
|
||||
var ok Status = Status{
|
||||
Status: "OK",
|
||||
}
|
||||
|
||||
type WalkRequest struct {
|
||||
Token string `json:"token"`
|
||||
Root string `json:"root"`
|
||||
Pattern []string `json:"pattern"`
|
||||
}
|
||||
|
||||
type ResetRequest struct {
|
||||
Token string `json:"token"`
|
||||
}
|
||||
|
||||
type SearchFilesRequest struct {
|
||||
Filename string `json:"filename"`
|
||||
Limit int64 `json:"limit"`
|
||||
Offset int64 `json:"offset"`
|
||||
}
|
||||
|
||||
type SearchFoldersRequest struct {
|
||||
Foldername string `json:"foldername"`
|
||||
Limit int64 `json:"limit"`
|
||||
Offset int64 `json:"offset"`
|
||||
}
|
||||
|
||||
type SearchFilesResponse struct {
|
||||
Files []database.File `json:"files"`
|
||||
}
|
||||
|
||||
type SearchFoldersResponse struct {
|
||||
Folders []database.Folder `json:"folders"`
|
||||
}
|
||||
|
||||
type GetFilesInFolderRequest struct {
|
||||
Folder_id int64 `json:"folder_id"`
|
||||
Limit int64 `json:"limit"`
|
||||
Offset int64 `json:"offset"`
|
||||
}
|
||||
|
||||
type GetFilesInFolderResponse struct {
|
||||
Files *[]database.File `json:"files"`
|
||||
}
|
||||
|
||||
type GetRandomFilesResponse struct {
|
||||
Files *[]database.File `json:"files"`
|
||||
}
|
||||
|
||||
func (api *API) HandleGetRandomFiles(w http.ResponseWriter, r *http.Request) {
|
||||
files, err := api.Db.GetRandomFiles(10)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
getRandomFilesResponse := &GetRandomFilesResponse{
|
||||
Files: &files,
|
||||
}
|
||||
log.Println("[api] Get random files")
|
||||
json.NewEncoder(w).Encode(getRandomFilesResponse)
|
||||
}
|
||||
|
||||
func (api *API) HandleGetFilesInFolder(w http.ResponseWriter, r *http.Request) {
|
||||
getFilesInFolderRequest := &GetFilesInFolderRequest{
|
||||
Folder_id: -1,
|
||||
}
|
||||
|
||||
err := json.NewDecoder(r.Body).Decode(getFilesInFolderRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty
|
||||
if getFilesInFolderRequest.Folder_id < 0 {
|
||||
api.HandleErrorString(w, r, `"folder_id" can't be none or negative`)
|
||||
return
|
||||
}
|
||||
|
||||
files, err := api.Db.GetFilesInFolder(getFilesInFolderRequest.Folder_id, getFilesInFolderRequest.Limit, getFilesInFolderRequest.Offset)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
getFilesInFolderResponse := &GetFilesInFolderResponse{
|
||||
Files: &files,
|
||||
}
|
||||
|
||||
log.Println("[api] Get files in folder", getFilesInFolderRequest.Folder_id)
|
||||
|
||||
json.NewEncoder(w).Encode(getFilesInFolderResponse)
|
||||
}
|
||||
|
||||
func (api *API) CheckToken(w http.ResponseWriter, r *http.Request, token string) (error) {
|
||||
if token != api.token {
|
||||
err := errors.New("token not matched")
|
||||
log.Println("[api] [Warning] Token not matched", token)
|
||||
api.HandleErrorCode(w, r, err, 403)
|
||||
return err
|
||||
}
|
||||
log.Println("[api] Token passed")
|
||||
return nil
|
||||
}
|
||||
|
||||
func (api *API) HandleError(w http.ResponseWriter, r *http.Request, err error) {
|
||||
api.HandleErrorString(w, r, err.Error())
|
||||
}
|
||||
|
||||
func (api *API) HandleErrorCode(w http.ResponseWriter, r *http.Request, err error, code int) {
|
||||
api.HandleErrorStringCode(w, r, err.Error(), code)
|
||||
}
|
||||
|
||||
func (api *API) HandleErrorString(w http.ResponseWriter, r *http.Request, errorString string) {
|
||||
api.HandleErrorStringCode(w, r, errorString, 500)
|
||||
}
|
||||
|
||||
func (api *API) HandleErrorStringCode(w http.ResponseWriter, r *http.Request, errorString string, code int) {
|
||||
log.Println("[api] [Error]", code, errorString)
|
||||
errStatus := &Status{
|
||||
Status: errorString,
|
||||
}
|
||||
w.WriteHeader(code)
|
||||
json.NewEncoder(w).Encode(errStatus)
|
||||
}
|
||||
|
||||
func (api *API) HandleReset(w http.ResponseWriter, r *http.Request) {
|
||||
resetRequest := &ResetRequest{}
|
||||
err := json.NewDecoder(r.Body).Decode(resetRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check token
|
||||
err = api.CheckToken(w, r, resetRequest.Token)
|
||||
if err != nil {
|
||||
return
|
||||
}
|
||||
|
||||
// reset
|
||||
err = api.Db.ResetFiles()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
err = api.Db.ResetFolder()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
api.HandleStatus(w, r, "Database reset")
|
||||
}
|
||||
|
||||
func (api *API) HandleWalk(w http.ResponseWriter, r *http.Request) {
|
||||
walkRequest := &WalkRequest{}
|
||||
err := json.NewDecoder(r.Body).Decode(walkRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check token match
|
||||
err = api.CheckToken(w, r, walkRequest.Token)
|
||||
if err != nil {
|
||||
return
|
||||
}
|
||||
|
||||
// check root empty
|
||||
if walkRequest.Root == "" {
|
||||
api.HandleErrorString(w, r, `key "root" can't be empty`)
|
||||
return
|
||||
}
|
||||
|
||||
// check pattern empty
|
||||
if len(walkRequest.Pattern) == 0 {
|
||||
api.HandleErrorString(w, r, `"[]pattern" can't be empty`)
|
||||
return
|
||||
}
|
||||
|
||||
// walk
|
||||
err = api.Db.Walk(walkRequest.Root, walkRequest.Pattern)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
api.HandleStatus(w, r, "Database updated")
|
||||
}
|
||||
|
||||
func (api *API) HandleOK(w http.ResponseWriter, r *http.Request) {
|
||||
json.NewEncoder(w).Encode(&ok)
|
||||
}
|
||||
|
||||
func (api *API) HandleStatus(w http.ResponseWriter, r *http.Request, status string) {
|
||||
s := &Status{
|
||||
Status: status,
|
||||
}
|
||||
|
||||
json.NewEncoder(w).Encode(s)
|
||||
}
|
||||
|
||||
func (api *API) HandleSearchFiles(w http.ResponseWriter, r *http.Request) {
|
||||
searchFilesRequest := &SearchFilesRequest{}
|
||||
err := json.NewDecoder(r.Body).Decode(searchFilesRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty
|
||||
if searchFilesRequest.Filename == "" {
|
||||
api.HandleErrorString(w, r, `"filename" can't be empty`)
|
||||
return
|
||||
}
|
||||
if api.CheckLimit(w, r, searchFilesRequest.Limit) != nil {
|
||||
return
|
||||
}
|
||||
|
||||
searchFilesResponse := &SearchFilesResponse{}
|
||||
|
||||
searchFilesResponse.Files, err = api.Db.SearchFiles(searchFilesRequest.Filename, searchFilesRequest.Limit, searchFilesRequest.Offset)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Search files", searchFilesRequest.Filename, searchFilesRequest.Limit, searchFilesRequest.Offset)
|
||||
|
||||
json.NewEncoder(w).Encode(searchFilesResponse)
|
||||
}
|
||||
|
||||
func (api *API) CheckLimit(w http.ResponseWriter, r *http.Request, limit int64) (error) {
|
||||
if limit <= 0 || limit > 10 {
|
||||
log.Println("[api] [Warning] Limit error", limit)
|
||||
err := errors.New(`"limit" can't be zero or more than 10`)
|
||||
api.HandleError(w, r, err)
|
||||
return err
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (api *API) HandleSearchFolders(w http.ResponseWriter, r *http.Request) {
|
||||
searchFoldersRequest := &SearchFoldersRequest{}
|
||||
err := json.NewDecoder(r.Body).Decode(searchFoldersRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty
|
||||
if searchFoldersRequest.Foldername == "" {
|
||||
api.HandleErrorString(w, r, `"foldername" can't be empty`)
|
||||
return
|
||||
}
|
||||
if api.CheckLimit(w, r, searchFoldersRequest.Limit) != nil {
|
||||
return
|
||||
}
|
||||
|
||||
searchFoldersResponse := &SearchFoldersResponse{}
|
||||
|
||||
searchFoldersResponse.Folders, err = api.Db.SearchFolders(searchFoldersRequest.Foldername, searchFoldersRequest.Limit, searchFoldersRequest.Offset)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Search folders", searchFoldersRequest.Foldername, searchFoldersRequest.Limit, searchFoldersRequest.Offset)
|
||||
|
||||
json.NewEncoder(w).Encode(searchFoldersResponse)
|
||||
}
|
||||
|
||||
type GetFileRequest struct {
|
||||
ID int64 `json:"id"`
|
||||
}
|
||||
|
||||
func (api *API) HandleGetFileInfo(w http.ResponseWriter, r *http.Request) {
|
||||
getFileRequest := &GetFileRequest{
|
||||
ID: -1,
|
||||
}
|
||||
|
||||
err := json.NewDecoder(r.Body).Decode(getFileRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty
|
||||
if getFileRequest.ID < 0 {
|
||||
api.HandleErrorString(w, r, `"id" can't be none or negative`)
|
||||
return
|
||||
}
|
||||
|
||||
file, err := api.Db.GetFile(getFileRequest.ID)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
err = json.NewEncoder(w).Encode(file)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
}
|
||||
|
||||
func (api *API) CheckGetFileStream(w http.ResponseWriter, r *http.Request) (error) {
|
||||
var err error
|
||||
q := r.URL.Query()
|
||||
ids := q["id"]
|
||||
if len(ids) == 0 {
|
||||
err = errors.New(`parameter "id" can't be empty`)
|
||||
api.HandleError(w, r, err)
|
||||
return err
|
||||
}
|
||||
_, err = strconv.Atoi(ids[0])
|
||||
if err != nil {
|
||||
err = errors.New(`parameter "id" should be an integer`)
|
||||
api.HandleError(w, r, err)
|
||||
return err
|
||||
}
|
||||
configs := q["config"]
|
||||
if len(configs) == 0 {
|
||||
err = errors.New(`parameter "config" can't be empty`)
|
||||
api.HandleError(w, r, err)
|
||||
return err
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (api *API) GetFfmpegConfig(configName string) (FfmpegConfig, bool) {
|
||||
ffmpegConfig := FfmpegConfig{}
|
||||
for _, f := range api.APIConfig.FfmpegConfigList {
|
||||
if f.Name == configName {
|
||||
ffmpegConfig = f
|
||||
}
|
||||
}
|
||||
if ffmpegConfig.Name == "" {
|
||||
return ffmpegConfig, false
|
||||
}
|
||||
return ffmpegConfig, true
|
||||
}
|
||||
|
||||
func (api *API) HandleGetFileStream(w http.ResponseWriter, r *http.Request) {
|
||||
err := api.CheckGetFileStream(w, r)
|
||||
if err != nil {
|
||||
return
|
||||
}
|
||||
q := r.URL.Query()
|
||||
ids := q["id"]
|
||||
id, err := strconv.Atoi(ids[0])
|
||||
configs := q["config"]
|
||||
configName := configs[0]
|
||||
file, err := api.Db.GetFile(int64(id))
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
path, err := file.Path()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Stream file", path, configName)
|
||||
|
||||
ffmpegConfig, ok := api.GetFfmpegConfig(configName)
|
||||
if !ok {
|
||||
api.HandleErrorStringCode(w, r, `ffmpeg config not found`, 404)
|
||||
return
|
||||
}
|
||||
args := strings.Split(ffmpegConfig.Args, " ")
|
||||
startArgs := []string {"-threads", strconv.FormatInt(api.APIConfig.FfmpegThreads, 10), "-i", path}
|
||||
endArgs := []string {"-vn", "-f", "ogg", "-"}
|
||||
ffmpegArgs := append(startArgs, args...)
|
||||
ffmpegArgs = append(ffmpegArgs, endArgs...)
|
||||
cmd := exec.Command("ffmpeg", ffmpegArgs...)
|
||||
cmd.Stdout = w
|
||||
err = cmd.Run()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
}
|
||||
|
||||
type PrepareFileStreamDirectRequest struct {
|
||||
ID int64 `json:"id"`
|
||||
ConfigName string `json:"config_name"`
|
||||
}
|
||||
|
||||
type PrepareFileStreamDirectResponse struct {
|
||||
Filesize int64 `json:"filesize"`
|
||||
}
|
||||
|
||||
func (api *API) HandlePrepareFileStreamDirect(w http.ResponseWriter, r *http.Request) {
|
||||
prepareFileStreamDirectRequst := &PrepareFileStreamDirectRequest{
|
||||
ID: -1,
|
||||
}
|
||||
err := json.NewDecoder(r.Body).Decode(prepareFileStreamDirectRequst)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty
|
||||
if prepareFileStreamDirectRequst.ID < 0 {
|
||||
api.HandleErrorString(w, r, `"id" can't be none or negative`)
|
||||
return
|
||||
}
|
||||
if prepareFileStreamDirectRequst.ConfigName == "" {
|
||||
api.HandleErrorString(w, r, `"config_name" can't be empty`)
|
||||
return
|
||||
}
|
||||
|
||||
file, err := api.Db.GetFile(prepareFileStreamDirectRequst.ID)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
srcPath, err := file.Path()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Prepare stream direct file", srcPath, prepareFileStreamDirectRequst.ConfigName)
|
||||
ffmpegConfig, ok := api.GetFfmpegConfig(prepareFileStreamDirectRequst.ConfigName)
|
||||
if !ok {
|
||||
api.HandleErrorStringCode(w, r, `ffmpeg config not found`, 404)
|
||||
return
|
||||
}
|
||||
objPath := api.Tmpfs.GetObjFilePath(prepareFileStreamDirectRequst.ID, prepareFileStreamDirectRequst.ConfigName)
|
||||
|
||||
// check obj file exists
|
||||
exists := api.Tmpfs.Exits(objPath)
|
||||
if exists {
|
||||
fileInfo, err := os.Stat(objPath)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
prepareFileStreamDirectResponse := &PrepareFileStreamDirectResponse{
|
||||
Filesize: fileInfo.Size(),
|
||||
}
|
||||
json.NewEncoder(w).Encode(prepareFileStreamDirectResponse)
|
||||
return
|
||||
}
|
||||
|
||||
api.Tmpfs.Record(objPath)
|
||||
args := strings.Split(ffmpegConfig.Args, " ")
|
||||
startArgs := []string {"-threads", strconv.FormatInt(api.APIConfig.FfmpegThreads, 10), "-i", srcPath}
|
||||
endArgs := []string {"-vn", "-y", objPath}
|
||||
ffmpegArgs := append(startArgs, args...)
|
||||
ffmpegArgs = append(ffmpegArgs, endArgs...)
|
||||
cmd := exec.Command("ffmpeg", ffmpegArgs...)
|
||||
err = cmd.Run()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
fileInfo, err := os.Stat(objPath)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
prepareFileStreamDirectResponse := &PrepareFileStreamDirectResponse{
|
||||
Filesize: fileInfo.Size(),
|
||||
}
|
||||
json.NewEncoder(w).Encode(prepareFileStreamDirectResponse)
|
||||
}
|
||||
|
||||
func (api *API) HandleGetFileStreamDirect(w http.ResponseWriter, r *http.Request) {
|
||||
err := api.CheckGetFileStream(w, r)
|
||||
if err != nil {
|
||||
return
|
||||
}
|
||||
q := r.URL.Query()
|
||||
ids := q["id"]
|
||||
id, err := strconv.Atoi(ids[0])
|
||||
configs := q["config"]
|
||||
configName := configs[0]
|
||||
|
||||
path := api.Tmpfs.GetObjFilePath(int64(id), configName)
|
||||
if api.Tmpfs.Exits(path) {
|
||||
api.Tmpfs.Record(path)
|
||||
}
|
||||
|
||||
log.Println("[api] Get direct cached file", path)
|
||||
|
||||
http.ServeFile(w, r, path)
|
||||
}
|
||||
|
||||
func (api *API) HandleGetFileDirect(w http.ResponseWriter, r *http.Request) {
|
||||
q := r.URL.Query()
|
||||
ids := q["id"]
|
||||
if len(ids) == 0 {
|
||||
api.HandleErrorString(w, r, `parameter "id" can't be empty`)
|
||||
return
|
||||
}
|
||||
id, err := strconv.Atoi(ids[0])
|
||||
if err != nil {
|
||||
api.HandleErrorString(w, r, `parameter "id" should be an integer`)
|
||||
return
|
||||
}
|
||||
file, err := api.Db.GetFile(int64(id))
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
path, err := file.Path()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Get direct raw file", path)
|
||||
|
||||
http.ServeFile(w, r, path)
|
||||
}
|
||||
|
||||
func (api *API) HandleGetFile(w http.ResponseWriter, r *http.Request) {
|
||||
getFileRequest := &GetFileRequest{
|
||||
ID: -1,
|
||||
}
|
||||
|
||||
err := json.NewDecoder(r.Body).Decode(getFileRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty
|
||||
if getFileRequest.ID < 0 {
|
||||
api.HandleErrorString(w, r, `"id" can't be none or negative`)
|
||||
return
|
||||
}
|
||||
|
||||
file, err := api.Db.GetFile(getFileRequest.ID)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
path, err := file.Path()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Get pipe raw file", path)
|
||||
|
||||
src, err := os.Open(path)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
defer src.Close()
|
||||
io.Copy(w, src)
|
||||
}
|
||||
|
||||
func (api *API) HandleGetFfmpegConfigs(w http.ResponseWriter, r *http.Request) {
|
||||
log.Println("[api] Get ffmpeg config list")
|
||||
ffmpegConfigList := &FfmpegConfigList{
|
||||
FfmpegConfigList: api.APIConfig.FfmpegConfigList,
|
||||
}
|
||||
json.NewEncoder(w).Encode(&ffmpegConfigList)
|
||||
}
|
||||
|
||||
func (api *API) HandleAddFfmpegConfig(w http.ResponseWriter, r *http.Request) {
|
||||
addFfmpegConfigRequest := AddFfmpegConfigRequest{}
|
||||
err := json.NewDecoder(r.Body).Decode(&addFfmpegConfigRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check token
|
||||
err = api.CheckToken(w, r, addFfmpegConfigRequest.Token)
|
||||
if err != nil {
|
||||
return
|
||||
}
|
||||
|
||||
// check name and args not null
|
||||
if addFfmpegConfigRequest.Name == "" {
|
||||
api.HandleErrorString(w, r, `"ffmpeg_config.name" can't be empty`)
|
||||
return
|
||||
}
|
||||
if addFfmpegConfigRequest.FfmpegConfig.Args == "" {
|
||||
api.HandleErrorString(w, r, `"ffmpeg_config.args" can't be empty`)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Add ffmpeg config")
|
||||
|
||||
api.APIConfig.FfmpegConfigList = append(api.APIConfig.FfmpegConfigList, addFfmpegConfigRequest.FfmpegConfig)
|
||||
|
||||
api.HandleOK(w, r)
|
||||
}
|
||||
|
||||
type FeedbackRequest struct {
|
||||
Feedback string `json:"feedback"`
|
||||
}
|
||||
|
||||
func (api *API) HandleFeedback(w http.ResponseWriter, r *http.Request) {
|
||||
feedbackRequest := &FeedbackRequest{}
|
||||
err := json.NewDecoder(r.Body).Decode(feedbackRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty feedback
|
||||
if feedbackRequest.Feedback == "" {
|
||||
api.HandleErrorString(w, r, `"feedback" can't be empty`)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Feedback", feedbackRequest.Feedback)
|
||||
|
||||
headerBuff := &bytes.Buffer{}
|
||||
err = r.Header.Write(headerBuff)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
header := headerBuff.String()
|
||||
|
||||
err = api.Db.InsertFeedback(time.Now().Unix(), feedbackRequest.Feedback, header)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
api.HandleOK(w, r)
|
||||
}
|
||||
|
||||
func NewAPIConfig() (APIConfig) {
|
||||
apiConfig := APIConfig{}
|
||||
return apiConfig
|
||||
}
|
||||
|
||||
type APIConfig struct {
|
||||
DatabaseName string `json:"database_name"`
|
||||
Addr string `json:"addr"`
|
||||
Token string `json:"token"`
|
||||
FfmpegThreads int64 `json:"ffmpeg_threads"`
|
||||
FfmpegConfigList []FfmpegConfig `json:"ffmpeg_config_list"`
|
||||
}
|
||||
|
||||
type Config struct {
|
||||
APIConfig APIConfig `json:"api"`
|
||||
TmpfsConfig tmpfs.TmpfsConfig `json:"tmpfs"`
|
||||
}
|
||||
|
||||
func NewAPI(config Config) (*API, error) {
|
||||
var err error
|
||||
|
||||
apiConfig := config.APIConfig
|
||||
tmpfsConfig := config.TmpfsConfig
|
||||
|
||||
db, err := database.NewDatabase(apiConfig.DatabaseName)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
mux := http.NewServeMux()
|
||||
apiMux := http.NewServeMux()
|
||||
|
||||
api := &API{
|
||||
Db: db,
|
||||
Server: http.Server{
|
||||
Addr: apiConfig.Addr,
|
||||
Handler: mux,
|
||||
},
|
||||
APIConfig: apiConfig,
|
||||
}
|
||||
api.Tmpfs = tmpfs.NewTmpfs(tmpfsConfig)
|
||||
|
||||
// mount api
|
||||
apiMux.HandleFunc("/hello", api.HandleOK)
|
||||
apiMux.HandleFunc("/get_file", api.HandleGetFile)
|
||||
apiMux.HandleFunc("/get_file_direct", api.HandleGetFileDirect)
|
||||
apiMux.HandleFunc("/search_files", api.HandleSearchFiles)
|
||||
apiMux.HandleFunc("/search_folders", api.HandleSearchFolders)
|
||||
apiMux.HandleFunc("/get_files_in_folder", api.HandleGetFilesInFolder)
|
||||
apiMux.HandleFunc("/get_random_files", api.HandleGetRandomFiles)
|
||||
apiMux.HandleFunc("/get_file_stream", api.HandleGetFileStream)
|
||||
apiMux.HandleFunc("/get_ffmpeg_config_list", api.HandleGetFfmpegConfigs)
|
||||
apiMux.HandleFunc("/feedback", api.HandleFeedback)
|
||||
apiMux.HandleFunc("/get_file_info", api.HandleGetFileInfo)
|
||||
apiMux.HandleFunc("/get_file_stream_direct", api.HandleGetFileStreamDirect)
|
||||
apiMux.HandleFunc("/prepare_file_stream_direct", api.HandlePrepareFileStreamDirect)
|
||||
// below needs token
|
||||
apiMux.HandleFunc("/walk", api.HandleWalk)
|
||||
apiMux.HandleFunc("/reset", api.HandleReset)
|
||||
apiMux.HandleFunc("/add_ffmpeg_config", api.HandleAddFfmpegConfig)
|
||||
|
||||
mux.Handle("/api/v1/", http.StripPrefix("/api/v1", apiMux))
|
||||
mux.Handle("/", http.StripPrefix("/", http.FileServer(http.Dir("web/build"))))
|
||||
|
||||
api.token = apiConfig.Token
|
||||
|
||||
return api, nil
|
||||
}
|
||||
@@ -1,441 +0,0 @@
|
||||
package database
|
||||
|
||||
import (
|
||||
"database/sql"
|
||||
"errors"
|
||||
"log"
|
||||
"os"
|
||||
"path/filepath"
|
||||
|
||||
_ "github.com/mattn/go-sqlite3"
|
||||
)
|
||||
|
||||
var initFilesTableQuery = `CREATE TABLE IF NOT EXISTS files (
|
||||
id INTEGER PRIMARY KEY,
|
||||
folder_id INTEGER NOT NULL,
|
||||
filename TEXT NOT NULL,
|
||||
filesize INTEGER NOT NULL
|
||||
);`
|
||||
var initFoldersTableQuery = `CREATE TABLE IF NOT EXISTS folders (
|
||||
id INTEGER PRIMARY KEY,
|
||||
folder TEXT NOT NULL,
|
||||
foldername TEXT NOT NULL
|
||||
);`
|
||||
var initFeedbacksTableQuery = `CREATE TABLE IF NOT EXISTS feedbacks (
|
||||
id INTEGER PRIMARY KEY,
|
||||
time INTEGER NOT NULL,
|
||||
feedback TEXT NOT NULL,
|
||||
header TEXT NOT NULL
|
||||
);`
|
||||
var insertFolderQuery = `INSERT INTO folders (folder, foldername) VALUES (?, ?);`
|
||||
var findFolderQuery = `SELECT id FROM folders WHERE folder = ? LIMIT 1;`
|
||||
var insertFileQuery = `INSERT INTO files (folder_id, filename, filesize) VALUES (?, ?, ?);`
|
||||
var searchFilesQuery = `SELECT files.id, files.folder_id, files.filename, folders.foldername, files.filesize FROM files JOIN folders ON files.folder_id = folders.id WHERE filename LIKE ? LIMIT ? OFFSET ?;`
|
||||
var getFolderQuery = `SELECT folder FROM folders WHERE id = ? LIMIT 1;`
|
||||
var dropFilesQuery = `DROP TABLE files;`
|
||||
var dropFolderQuery = `DROP TABLE folders;`
|
||||
var getFileQuery = `SELECT files.id, files.folder_id, files.filename, folders.foldername, files.filesize FROM files JOIN folders ON files.folder_id = folders.id WHERE files.id = ? LIMIT 1;`
|
||||
var searchFoldersQuery = `SELECT id, folder, foldername FROM folders WHERE foldername LIKE ? LIMIT ? OFFSET ?;`
|
||||
var getFilesInFolderQuery = `SELECT files.id, files.filename, files.filesize, folders.foldername FROM files JOIN folders ON files.folder_id = folders.id WHERE folder_id = ? LIMIT ? OFFSET ?;`
|
||||
var getRandomFilesQuery = `SELECT files.id, files.folder_id, files.filename, folders.foldername, files.filesize FROM files JOIN folders on files.folder_id = folders.id ORDER BY RANDOM() LIMIT ?;`
|
||||
var insertFeedbackQuery = `INSERT INTO feedbacks (time, feedback, header) VALUES (?, ?, ?);`
|
||||
|
||||
type Database struct {
|
||||
sqlConn *sql.DB
|
||||
stmt *Stmt
|
||||
}
|
||||
|
||||
type Stmt struct {
|
||||
initFilesTable *sql.Stmt
|
||||
initFoldersTable *sql.Stmt
|
||||
initFeedbacksTable *sql.Stmt
|
||||
insertFolder *sql.Stmt
|
||||
insertFile *sql.Stmt
|
||||
findFolder *sql.Stmt
|
||||
searchFiles *sql.Stmt
|
||||
getFolder *sql.Stmt
|
||||
dropFiles *sql.Stmt
|
||||
dropFolder *sql.Stmt
|
||||
getFile *sql.Stmt
|
||||
searchFolders *sql.Stmt
|
||||
getFilesInFolder *sql.Stmt
|
||||
getRandomFiles *sql.Stmt
|
||||
insertFeedback *sql.Stmt
|
||||
}
|
||||
|
||||
type File struct {
|
||||
Db *Database `json:"-"`
|
||||
ID int64 `json:"id"`
|
||||
Folder_id int64 `json:"folder_id"`
|
||||
Foldername string `json:"foldername"`
|
||||
Filename string `json:"filename"`
|
||||
Filesize int64 `json:"filesize"`
|
||||
}
|
||||
|
||||
type Folder struct {
|
||||
Db *Database `json:"-"`
|
||||
ID int64 `json:"id"`
|
||||
Folder string `json:"-"`
|
||||
Foldername string `json:"foldername"`
|
||||
}
|
||||
|
||||
func (database *Database) InsertFeedback(time int64, feedback string, header string) (error) {
|
||||
_, err := database.stmt.insertFeedback.Exec(time, feedback, header)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (database *Database) GetRandomFiles(limit int64) ([]File, error) {
|
||||
rows, err := database.stmt.getRandomFiles.Query(limit)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
defer rows.Close()
|
||||
files := make([]File, 0)
|
||||
for rows.Next() {
|
||||
file := File{
|
||||
Db: database,
|
||||
}
|
||||
err = rows.Scan(&file.ID, &file.Folder_id, &file.Filename, &file.Foldername, &file.Filesize)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
files = append(files, file)
|
||||
}
|
||||
return files, nil
|
||||
}
|
||||
|
||||
func (database *Database) GetFilesInFolder(folder_id int64, limit int64, offset int64) ([]File, error) {
|
||||
rows, err := database.stmt.getFilesInFolder.Query(folder_id, limit, offset)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
defer rows.Close()
|
||||
files := make([]File, 0)
|
||||
for rows.Next() {
|
||||
file := File{
|
||||
Db: database,
|
||||
Folder_id: folder_id,
|
||||
}
|
||||
err = rows.Scan(&file.ID, &file.Filename, &file.Filesize, &file.Foldername)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
files = append(files, file)
|
||||
}
|
||||
return files, nil
|
||||
}
|
||||
|
||||
func (database *Database) SearchFolders(foldername string, limit int64, offset int64) ([]Folder, error) {
|
||||
rows, err := database.stmt.searchFolders.Query("%"+foldername+"%", limit, offset)
|
||||
if err != nil {
|
||||
return nil, errors.New("Error searching folders at query " + err.Error())
|
||||
}
|
||||
defer rows.Close()
|
||||
folders := make([]Folder, 0)
|
||||
for rows.Next() {
|
||||
folder := Folder{
|
||||
Db: database,
|
||||
}
|
||||
err = rows.Scan(&folder.ID, &folder.Folder, &folder.Foldername)
|
||||
if err != nil {
|
||||
return nil, errors.New("Error scanning SearchFolders: " + err.Error())
|
||||
}
|
||||
folders = append(folders, folder)
|
||||
}
|
||||
return folders, nil
|
||||
}
|
||||
|
||||
func (database *Database) GetFile(id int64) (*File, error) {
|
||||
file := &File{
|
||||
Db: database,
|
||||
}
|
||||
err := database.stmt.getFile.QueryRow(id).Scan(&file.ID, &file.Folder_id, &file.Filename, &file.Foldername, &file.Filesize)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return file, nil
|
||||
}
|
||||
|
||||
func (database *Database) ResetFiles() (error) {
|
||||
log.Println("[db] Reset files")
|
||||
var err error
|
||||
_, err = database.stmt.dropFiles.Exec()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
_, err = database.stmt.initFilesTable.Exec()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return err
|
||||
}
|
||||
|
||||
func (database *Database) ResetFolder() (error) {
|
||||
log.Println("[db] Reset folders")
|
||||
var err error
|
||||
_, err = database.stmt.dropFolder.Exec()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
_, err = database.stmt.initFoldersTable.Exec()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return err
|
||||
}
|
||||
|
||||
func (database *Database) Walk(root string, pattern []string) (error) {
|
||||
patternDict := make(map[string]bool)
|
||||
for _, v := range pattern {
|
||||
patternDict[v] = true
|
||||
}
|
||||
log.Println("[db] Walk", root, patternDict)
|
||||
return filepath.Walk(root, func (path string, info os.FileInfo, err error) (error) {
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
if info.IsDir() {
|
||||
return nil
|
||||
}
|
||||
|
||||
// check pattern
|
||||
ext := filepath.Ext(info.Name())
|
||||
if _, ok := patternDict[ext]; !ok {
|
||||
return nil
|
||||
}
|
||||
|
||||
// insert file; the folder is created automatically if it does not exist
|
||||
err = database.Insert(path, info.Size())
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return nil
|
||||
})
|
||||
}
|
||||
|
||||
func (f *File) Path() (string, error) {
|
||||
folder, err := f.Db.GetFolder(f.Folder_id)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
return filepath.Join(folder.Folder, f.Filename), nil
|
||||
}
|
||||
|
||||
func (database *Database) GetFolder(folderId int64) (*Folder, error) {
|
||||
folder := &Folder{
|
||||
Db: database,
|
||||
}
|
||||
err := database.stmt.getFolder.QueryRow(folderId).Scan(&folder.Folder)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return folder, nil
|
||||
}
|
||||
|
||||
func (database *Database) SearchFiles(filename string, limit int64, offset int64) ([]File, error) {
|
||||
rows, err := database.stmt.searchFiles.Query("%"+filename+"%", limit, offset)
|
||||
if err != nil {
|
||||
return nil, errors.New("Error searching files at query " + err.Error())
|
||||
}
|
||||
defer rows.Close()
|
||||
files := make([]File, 0)
|
||||
for rows.Next() {
|
||||
var file File = File{
|
||||
Db: database,
|
||||
}
|
||||
err = rows.Scan(&file.ID, &file.Folder_id, &file.Filename, &file.Foldername, &file.Filesize)
|
||||
if err != nil {
|
||||
return nil, errors.New("Error scanning SearchFiles " + err.Error())
|
||||
}
|
||||
files = append(files, file)
|
||||
}
|
||||
if err = rows.Err(); err != nil {
|
||||
return nil, errors.New("Error scanning SearchFiles: exited without a full result: " + err.Error())
|
||||
}
|
||||
return files, nil
|
||||
}
|
||||
|
||||
func (database *Database) FindFolder(folder string) (int64, error) {
|
||||
var id int64
|
||||
err := database.stmt.findFolder.QueryRow(folder).Scan(&id)
|
||||
if err != nil {
|
||||
return 0, err
|
||||
}
|
||||
return id, nil
|
||||
}
|
||||
|
||||
func (database *Database) InsertFolder(folder string) (int64, error) {
|
||||
result, err := database.stmt.insertFolder.Exec(folder, filepath.Base(folder))
|
||||
if err != nil {
|
||||
return 0, err
|
||||
}
|
||||
lastInsertId, err := result.LastInsertId()
|
||||
if err != nil {
|
||||
return 0, err
|
||||
}
|
||||
return lastInsertId, nil
|
||||
}
|
||||
|
||||
func (database *Database) InsertFile(folderId int64, filename string, filesize int64) (error) {
|
||||
_, err := database.stmt.insertFile.Exec(folderId, filename, filesize)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (database *Database) Insert(path string, filesize int64) (error) {
|
||||
folder, filename := filepath.Split(path)
|
||||
folderId, err := database.FindFolder(folder)
|
||||
if err != nil {
|
||||
folderId, err = database.InsertFolder(folder)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
err = database.InsertFile(folderId, filename, filesize)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func NewPreparedStatement(sqlConn *sql.DB) (*Stmt, error) {
|
||||
var err error
|
||||
|
||||
stmt := &Stmt{}
|
||||
|
||||
// init files table
|
||||
stmt.initFilesTable, err = sqlConn.Prepare(initFilesTableQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init folders table
|
||||
stmt.initFoldersTable, err = sqlConn.Prepare(initFoldersTableQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init feedbacks tables
|
||||
stmt.initFeedbacksTable, err = sqlConn.Prepare(initFeedbacksTableQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// run init statement
|
||||
_, err = stmt.initFilesTable.Exec()
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
_, err = stmt.initFoldersTable.Exec()
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
_, err = stmt.initFeedbacksTable.Exec()
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init insert folder statement
|
||||
stmt.insertFolder, err = sqlConn.Prepare(insertFolderQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init findFolder statement
|
||||
stmt.findFolder, err = sqlConn.Prepare(findFolderQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init insertFile stmt
|
||||
stmt.insertFile, err = sqlConn.Prepare(insertFileQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init searchFile stmt
|
||||
stmt.searchFiles, err = sqlConn.Prepare(searchFilesQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init getFolder stmt
|
||||
stmt.getFolder, err = sqlConn.Prepare(getFolderQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init dropFolder stmt
|
||||
stmt.dropFolder, err = sqlConn.Prepare(dropFolderQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init dropFiles stmt
|
||||
stmt.dropFiles, err = sqlConn.Prepare(dropFilesQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init getFile stmt
|
||||
stmt.getFile, err = sqlConn.Prepare(getFileQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init searchFolder stmt
|
||||
stmt.searchFolders, err = sqlConn.Prepare(searchFoldersQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init getFilesInFolder stmt
|
||||
stmt.getFilesInFolder, err = sqlConn.Prepare(getFilesInFolderQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init getRandomFiles
|
||||
stmt.getRandomFiles, err = sqlConn.Prepare(getRandomFilesQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init insertFeedback
|
||||
stmt.insertFeedback, err = sqlConn.Prepare(insertFeedbackQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
return stmt, err
|
||||
}
|
||||
|
||||
func NewDatabase(dbName string) (*Database, error) {
|
||||
var err error
|
||||
|
||||
// open database
|
||||
sqlConn, err := sql.Open("sqlite3", dbName)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// prepare statement
|
||||
stmt, err := NewPreparedStatement(sqlConn)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// new database
|
||||
database := &Database{
|
||||
sqlConn: sqlConn,
|
||||
stmt: stmt,
|
||||
}
|
||||
|
||||
return database, nil
|
||||
}
|
||||
2  main.go
@@ -4,7 +4,7 @@ import (
 "encoding/json"
 "flag"
 "log"
-"msw-open-music/internal/pkg/api"
+"msw-open-music/pkg/api"
 "os"
 )
84  pkg/api/api.go  Normal file
@@ -0,0 +1,84 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"msw-open-music/pkg/database"
|
||||
"msw-open-music/pkg/tmpfs"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
type API struct {
|
||||
Db *database.Database
|
||||
Server http.Server
|
||||
token string
|
||||
APIConfig APIConfig
|
||||
Tmpfs *tmpfs.Tmpfs
|
||||
}
|
||||
|
||||
func NewAPIConfig() APIConfig {
|
||||
apiConfig := APIConfig{}
|
||||
return apiConfig
|
||||
}
|
||||
|
||||
type APIConfig struct {
|
||||
DatabaseName string `json:"database_name"`
|
||||
Addr string `json:"addr"`
|
||||
Token string `json:"token"`
|
||||
FfmpegThreads int64 `json:"ffmpeg_threads"`
|
||||
FfmpegConfigList []FfmpegConfig `json:"ffmpeg_config_list"`
|
||||
}
|
||||
|
||||
type Config struct {
|
||||
APIConfig APIConfig `json:"api"`
|
||||
TmpfsConfig tmpfs.TmpfsConfig `json:"tmpfs"`
|
||||
}
|
||||
|
||||
func NewAPI(config Config) (*API, error) {
|
||||
var err error
|
||||
|
||||
apiConfig := config.APIConfig
|
||||
tmpfsConfig := config.TmpfsConfig
|
||||
|
||||
db, err := database.NewDatabase(apiConfig.DatabaseName)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
mux := http.NewServeMux()
|
||||
apiMux := http.NewServeMux()
|
||||
|
||||
api := &API{
|
||||
Db: db,
|
||||
Server: http.Server{
|
||||
Addr: apiConfig.Addr,
|
||||
Handler: mux,
|
||||
},
|
||||
APIConfig: apiConfig,
|
||||
}
|
||||
api.Tmpfs = tmpfs.NewTmpfs(tmpfsConfig)
|
||||
|
||||
// mount api
|
||||
apiMux.HandleFunc("/hello", api.HandleOK)
|
||||
apiMux.HandleFunc("/get_file", api.HandleGetFile)
|
||||
apiMux.HandleFunc("/get_file_direct", api.HandleGetFileDirect)
|
||||
apiMux.HandleFunc("/search_files", api.HandleSearchFiles)
|
||||
apiMux.HandleFunc("/search_folders", api.HandleSearchFolders)
|
||||
apiMux.HandleFunc("/get_files_in_folder", api.HandleGetFilesInFolder)
|
||||
apiMux.HandleFunc("/get_random_files", api.HandleGetRandomFiles)
|
||||
apiMux.HandleFunc("/get_file_stream", api.HandleGetFileStream)
|
||||
apiMux.HandleFunc("/get_ffmpeg_config_list", api.HandleGetFfmpegConfigs)
|
||||
apiMux.HandleFunc("/feedback", api.HandleFeedback)
|
||||
apiMux.HandleFunc("/get_file_info", api.HandleGetFileInfo)
|
||||
apiMux.HandleFunc("/get_file_stream_direct", api.HandleGetFileStreamDirect)
|
||||
apiMux.HandleFunc("/prepare_file_stream_direct", api.HandlePrepareFileStreamDirect)
|
||||
// below needs token
|
||||
apiMux.HandleFunc("/walk", api.HandleWalk)
|
||||
apiMux.HandleFunc("/reset", api.HandleReset)
|
||||
apiMux.HandleFunc("/add_ffmpeg_config", api.HandleAddFfmpegConfig)
|
||||
|
||||
mux.Handle("/api/v1/", http.StripPrefix("/api/v1", apiMux))
|
||||
mux.Handle("/", http.StripPrefix("/", http.FileServer(http.Dir("web/build"))))
|
||||
|
||||
api.token = apiConfig.Token
|
||||
|
||||
return api, nil
|
||||
}
|
||||
17  pkg/api/check.go  Normal file
@@ -0,0 +1,17 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"errors"
|
||||
"log"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
func (api *API) CheckLimit(w http.ResponseWriter, r *http.Request, limit int64) error {
|
||||
if limit <= 0 || limit > 10 {
|
||||
log.Println("[api] [Warning] Limit error", limit)
|
||||
err := errors.New(`"limit" can't be zero or more than 10`)
|
||||
api.HandleError(w, r, err)
|
||||
return err
|
||||
}
|
||||
return nil
|
||||
}
|
||||
26  pkg/api/handle_common.go  Normal file
@@ -0,0 +1,26 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
type Status struct {
|
||||
Status string `json:"status,omitempty"`
|
||||
}
|
||||
|
||||
func (api *API) HandleStatus(w http.ResponseWriter, r *http.Request, status string) {
|
||||
s := &Status{
|
||||
Status: status,
|
||||
}
|
||||
|
||||
json.NewEncoder(w).Encode(s)
|
||||
}
|
||||
|
||||
var ok Status = Status{
|
||||
Status: "OK",
|
||||
}
|
||||
|
||||
func (api *API) HandleOK(w http.ResponseWriter, r *http.Request) {
|
||||
json.NewEncoder(w).Encode(&ok)
|
||||
}
|
||||
81  pkg/api/handle_database_manage.go  Normal file
@@ -0,0 +1,81 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
type WalkRequest struct {
|
||||
Token string `json:"token"`
|
||||
Root string `json:"root"`
|
||||
Pattern []string `json:"pattern"`
|
||||
}
|
||||
|
||||
type ResetRequest struct {
|
||||
Token string `json:"token"`
|
||||
}
|
||||
|
||||
func (api *API) HandleReset(w http.ResponseWriter, r *http.Request) {
|
||||
resetRequest := &ResetRequest{}
|
||||
err := json.NewDecoder(r.Body).Decode(resetRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check token
|
||||
err = api.CheckToken(w, r, resetRequest.Token)
|
||||
if err != nil {
|
||||
return
|
||||
}
|
||||
|
||||
// reset
|
||||
err = api.Db.ResetFiles()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
err = api.Db.ResetFolder()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
api.HandleStatus(w, r, "Database reset")
|
||||
}
|
||||
|
||||
func (api *API) HandleWalk(w http.ResponseWriter, r *http.Request) {
|
||||
walkRequest := &WalkRequest{}
|
||||
err := json.NewDecoder(r.Body).Decode(walkRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check token match
|
||||
err = api.CheckToken(w, r, walkRequest.Token)
|
||||
if err != nil {
|
||||
return
|
||||
}
|
||||
|
||||
// check root empty
|
||||
if walkRequest.Root == "" {
|
||||
api.HandleErrorString(w, r, `key "root" can't be empty`)
|
||||
return
|
||||
}
|
||||
|
||||
// check pattern empty
|
||||
if len(walkRequest.Pattern) == 0 {
|
||||
api.HandleErrorString(w, r, `"[]pattern" can't be empty`)
|
||||
return
|
||||
}
|
||||
|
||||
// walk
|
||||
err = api.Db.Walk(walkRequest.Root, walkRequest.Pattern)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
api.HandleStatus(w, r, "Database updated")
|
||||
}
|
||||
28  pkg/api/handle_error.go  Normal file
@@ -0,0 +1,28 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"log"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
func (api *API) HandleError(w http.ResponseWriter, r *http.Request, err error) {
|
||||
api.HandleErrorString(w, r, err.Error())
|
||||
}
|
||||
|
||||
func (api *API) HandleErrorCode(w http.ResponseWriter, r *http.Request, err error, code int) {
|
||||
api.HandleErrorStringCode(w, r, err.Error(), code)
|
||||
}
|
||||
|
||||
func (api *API) HandleErrorString(w http.ResponseWriter, r *http.Request, errorString string) {
|
||||
api.HandleErrorStringCode(w, r, errorString, 500)
|
||||
}
|
||||
|
||||
func (api *API) HandleErrorStringCode(w http.ResponseWriter, r *http.Request, errorString string, code int) {
|
||||
log.Println("[api] [Error]", code, errorString)
|
||||
errStatus := &Status{
|
||||
Status: errorString,
|
||||
}
|
||||
w.WriteHeader(code)
|
||||
json.NewEncoder(w).Encode(errStatus)
|
||||
}
|
||||
45  pkg/api/handle_feedback.go  Normal file
@@ -0,0 +1,45 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"encoding/json"
|
||||
"log"
|
||||
"net/http"
|
||||
"time"
|
||||
)
|
||||
|
||||
type FeedbackRequest struct {
|
||||
Feedback string `json:"feedback"`
|
||||
}
|
||||
|
||||
func (api *API) HandleFeedback(w http.ResponseWriter, r *http.Request) {
|
||||
feedbackRequest := &FeedbackRequest{}
|
||||
err := json.NewDecoder(r.Body).Decode(feedbackRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty feedback
|
||||
if feedbackRequest.Feedback == "" {
|
||||
api.HandleErrorString(w, r, `"feedback" can't be empty`)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Feedback", feedbackRequest.Feedback)
|
||||
|
||||
headerBuff := &bytes.Buffer{}
|
||||
err = r.Header.Write(headerBuff)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
header := headerBuff.String()
|
||||
|
||||
err = api.Db.InsertFeedback(time.Now().Unix(), feedbackRequest.Feedback, header)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
api.HandleOK(w, r)
|
||||
}
|
||||
74  pkg/api/handle_ffmpeg_config.go  Normal file
@@ -0,0 +1,74 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"log"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
type FfmpegConfig struct {
|
||||
Name string `json:"name"`
|
||||
Args string `json:"args"`
|
||||
}
|
||||
|
||||
type FfmpegConfigList struct {
|
||||
FfmpegConfigList []FfmpegConfig `json:"ffmpeg_config_list"`
|
||||
}
|
||||
|
||||
func (api *API) GetFfmpegConfig(configName string) (FfmpegConfig, bool) {
|
||||
ffmpegConfig := FfmpegConfig{}
|
||||
for _, f := range api.APIConfig.FfmpegConfigList {
|
||||
if f.Name == configName {
|
||||
ffmpegConfig = f
|
||||
}
|
||||
}
|
||||
if ffmpegConfig.Name == "" {
|
||||
return ffmpegConfig, false
|
||||
}
|
||||
return ffmpegConfig, true
|
||||
}
|
||||
|
||||
func (api *API) HandleGetFfmpegConfigs(w http.ResponseWriter, r *http.Request) {
|
||||
log.Println("[api] Get ffmpeg config list")
|
||||
ffmpegConfigList := &FfmpegConfigList{
|
||||
FfmpegConfigList: api.APIConfig.FfmpegConfigList,
|
||||
}
|
||||
json.NewEncoder(w).Encode(&ffmpegConfigList)
|
||||
}
|
||||
|
||||
type AddFfmpegConfigRequest struct {
|
||||
Token string `json:"token"`
|
||||
Name string `json:"name"`
|
||||
FfmpegConfig FfmpegConfig `json:"ffmpeg_config"`
|
||||
}
|
||||
|
||||
func (api *API) HandleAddFfmpegConfig(w http.ResponseWriter, r *http.Request) {
|
||||
addFfmpegConfigRequest := AddFfmpegConfigRequest{}
|
||||
err := json.NewDecoder(r.Body).Decode(&addFfmpegConfigRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check token
|
||||
err = api.CheckToken(w, r, addFfmpegConfigRequest.Token)
|
||||
if err != nil {
|
||||
return
|
||||
}
|
||||
|
||||
// check name and args not null
|
||||
if addFfmpegConfigRequest.Name == "" {
|
||||
api.HandleErrorString(w, r, `"ffmpeg_config.name" can't be empty`)
|
||||
return
|
||||
}
|
||||
if addFfmpegConfigRequest.FfmpegConfig.Args == "" {
|
||||
api.HandleErrorString(w, r, `"ffmpeg_config.args" can't be empty`)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Add ffmpeg config")
|
||||
|
||||
api.APIConfig.FfmpegConfigList = append(api.APIConfig.FfmpegConfigList, addFfmpegConfigRequest.FfmpegConfig)
|
||||
|
||||
api.HandleOK(w, r)
|
||||
}
|
||||
117  pkg/api/handle_get_file_info.go  Normal file
@@ -0,0 +1,117 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"io"
|
||||
"log"
|
||||
"net/http"
|
||||
"os"
|
||||
"strconv"
|
||||
)
|
||||
|
||||
type GetFileRequest struct {
|
||||
ID int64 `json:"id"`
|
||||
}
|
||||
|
||||
func (api *API) HandleGetFileInfo(w http.ResponseWriter, r *http.Request) {
|
||||
getFileRequest := &GetFileRequest{
|
||||
ID: -1,
|
||||
}
|
||||
|
||||
err := json.NewDecoder(r.Body).Decode(getFileRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty
|
||||
if getFileRequest.ID < 0 {
|
||||
api.HandleErrorString(w, r, `"id" can't be none or negative`)
|
||||
return
|
||||
}
|
||||
|
||||
file, err := api.Db.GetFile(getFileRequest.ID)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
err = json.NewEncoder(w).Encode(file)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
}
|
||||
|
||||
// /get_file
|
||||
// get raw file with io.Copy method
|
||||
func (api *API) HandleGetFile(w http.ResponseWriter, r *http.Request) {
|
||||
getFileRequest := &GetFileRequest{
|
||||
ID: -1,
|
||||
}
|
||||
|
||||
err := json.NewDecoder(r.Body).Decode(getFileRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty
|
||||
if getFileRequest.ID < 0 {
|
||||
api.HandleErrorString(w, r, `"id" can't be none or negative`)
|
||||
return
|
||||
}
|
||||
|
||||
file, err := api.Db.GetFile(getFileRequest.ID)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
path, err := file.Path()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Get pipe raw file", path)
|
||||
|
||||
src, err := os.Open(path)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
defer src.Close()
|
||||
io.Copy(w, src)
|
||||
}
|
||||
|
||||
// /get_file_direct?id=1
|
||||
// get raw file with http.ServeFile method
|
||||
func (api *API) HandleGetFileDirect(w http.ResponseWriter, r *http.Request) {
|
||||
q := r.URL.Query()
|
||||
ids := q["id"]
|
||||
if len(ids) == 0 {
|
||||
api.HandleErrorString(w, r, `parameter "id" can't be empty`)
|
||||
return
|
||||
}
|
||||
id, err := strconv.Atoi(ids[0])
|
||||
if err != nil {
|
||||
api.HandleErrorString(w, r, `parameter "id" should be an integer`)
|
||||
return
|
||||
}
|
||||
file, err := api.Db.GetFile(int64(id))
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
path, err := file.Path()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Get direct raw file", path)
|
||||
|
||||
http.ServeFile(w, r, path)
|
||||
}
|
||||
50  pkg/api/handle_get_files_in_folder.go  Normal file
@@ -0,0 +1,50 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"log"
|
||||
"msw-open-music/pkg/database"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
type GetFilesInFolderRequest struct {
|
||||
Folder_id int64 `json:"folder_id"`
|
||||
Limit int64 `json:"limit"`
|
||||
Offset int64 `json:"offset"`
|
||||
}
|
||||
|
||||
type GetFilesInFolderResponse struct {
|
||||
Files *[]database.File `json:"files"`
|
||||
}
|
||||
|
||||
func (api *API) HandleGetFilesInFolder(w http.ResponseWriter, r *http.Request) {
|
||||
getFilesInFolderRequest := &GetFilesInFolderRequest{
|
||||
Folder_id: -1,
|
||||
}
|
||||
|
||||
err := json.NewDecoder(r.Body).Decode(getFilesInFolderRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty
|
||||
if getFilesInFolderRequest.Folder_id < 0 {
|
||||
api.HandleErrorString(w, r, `"folder_id" can't be none or negative`)
|
||||
return
|
||||
}
|
||||
|
||||
files, err := api.Db.GetFilesInFolder(getFilesInFolderRequest.Folder_id, getFilesInFolderRequest.Limit, getFilesInFolderRequest.Offset)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
getFilesInFolderResponse := &GetFilesInFolderResponse{
|
||||
Files: &files,
|
||||
}
|
||||
|
||||
log.Println("[api] Get files in folder", getFilesInFolderRequest.Folder_id)
|
||||
|
||||
json.NewEncoder(w).Encode(getFilesInFolderResponse)
|
||||
}
|
||||
25
pkg/api/handle_get_random_files.go
Normal file
@@ -0,0 +1,25 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"log"
|
||||
"msw-open-music/pkg/database"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
type GetRandomFilesResponse struct {
|
||||
Files *[]database.File `json:"files"`
|
||||
}
|
||||
|
||||
func (api *API) HandleGetRandomFiles(w http.ResponseWriter, r *http.Request) {
|
||||
files, err := api.Db.GetRandomFiles(10)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
getRandomFilesResponse := &GetRandomFilesResponse{
|
||||
Files: &files,
|
||||
}
|
||||
log.Println("[api] Get random files")
|
||||
json.NewEncoder(w).Encode(getRandomFilesResponse)
|
||||
}
|
||||
48
pkg/api/handle_search_files.go
Normal file
@@ -0,0 +1,48 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"log"
|
||||
"msw-open-music/pkg/database"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
type SearchFilesRequest struct {
|
||||
Filename string `json:"filename"`
|
||||
Limit int64 `json:"limit"`
|
||||
Offset int64 `json:"offset"`
|
||||
}
|
||||
|
||||
type SearchFilesResponse struct {
|
||||
Files []database.File `json:"files"`
|
||||
}
|
||||
|
||||
func (api *API) HandleSearchFiles(w http.ResponseWriter, r *http.Request) {
|
||||
searchFilesRequest := &SearchFilesRequest{}
|
||||
err := json.NewDecoder(r.Body).Decode(searchFilesRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty
|
||||
if searchFilesRequest.Filename == "" {
|
||||
api.HandleErrorString(w, r, `"filename" can't be empty`)
|
||||
return
|
||||
}
|
||||
if api.CheckLimit(w, r, searchFilesRequest.Limit) != nil {
|
||||
return
|
||||
}
|
||||
|
||||
searchFilesResponse := &SearchFilesResponse{}
|
||||
|
||||
searchFilesResponse.Files, err = api.Db.SearchFiles(searchFilesRequest.Filename, searchFilesRequest.Limit, searchFilesRequest.Offset)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Search files", searchFilesRequest.Filename, searchFilesRequest.Limit, searchFilesRequest.Offset)
|
||||
|
||||
json.NewEncoder(w).Encode(searchFilesResponse)
|
||||
}
|
||||
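`HandleSearchFiles` expects a JSON body with `filename`, `limit` and `offset`, does a `LIKE` match on the filename, and returns a `files` array. A sketch of calling it from Go follows; the `/api/v1/search_files` path mirrors the front-end fetch later in this diff, while the host and port are placeholders.

```go
// Sketch of a search_files request; host and port are placeholders.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

type searchFilesRequest struct {
	Filename string `json:"filename"`
	Limit    int64  `json:"limit"`
	Offset   int64  `json:"offset"`
}

type searchFilesResponse struct {
	Files []struct {
		ID         int64  `json:"id"`
		FolderID   int64  `json:"folder_id"`
		Foldername string `json:"foldername"`
		Filename   string `json:"filename"`
		Filesize   int64  `json:"filesize"`
	} `json:"files"`
}

func main() {
	reqBody, _ := json.Marshal(searchFilesRequest{Filename: "flac", Limit: 10, Offset: 0})
	resp, err := http.Post("http://localhost:8080/api/v1/search_files",
		"application/json", bytes.NewReader(reqBody))
	if err != nil {
		fmt.Println(err)
		return
	}
	defer resp.Body.Close()

	var result searchFilesResponse
	if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
		fmt.Println(err)
		return
	}
	for _, f := range result.Files {
		fmt.Println(f.ID, f.Foldername, f.Filename, f.Filesize)
	}
}
```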
48
pkg/api/handle_search_folders.go
Normal file
@@ -0,0 +1,48 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"log"
|
||||
"msw-open-music/pkg/database"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
type SearchFoldersRequest struct {
|
||||
Foldername string `json:"foldername"`
|
||||
Limit int64 `json:"limit"`
|
||||
Offset int64 `json:"offset"`
|
||||
}
|
||||
|
||||
type SearchFoldersResponse struct {
|
||||
Folders []database.Folder `json:"folders"`
|
||||
}
|
||||
|
||||
func (api *API) HandleSearchFolders(w http.ResponseWriter, r *http.Request) {
|
||||
searchFoldersRequest := &SearchFoldersRequest{}
|
||||
err := json.NewDecoder(r.Body).Decode(searchFoldersRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty
|
||||
if searchFoldersRequest.Foldername == "" {
|
||||
api.HandleErrorString(w, r, `"foldername" can't be empty`)
|
||||
return
|
||||
}
|
||||
if api.CheckLimit(w, r, searchFoldersRequest.Limit) != nil {
|
||||
return
|
||||
}
|
||||
|
||||
searchFoldersResponse := &SearchFoldersResponse{}
|
||||
|
||||
searchFoldersResponse.Folders, err = api.Db.SearchFolders(searchFoldersRequest.Foldername, searchFoldersRequest.Limit, searchFoldersRequest.Offset)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Search folders", searchFoldersRequest.Foldername, searchFoldersRequest.Limit, searchFoldersRequest.Offset)
|
||||
|
||||
json.NewEncoder(w).Encode(searchFoldersResponse)
|
||||
}
|
||||
191
pkg/api/handle_stream.go
Normal file
@@ -0,0 +1,191 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"errors"
|
||||
"log"
|
||||
"net/http"
|
||||
"os"
|
||||
"os/exec"
|
||||
"strconv"
|
||||
"strings"
|
||||
)
|
||||
|
||||
func (api *API) CheckGetFileStream(w http.ResponseWriter, r *http.Request) error {
|
||||
var err error
|
||||
q := r.URL.Query()
|
||||
ids := q["id"]
|
||||
if len(ids) == 0 {
|
||||
err = errors.New(`parameter "id" can't be empty`)
|
||||
api.HandleError(w, r, err)
|
||||
return err
|
||||
}
|
||||
_, err = strconv.Atoi(ids[0])
|
||||
if err != nil {
|
||||
err = errors.New(`parameter "id" should be an integer`)
|
||||
api.HandleError(w, r, err)
|
||||
return err
|
||||
}
|
||||
configs := q["config"]
|
||||
if len(configs) == 0 {
|
||||
err = errors.New(`parameter "config" can't be empty`)
|
||||
api.HandleError(w, r, err)
|
||||
return err
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// /get_file_stream?id=1&config=ffmpeg_config_name
|
||||
func (api *API) HandleGetFileStream(w http.ResponseWriter, r *http.Request) {
|
||||
err := api.CheckGetFileStream(w, r)
|
||||
if err != nil {
|
||||
return
|
||||
}
|
||||
q := r.URL.Query()
|
||||
ids := q["id"]
|
||||
id, err := strconv.Atoi(ids[0])
|
||||
configs := q["config"]
|
||||
configName := configs[0]
|
||||
file, err := api.Db.GetFile(int64(id))
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
path, err := file.Path()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Stream file", path, configName)
|
||||
|
||||
ffmpegConfig, ok := api.GetFfmpegConfig(configName)
|
||||
if !ok {
|
||||
api.HandleErrorStringCode(w, r, `ffmpeg config not found`, 404)
|
||||
return
|
||||
}
|
||||
args := strings.Split(ffmpegConfig.Args, " ")
|
||||
startArgs := []string{"-threads", strconv.FormatInt(api.APIConfig.FfmpegThreads, 10), "-i", path}
|
||||
endArgs := []string{"-vn", "-f", "ogg", "-"}
|
||||
ffmpegArgs := append(startArgs, args...)
|
||||
ffmpegArgs = append(ffmpegArgs, endArgs...)
|
||||
cmd := exec.Command("ffmpeg", ffmpegArgs...)
|
||||
cmd.Stdout = w
|
||||
err = cmd.Run()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
}
|
||||
|
||||
type PrepareFileStreamDirectRequest struct {
|
||||
ID int64 `json:"id"`
|
||||
ConfigName string `json:"config_name"`
|
||||
}
|
||||
|
||||
type PrepareFileStreamDirectResponse struct {
|
||||
Filesize int64 `json:"filesize"`
|
||||
}
|
||||
|
||||
// /prepare_file_stream_direct
// expects a JSON body: {"id": 1, "config_name": "ffmpeg_config_name"}
|
||||
func (api *API) HandlePrepareFileStreamDirect(w http.ResponseWriter, r *http.Request) {
|
||||
prepareFileStreamDirectRequest := &PrepareFileStreamDirectRequest{
|
||||
ID: -1,
|
||||
}
|
||||
err := json.NewDecoder(r.Body).Decode(prepareFileStreamDirectRequest)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
// check empty
|
||||
if prepareFileStreamDirectRequest.ID < 0 {
|
||||
api.HandleErrorString(w, r, `"id" can't be missing or negative`)
|
||||
return
|
||||
}
|
||||
if prepareFileStreamDirectRequest.ConfigName == "" {
|
||||
api.HandleErrorString(w, r, `"config_name" can't be empty`)
|
||||
return
|
||||
}
|
||||
|
||||
file, err := api.Db.GetFile(prepareFileStreamDirectRequest.ID)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
srcPath, err := file.Path()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Println("[api] Prepare stream direct file", srcPath, prepareFileStreamDirectRequest.ConfigName)
|
||||
ffmpegConfig, ok := api.GetFfmpegConfig(prepareFileStreamDirectRequest.ConfigName)
|
||||
if !ok {
|
||||
api.HandleErrorStringCode(w, r, `ffmpeg config not found`, 404)
|
||||
return
|
||||
}
|
||||
objPath := api.Tmpfs.GetObjFilePath(prepareFileStreamDirectRequest.ID, prepareFileStreamDirectRequest.ConfigName)
|
||||
|
||||
// check obj file exists
|
||||
exists := api.Tmpfs.Exits(objPath)
|
||||
if exists {
|
||||
fileInfo, err := os.Stat(objPath)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
prepareFileStreamDirectResponse := &PrepareFileStreamDirectResponse{
|
||||
Filesize: fileInfo.Size(),
|
||||
}
|
||||
json.NewEncoder(w).Encode(prepareFileStreamDirectResponse)
|
||||
return
|
||||
}
|
||||
|
||||
api.Tmpfs.Record(objPath)
|
||||
args := strings.Split(ffmpegConfig.Args, " ")
|
||||
startArgs := []string{"-threads", strconv.FormatInt(api.APIConfig.FfmpegThreads, 10), "-i", srcPath}
|
||||
endArgs := []string{"-vn", "-y", objPath}
|
||||
ffmpegArgs := append(startArgs, args...)
|
||||
ffmpegArgs = append(ffmpegArgs, endArgs...)
|
||||
cmd := exec.Command("ffmpeg", ffmpegArgs...)
|
||||
err = cmd.Run()
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
|
||||
fileInfo, err := os.Stat(objPath)
|
||||
if err != nil {
|
||||
api.HandleError(w, r, err)
|
||||
return
|
||||
}
|
||||
prepareFileStreamDirectResponse := &PrepareFileStreamDirectResponse{
|
||||
Filesize: fileInfo.Size(),
|
||||
}
|
||||
json.NewEncoder(w).Encode(prepareFileStreamDirectResponse)
|
||||
}
|
||||
|
||||
// /get_file_stream_direct?id=1&config=ffmpeg_config_name
|
||||
// return converted file with http.ServeFile method
|
||||
func (api *API) HandleGetFileStreamDirect(w http.ResponseWriter, r *http.Request) {
|
||||
err := api.CheckGetFileStream(w, r)
|
||||
if err != nil {
|
||||
return
|
||||
}
|
||||
q := r.URL.Query()
|
||||
ids := q["id"]
|
||||
id, err := strconv.Atoi(ids[0])
|
||||
configs := q["config"]
|
||||
configName := configs[0]
|
||||
|
||||
path := api.Tmpfs.GetObjFilePath(int64(id), configName)
|
||||
if api.Tmpfs.Exits(path) {
|
||||
api.Tmpfs.Record(path)
|
||||
}
|
||||
|
||||
log.Println("[api] Get direct cached file", path)
|
||||
|
||||
http.ServeFile(w, r, path)
|
||||
}
|
||||
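`HandleGetFileStream` builds the ffmpeg command as `-threads <n> -i <path>`, followed by the selected config's arguments, followed by `-vn -f ogg -`, and copies ffmpeg's stdout straight into the HTTP response. The sketch below assembles the argument list the same way for the `OPUS 128k` config (`-c:a libopus -ab 128k`) from the example config list elsewhere in this diff, so the resulting command line is easy to see; the source path and thread count are placeholders.

```go
// Reconstructs the ffmpeg argument list the way HandleGetFileStream does,
// using the "OPUS 128k" config ("-c:a libopus -ab 128k"); the path and
// thread count are placeholders.
package main

import (
	"fmt"
	"strconv"
	"strings"
)

func main() {
	path := "/music/album/track.flac" // placeholder source file
	var ffmpegThreads int64 = 2       // placeholder for APIConfig.FfmpegThreads
	configArgs := "-c:a libopus -ab 128k"

	args := strings.Split(configArgs, " ")
	startArgs := []string{"-threads", strconv.FormatInt(ffmpegThreads, 10), "-i", path}
	endArgs := []string{"-vn", "-f", "ogg", "-"}

	ffmpegArgs := append(startArgs, args...)
	ffmpegArgs = append(ffmpegArgs, endArgs...)

	// The handler runs this command and copies its stdout into the HTTP response.
	fmt.Println("ffmpeg", strings.Join(ffmpegArgs, " "))
	// Output: ffmpeg -threads 2 -i /music/album/track.flac -c:a libopus -ab 128k -vn -f ogg -
}
```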
18
pkg/api/handle_token.go
Normal file
@@ -0,0 +1,18 @@
|
||||
package api
|
||||
|
||||
import (
|
||||
"errors"
|
||||
"log"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
func (api *API) CheckToken(w http.ResponseWriter, r *http.Request, token string) error {
|
||||
if token != api.token {
|
||||
err := errors.New("token not matched")
|
||||
log.Println("[api] [Warning] Token not matched", token)
|
||||
api.HandleErrorCode(w, r, err, 403)
|
||||
return err
|
||||
}
|
||||
log.Println("[api] Token passed")
|
||||
return nil
|
||||
}
|
||||
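`CheckToken` is the guard the privileged handlers are expected to call before doing any work; on a mismatch it has already written the 403 response, so the caller only has to return. Below is a hedged sketch of that calling pattern as a fragment of this `api` package (it relies on the package's existing `encoding/json` and `net/http` imports); `WalkRequest` and the `/walk` wiring are assumptions based on the front-end's `/api/v1/walk` payload, not code shown in this diff.

```go
// Hypothetical caller of CheckToken; WalkRequest and its fields are an
// assumption modelled on the front-end's /api/v1/walk request body.
type WalkRequest struct {
	Token   string   `json:"token"`
	Root    string   `json:"root"`
	Pattern []string `json:"pattern"`
}

func (api *API) handleWalkSketch(w http.ResponseWriter, r *http.Request) {
	walkRequest := &WalkRequest{}
	if err := json.NewDecoder(r.Body).Decode(walkRequest); err != nil {
		api.HandleError(w, r, err)
		return
	}
	// Reject the request early if the token does not match.
	if err := api.CheckToken(w, r, walkRequest.Token); err != nil {
		return // CheckToken has already written the 403 response
	}
	if err := api.Db.Walk(walkRequest.Root, walkRequest.Pattern); err != nil {
		api.HandleError(w, r, err)
		return
	}
}
```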
36
pkg/database/database.go
Normal file
@@ -0,0 +1,36 @@
|
||||
package database
|
||||
|
||||
import (
|
||||
"database/sql"
|
||||
|
||||
_ "github.com/mattn/go-sqlite3"
|
||||
)
|
||||
|
||||
type Database struct {
|
||||
sqlConn *sql.DB
|
||||
stmt *Stmt
|
||||
}
|
||||
|
||||
func NewDatabase(dbName string) (*Database, error) {
|
||||
var err error
|
||||
|
||||
// open database
|
||||
sqlConn, err := sql.Open("sqlite3", dbName)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// prepare statement
|
||||
stmt, err := NewPreparedStatement(sqlConn)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// new database
|
||||
database := &Database{
|
||||
sqlConn: sqlConn,
|
||||
stmt: stmt,
|
||||
}
|
||||
|
||||
return database, nil
|
||||
}
|
||||
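`NewDatabase` opens the SQLite file and immediately prepares every statement, so a malformed query fails at startup rather than on the first request. A minimal usage sketch; the database file name is a placeholder.

```go
// Minimal usage sketch; the database file name is a placeholder.
package main

import (
	"log"

	"msw-open-music/pkg/database"
)

func main() {
	// sql.Open itself is lazy; the connection is first exercised when
	// NewDatabase prepares and runs the CREATE TABLE statements.
	db, err := database.NewDatabase("msw-open-music.db")
	if err != nil {
		log.Fatal(err)
	}
	_ = db // ready for Walk / SearchFiles / GetFile calls
}
```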
241
pkg/database/method.go
Normal file
@@ -0,0 +1,241 @@
|
||||
package database
|
||||
|
||||
import (
|
||||
"errors"
|
||||
"log"
|
||||
"os"
|
||||
"path/filepath"
|
||||
)
|
||||
|
||||
func (database *Database) InsertFeedback(time int64, feedback string, header string) error {
|
||||
_, err := database.stmt.insertFeedback.Exec(time, feedback, header)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (database *Database) GetRandomFiles(limit int64) ([]File, error) {
|
||||
rows, err := database.stmt.getRandomFiles.Query(limit)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
defer rows.Close()
|
||||
files := make([]File, 0)
|
||||
for rows.Next() {
|
||||
file := File{
|
||||
Db: database,
|
||||
}
|
||||
err = rows.Scan(&file.ID, &file.Folder_id, &file.Filename, &file.Foldername, &file.Filesize)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
files = append(files, file)
|
||||
}
|
||||
return files, nil
|
||||
}
|
||||
|
||||
func (database *Database) GetFilesInFolder(folder_id int64, limit int64, offset int64) ([]File, error) {
|
||||
rows, err := database.stmt.getFilesInFolder.Query(folder_id, limit, offset)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
defer rows.Close()
|
||||
files := make([]File, 0)
|
||||
for rows.Next() {
|
||||
file := File{
|
||||
Db: database,
|
||||
Folder_id: folder_id,
|
||||
}
|
||||
err = rows.Scan(&file.ID, &file.Filename, &file.Filesize, &file.Foldername)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
files = append(files, file)
|
||||
}
|
||||
return files, nil
|
||||
}
|
||||
|
||||
func (database *Database) SearchFolders(foldername string, limit int64, offset int64) ([]Folder, error) {
|
||||
rows, err := database.stmt.searchFolders.Query("%"+foldername+"%", limit, offset)
|
||||
if err != nil {
|
||||
return nil, errors.New("Error searching folders at query " + err.Error())
|
||||
}
|
||||
defer rows.Close()
|
||||
folders := make([]Folder, 0)
|
||||
for rows.Next() {
|
||||
folder := Folder{
|
||||
Db: database,
|
||||
}
|
||||
err = rows.Scan(&folder.ID, &folder.Folder, &folder.Foldername)
|
||||
if err != nil {
|
||||
return nil, errors.New("Error scanning SearchFolders: " + err.Error())
|
||||
}
|
||||
folders = append(folders, folder)
|
||||
}
|
||||
return folders, nil
|
||||
}
|
||||
|
||||
func (database *Database) GetFile(id int64) (*File, error) {
|
||||
file := &File{
|
||||
Db: database,
|
||||
}
|
||||
err := database.stmt.getFile.QueryRow(id).Scan(&file.ID, &file.Folder_id, &file.Filename, &file.Foldername, &file.Filesize)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return file, nil
|
||||
}
|
||||
|
||||
func (database *Database) ResetFiles() error {
|
||||
log.Println("[db] Reset files")
|
||||
var err error
|
||||
_, err = database.stmt.dropFiles.Exec()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
_, err = database.stmt.initFilesTable.Exec()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return err
|
||||
}
|
||||
|
||||
func (database *Database) ResetFolder() error {
|
||||
log.Println("[db] Reset folders")
|
||||
var err error
|
||||
_, err = database.stmt.dropFolder.Exec()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
_, err = database.stmt.initFoldersTable.Exec()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return err
|
||||
}
|
||||
|
||||
func (database *Database) Walk(root string, pattern []string) error {
|
||||
patternDict := make(map[string]bool)
|
||||
for _, v := range pattern {
|
||||
patternDict[v] = true
|
||||
}
|
||||
log.Println("[db] Walk", root, patternDict)
|
||||
return filepath.Walk(root, func(path string, info os.FileInfo, err error) error {
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
if info.IsDir() {
|
||||
return nil
|
||||
}
|
||||
|
||||
// check pattern
|
||||
ext := filepath.Ext(info.Name())
|
||||
if _, ok := patternDict[ext]; !ok {
|
||||
return nil
|
||||
}
|
||||
|
||||
// insert the file; its folder is created automatically if missing
|
||||
err = database.Insert(path, info.Size())
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return nil
|
||||
})
|
||||
}
|
||||
|
||||
func (database *Database) GetFolder(folderId int64) (*Folder, error) {
|
||||
folder := &Folder{
|
||||
Db: database,
|
||||
}
|
||||
err := database.stmt.getFolder.QueryRow(folderId).Scan(&folder.Folder)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return folder, nil
|
||||
}
|
||||
|
||||
func (database *Database) SearchFiles(filename string, limit int64, offset int64) ([]File, error) {
|
||||
rows, err := database.stmt.searchFiles.Query("%"+filename+"%", limit, offset)
|
||||
if err != nil {
|
||||
return nil, errors.New("Error searching files at query " + err.Error())
|
||||
}
|
||||
defer rows.Close()
|
||||
files := make([]File, 0)
|
||||
for rows.Next() {
|
||||
var file File = File{
|
||||
Db: database,
|
||||
}
|
||||
err = rows.Scan(&file.ID, &file.Folder_id, &file.Filename, &file.Foldername, &file.Filesize)
|
||||
if err != nil {
|
||||
return nil, errors.New("Error scanning SearchFiles " + err.Error())
|
||||
}
|
||||
files = append(files, file)
|
||||
}
|
||||
if err = rows.Err(); err != nil {
|
||||
return nil, errors.New("Error scanning SearchFiles, rows ended before the full result was read: " + err.Error())
|
||||
}
|
||||
return files, nil
|
||||
}
|
||||
|
||||
func (database *Database) FindFolder(folder string) (int64, error) {
|
||||
var id int64
|
||||
err := database.stmt.findFolder.QueryRow(folder).Scan(&id)
|
||||
if err != nil {
|
||||
return 0, err
|
||||
}
|
||||
return id, nil
|
||||
}
|
||||
|
||||
func (database *Database) FindFile(folderId int64, filename string) (int64, error) {
|
||||
var id int64
|
||||
err := database.stmt.findFile.QueryRow(folderId, filename).Scan(&id)
|
||||
if err != nil {
|
||||
return 0, err
|
||||
}
|
||||
return id, nil
|
||||
}
|
||||
|
||||
func (database *Database) InsertFolder(folder string) (int64, error) {
|
||||
result, err := database.stmt.insertFolder.Exec(folder, filepath.Base(folder))
|
||||
if err != nil {
|
||||
return 0, err
|
||||
}
|
||||
lastInsertId, err := result.LastInsertId()
|
||||
if err != nil {
|
||||
return 0, err
|
||||
}
|
||||
return lastInsertId, nil
|
||||
}
|
||||
|
||||
func (database *Database) InsertFile(folderId int64, filename string, filesize int64) error {
|
||||
_, err := database.stmt.insertFile.Exec(folderId, filename, filesize)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (database *Database) Insert(path string, filesize int64) error {
|
||||
folder, filename := filepath.Split(path)
|
||||
folderId, err := database.FindFolder(folder)
|
||||
if err != nil {
|
||||
folderId, err = database.InsertFolder(folder)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
// if file exists, skip it
|
||||
_, err = database.FindFile(folderId, filename)
|
||||
if err == nil {
|
||||
return nil
|
||||
}
|
||||
|
||||
err = database.InsertFile(folderId, filename, filesize)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return nil
|
||||
}
|
||||
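Taken together, `Walk` scans a directory tree, inserts every file whose extension matches one of the patterns (creating the folder row on demand) and skips files that are already recorded, so re-running an import is safe. A short end-to-end sketch; the root path and database file are placeholders, and the extension list mirrors the one used on the Manage page.

```go
// Sketch of importing a music folder and searching it; paths are placeholders.
package main

import (
	"fmt"
	"log"

	"msw-open-music/pkg/database"
)

func main() {
	db, err := database.NewDatabase("msw-open-music.db")
	if err != nil {
		log.Fatal(err)
	}

	// Index every matching file under the root; existing entries are skipped.
	if err := db.Walk("/music", []string{".flac", ".mp3", ".wav"}); err != nil {
		log.Fatal(err)
	}

	// Search by (partial) filename, 10 results per page.
	files, err := db.SearchFiles("letter", 10, 0)
	if err != nil {
		log.Fatal(err)
	}
	for _, f := range files {
		fmt.Println(f.ID, f.Foldername, f.Filename)
	}
}
```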
215
pkg/database/sql_stmt.go
Normal file
@@ -0,0 +1,215 @@
|
||||
package database
|
||||
|
||||
import (
|
||||
"database/sql"
|
||||
)
|
||||
|
||||
var initFilesTableQuery = `CREATE TABLE IF NOT EXISTS files (
|
||||
id INTEGER PRIMARY KEY,
|
||||
folder_id INTEGER NOT NULL,
|
||||
filename TEXT NOT NULL,
|
||||
filesize INTEGER NOT NULL
|
||||
);`
|
||||
|
||||
var initFoldersTableQuery = `CREATE TABLE IF NOT EXISTS folders (
|
||||
id INTEGER PRIMARY KEY,
|
||||
folder TEXT NOT NULL,
|
||||
foldername TEXT NOT NULL
|
||||
);`
|
||||
|
||||
var initFeedbacksTableQuery = `CREATE TABLE IF NOT EXISTS feedbacks (
|
||||
id INTEGER PRIMARY KEY,
|
||||
time INTEGER NOT NULL,
|
||||
feedback TEXT NOT NULL,
|
||||
header TEXT NOT NULL
|
||||
);`
|
||||
|
||||
var insertFolderQuery = `INSERT INTO folders (folder, foldername)
|
||||
VALUES (?, ?);`
|
||||
|
||||
var findFolderQuery = `SELECT id FROM folders WHERE folder = ? LIMIT 1;`
|
||||
|
||||
var findFileQuery = `SELECT id FROM files WHERE folder_id = ? AND filename = ? LIMIT 1;`
|
||||
|
||||
var insertFileQuery = `INSERT INTO files (folder_id, filename, filesize)
|
||||
VALUES (?, ?, ?);`
|
||||
|
||||
var searchFilesQuery = `SELECT
|
||||
files.id, files.folder_id, files.filename, folders.foldername, files.filesize
|
||||
FROM files
|
||||
JOIN folders ON files.folder_id = folders.id
|
||||
WHERE filename LIKE ?
|
||||
LIMIT ? OFFSET ?;`
|
||||
|
||||
var getFolderQuery = `SELECT folder FROM folders WHERE id = ? LIMIT 1;`
|
||||
|
||||
var dropFilesQuery = `DROP TABLE files;`
|
||||
|
||||
var dropFolderQuery = `DROP TABLE folders;`
|
||||
|
||||
var getFileQuery = `SELECT
|
||||
files.id, files.folder_id, files.filename, folders.foldername, files.filesize
|
||||
FROM files
|
||||
JOIN folders ON files.folder_id = folders.id
|
||||
WHERE files.id = ?
|
||||
LIMIT 1;`
|
||||
|
||||
var searchFoldersQuery = `SELECT
|
||||
id, folder, foldername
|
||||
FROM folders
|
||||
WHERE foldername LIKE ?
|
||||
LIMIT ? OFFSET ?;`
|
||||
|
||||
var getFilesInFolderQuery = `SELECT
|
||||
files.id, files.filename, files.filesize, folders.foldername
|
||||
FROM files
|
||||
JOIN folders ON files.folder_id = folders.id
|
||||
WHERE folder_id = ?
|
||||
LIMIT ? OFFSET ?;`
|
||||
|
||||
var getRandomFilesQuery = `SELECT
|
||||
files.id, files.folder_id, files.filename, folders.foldername, files.filesize
|
||||
FROM files
|
||||
JOIN folders ON files.folder_id = folders.id
|
||||
ORDER BY RANDOM()
|
||||
LIMIT ?;`
|
||||
|
||||
var insertFeedbackQuery = `INSERT INTO feedbacks (time, feedback, header)
|
||||
VALUES (?, ?, ?);`
|
||||
|
||||
type Stmt struct {
|
||||
initFilesTable *sql.Stmt
|
||||
initFoldersTable *sql.Stmt
|
||||
initFeedbacksTable *sql.Stmt
|
||||
insertFolder *sql.Stmt
|
||||
insertFile *sql.Stmt
|
||||
findFolder *sql.Stmt
|
||||
findFile *sql.Stmt
|
||||
searchFiles *sql.Stmt
|
||||
getFolder *sql.Stmt
|
||||
dropFiles *sql.Stmt
|
||||
dropFolder *sql.Stmt
|
||||
getFile *sql.Stmt
|
||||
searchFolders *sql.Stmt
|
||||
getFilesInFolder *sql.Stmt
|
||||
getRandomFiles *sql.Stmt
|
||||
insertFeedback *sql.Stmt
|
||||
}
|
||||
|
||||
func NewPreparedStatement(sqlConn *sql.DB) (*Stmt, error) {
|
||||
var err error
|
||||
|
||||
stmt := &Stmt{}
|
||||
|
||||
// init files table
|
||||
stmt.initFilesTable, err = sqlConn.Prepare(initFilesTableQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init folders table
|
||||
stmt.initFoldersTable, err = sqlConn.Prepare(initFoldersTableQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init feedbacks table
|
||||
stmt.initFeedbacksTable, err = sqlConn.Prepare(initFeedbacksTableQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// run init statement
|
||||
_, err = stmt.initFilesTable.Exec()
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
_, err = stmt.initFoldersTable.Exec()
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
_, err = stmt.initFeedbacksTable.Exec()
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init insert folder statement
|
||||
stmt.insertFolder, err = sqlConn.Prepare(insertFolderQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init findFolder statement
|
||||
stmt.findFolder, err = sqlConn.Prepare(findFolderQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init findFile statement
|
||||
stmt.findFile, err = sqlConn.Prepare(findFileQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init insertFile stmt
|
||||
stmt.insertFile, err = sqlConn.Prepare(insertFileQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init searchFile stmt
|
||||
stmt.searchFiles, err = sqlConn.Prepare(searchFilesQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init getFolder stmt
|
||||
stmt.getFolder, err = sqlConn.Prepare(getFolderQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init dropFolder stmt
|
||||
stmt.dropFolder, err = sqlConn.Prepare(dropFolderQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init dropFiles stmt
|
||||
stmt.dropFiles, err = sqlConn.Prepare(dropFilesQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init getFile stmt
|
||||
stmt.getFile, err = sqlConn.Prepare(getFileQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init searchFolder stmt
|
||||
stmt.searchFolders, err = sqlConn.Prepare(searchFoldersQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init getFilesInFolder stmt
|
||||
stmt.getFilesInFolder, err = sqlConn.Prepare(getFilesInFolderQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init getRandomFiles
|
||||
stmt.getRandomFiles, err = sqlConn.Prepare(getRandomFilesQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// init insertFeedback
|
||||
stmt.insertFeedback, err = sqlConn.Prepare(insertFeedbackQuery)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
return stmt, err
|
||||
}
|
||||
30
pkg/database/struct.go
Normal file
@@ -0,0 +1,30 @@
|
||||
package database
|
||||
|
||||
import (
|
||||
"path/filepath"
|
||||
)
|
||||
|
||||
type File struct {
|
||||
Db *Database `json:"-"`
|
||||
ID int64 `json:"id"`
|
||||
Folder_id int64 `json:"folder_id"`
|
||||
Foldername string `json:"foldername"`
|
||||
Filename string `json:"filename"`
|
||||
Filesize int64 `json:"filesize"`
|
||||
}
|
||||
|
||||
type Folder struct {
|
||||
Db *Database `json:"-"`
|
||||
ID int64 `json:"id"`
|
||||
Folder string `json:"-"`
|
||||
Foldername string `json:"foldername"`
|
||||
}
|
||||
|
||||
func (f *File) Path() (string, error) {
|
||||
folder, err := f.Db.GetFolder(f.Folder_id)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
return filepath.Join(folder.Folder, f.Filename), nil
|
||||
}
|
||||
|
||||
@@ -1,39 +1,6 @@
|
||||
# msw-open-music web front-end
|
||||
# Getting Started with Create React App
|
||||
|
||||
This msw-open-music project was bootstrapped with `Create React App`.
|
||||
|
||||
## Group 9 information
|
||||
|
||||
| Name | Name (EN) | No |
|
||||
| ------ | ------------- | ---------- |
|
||||
| 陈永源 | CHEN Yongyuan | 1930006025 |
|
||||
| 鲁雷 | Lu Lei | 2030026101 |
|
||||
| 张滨玮 | Zhang Binwei | 2030026197 |
|
||||
| 丁俊超 | Ding Junchao | 2030026258 |
|
||||
| 邱星越 | Qiu Xingyue | 2030026119 |
|
||||
| 李真晔 | Li Zhenye | 2030006104 |
|
||||
|
||||
|
||||
|
||||
## URL References
|
||||
|
||||
- `/#/` Default home page. Generate random files.
|
||||
- `/#/search-files` Search files
|
||||
- `/#/search-folders` Search folders
|
||||
- `/#/folder/:id` Show files in the folder
|
||||
- `/#/file/:id` Show file's information
|
||||
- `/#/file/:id/share` Share a specific file
|
||||
- `/#/file/:id/review` Review a file
|
||||
- `/#/manage` Manage system setting and status
|
||||
- `/#/login` Login
|
||||
- `/#/register/` Register
|
||||
- `/#/profile/:id` Profile of a user
|
||||
|
||||
## HOW TO DEPLOY?
|
||||
|
||||
A live demo is available at <https://demo.uicbbs.com>.
|
||||
|
||||
Copy the contents of the `build` folder to the web root of your HTTP server (Apache, nginx, Caddy, etc.).
|
||||
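For a quick local check of the build output, a plain static file server is enough, because the app routes with `HashRouter` and needs no server-side rewrites; the directory and port below are placeholders.

```go
// Serve the React build output locally; directory and port are placeholders.
package main

import (
	"log"
	"net/http"
)

func main() {
	fs := http.FileServer(http.Dir("./build"))
	log.Println("serving ./build on :3000")
	log.Fatal(http.ListenAndServe(":3000", fs))
}
```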
This project was bootstrapped with [Create React App](https://github.com/facebook/create-react-app).
|
||||
|
||||
## Available Scripts
|
||||
|
||||
@@ -47,6 +14,11 @@ Open [http://localhost:3000](http://localhost:3000) to view it in the browser.
|
||||
The page will reload if you make edits.\
|
||||
You will also see any lint errors in the console.
|
||||
|
||||
### `npm test`
|
||||
|
||||
Launches the test runner in the interactive watch mode.\
|
||||
See the section about [running tests](https://facebook.github.io/create-react-app/docs/running-tests) for more information.
|
||||
|
||||
### `npm run build`
|
||||
|
||||
Builds the app for production to the `build` folder.\
|
||||
@@ -56,3 +28,43 @@ The build is minified and the filenames include the hashes.\
|
||||
Your app is ready to be deployed!
|
||||
|
||||
See the section about [deployment](https://facebook.github.io/create-react-app/docs/deployment) for more information.
|
||||
|
||||
### `npm run eject`
|
||||
|
||||
**Note: this is a one-way operation. Once you `eject`, you can’t go back!**
|
||||
|
||||
If you aren’t satisfied with the build tool and configuration choices, you can `eject` at any time. This command will remove the single build dependency from your project.
|
||||
|
||||
Instead, it will copy all the configuration files and the transitive dependencies (webpack, Babel, ESLint, etc) right into your project so you have full control over them. All of the commands except `eject` will still work, but they will point to the copied scripts so you can tweak them. At this point you’re on your own.
|
||||
|
||||
You don’t have to ever use `eject`. The curated feature set is suitable for small and middle deployments, and you shouldn’t feel obligated to use this feature. However we understand that this tool wouldn’t be useful if you couldn’t customize it when you are ready for it.
|
||||
|
||||
## Learn More
|
||||
|
||||
You can learn more in the [Create React App documentation](https://facebook.github.io/create-react-app/docs/getting-started).
|
||||
|
||||
To learn React, check out the [React documentation](https://reactjs.org/).
|
||||
|
||||
### Code Splitting
|
||||
|
||||
This section has moved here: [https://facebook.github.io/create-react-app/docs/code-splitting](https://facebook.github.io/create-react-app/docs/code-splitting)
|
||||
|
||||
### Analyzing the Bundle Size
|
||||
|
||||
This section has moved here: [https://facebook.github.io/create-react-app/docs/analyzing-the-bundle-size](https://facebook.github.io/create-react-app/docs/analyzing-the-bundle-size)
|
||||
|
||||
### Making a Progressive Web App
|
||||
|
||||
This section has moved here: [https://facebook.github.io/create-react-app/docs/making-a-progressive-web-app](https://facebook.github.io/create-react-app/docs/making-a-progressive-web-app)
|
||||
|
||||
### Advanced Configuration
|
||||
|
||||
This section has moved here: [https://facebook.github.io/create-react-app/docs/advanced-configuration](https://facebook.github.io/create-react-app/docs/advanced-configuration)
|
||||
|
||||
### Deployment
|
||||
|
||||
This section has moved here: [https://facebook.github.io/create-react-app/docs/deployment](https://facebook.github.io/create-react-app/docs/deployment)
|
||||
|
||||
### `npm run build` fails to minify
|
||||
|
||||
This section has moved here: [https://facebook.github.io/create-react-app/docs/troubleshooting#npm-run-build-fails-to-minify](https://facebook.github.io/create-react-app/docs/troubleshooting#npm-run-build-fails-to-minify)
|
||||
|
||||
@@ -15,18 +15,10 @@ body {
|
||||
box-shadow: 0 0 8px #393939;
|
||||
border-radius: 6px 6px 0 0;
|
||||
}
|
||||
.avatar {
|
||||
border-radius: 50%;
|
||||
background-color: lightpink;
|
||||
padding: 0.39rem;
|
||||
}
|
||||
.title {
|
||||
margin-left: 1em;
|
||||
margin-right: 1em;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
vertical-align: middle;
|
||||
justify-content: space-between;
|
||||
}
|
||||
.title-text {
|
||||
margin-left: 1em;
|
||||
|
||||
@@ -1,24 +1,22 @@
|
||||
import { HashRouter as Router, Routes, Route, NavLink } from "react-router-dom";
|
||||
import {
|
||||
HashRouter as Router,
|
||||
Routes,
|
||||
Route,
|
||||
NavLink,
|
||||
} from "react-router-dom";
|
||||
import "./App.css";
|
||||
|
||||
import GetRandomFiles from "./component/GetRandomFiles";
|
||||
import SearchFiles from "./component/SearchFiles";
|
||||
import SearchFolders from "./component/SearchFolders";
|
||||
import FilesInFolder from "./component/FilesInFolder";
|
||||
import Manage from "./component/Manage";
|
||||
import Share from "./component/Share";
|
||||
import AudioPlayer from "./component/AudioPlayer";
|
||||
import FilesInFolder from "./component/FilesInFolder";
|
||||
import FileInfo from "./component/FileInfo";
|
||||
import Review from "./component/Review";
|
||||
import Profile from "./component/Profile";
|
||||
import User from "./component/User";
|
||||
import Login from "./component/Login";
|
||||
import Register from "./component/Register";
|
||||
import { useState } from "react";
|
||||
|
||||
function App() {
|
||||
const [playingFile, setPlayingFile] = useState({});
|
||||
const [user, setUser] = useState(null);
|
||||
return (
|
||||
<div className="base">
|
||||
<Router>
|
||||
@@ -26,16 +24,15 @@ function App() {
|
||||
<h3 className="title">
|
||||
<img src="favicon.png" alt="logo" className="logo" />
|
||||
<span className="title-text">MSW Open Music Project</span>
|
||||
<User user={user} setUser={setUser} />
|
||||
</h3>
|
||||
<nav className="nav">
|
||||
<NavLink to="/" className="nav-link">
|
||||
Feeling lucky
|
||||
</NavLink>
|
||||
<NavLink to="/search-files" className="nav-link">
|
||||
<NavLink to="/files" className="nav-link">
|
||||
Files
|
||||
</NavLink>
|
||||
<NavLink to="/search-folders" className="nav-link">
|
||||
<NavLink to="/folders" className="nav-link">
|
||||
Folders
|
||||
</NavLink>
|
||||
<NavLink to="/manage" className="nav-link">
|
||||
@@ -51,30 +48,22 @@ function App() {
|
||||
element={<GetRandomFiles setPlayingFile={setPlayingFile} />}
|
||||
/>
|
||||
<Route
|
||||
path="/search-files"
|
||||
path="/files"
|
||||
element={<SearchFiles setPlayingFile={setPlayingFile} />}
|
||||
/>
|
||||
<Route
|
||||
path="/search-folders"
|
||||
path="/folders"
|
||||
element={<SearchFolders setPlayingFile={setPlayingFile} />}
|
||||
/>
|
||||
<Route
|
||||
path="/folder/:id"
|
||||
path="/folders/:id"
|
||||
element={<FilesInFolder setPlayingFile={setPlayingFile} />}
|
||||
/>
|
||||
<Route path="/manage" element={<Manage />} />
|
||||
<Route
|
||||
path="/file/:id/share"
|
||||
path="/files/:id/share"
|
||||
element={<Share setPlayingFile={setPlayingFile} />}
|
||||
/>
|
||||
<Route path="/file/:id/review" element={<Review />} />
|
||||
<Route
|
||||
path="/profile/:id"
|
||||
element={<Profile user={user} setUser={setUser} />}
|
||||
/>
|
||||
<Route path="/login" element={<Login setUser={setUser} />} />
|
||||
<Route path="/register" element={<Register setUser={setUser} />} />
|
||||
<Route path="/file/:id" element={<FileInfo />} />
|
||||
</Routes>
|
||||
</main>
|
||||
<footer>
|
||||
|
||||
@@ -80,7 +80,7 @@ function AudioPlayer(props) {
|
||||
|
||||
<button
|
||||
onClick={() =>
|
||||
navigate(`search-folders/${props.playingFile.folder_id}`)
|
||||
navigate(`/folders/${props.playingFile.folder_id}`)
|
||||
}
|
||||
>
|
||||
{props.playingFile.foldername}
|
||||
|
||||
@@ -1,5 +1,4 @@
|
||||
import { useEffect, useState } from "react";
|
||||
import getFfmpegConfigListRespondExample from "../example-respond/get_ffmpeg_config_list.json"
|
||||
|
||||
function FfmpegConfig(props) {
|
||||
// props.setSelectedFfmpegConfig
|
||||
@@ -8,8 +7,17 @@ function FfmpegConfig(props) {
|
||||
const [ffmpegConfigList, setFfmpegConfigList] = useState([]);
|
||||
|
||||
useEffect(() => {
|
||||
setFfmpegConfigList(getFfmpegConfigListRespondExample.ffmpeg_config_list);
|
||||
props.setSelectedFfmpegConfig(getFfmpegConfigListRespondExample.ffmpeg_config_list[0]);
|
||||
fetch("/api/v1/get_ffmpeg_config_list")
|
||||
.then((response) => response.json())
|
||||
.then((data) => {
|
||||
setFfmpegConfigList(data.ffmpeg_config_list);
|
||||
if (data.ffmpeg_config_list.length > 0) {
|
||||
props.setSelectedFfmpegConfig(data.ffmpeg_config_list[0]);
|
||||
}
|
||||
})
|
||||
.catch((error) => {
|
||||
alert("get_ffmpeg_config_list error: " + error);
|
||||
});
|
||||
}, []);
|
||||
|
||||
return (
|
||||
|
||||
@@ -15,11 +15,14 @@ function FileDialog(props) {
|
||||
<dialog open={props.showStatus}>
|
||||
<p>{props.file.filename}</p>
|
||||
<p>
|
||||
Download using browser
|
||||
Download: save the original file with the browser
|
||||
<br />
|
||||
Play on the web page
|
||||
Play: play it with the in-page web player
|
||||
<br />
|
||||
</p>
|
||||
<a href={downloadURL} download>
|
||||
<button>Download</button>
|
||||
</a>
|
||||
<button
|
||||
onClick={() => {
|
||||
props.setPlayingFile(props.file);
|
||||
@@ -30,10 +33,11 @@ function FileDialog(props) {
|
||||
</button>
|
||||
<button
|
||||
onClick={() => {
|
||||
navigate(`/file/${props.file.id}`);
|
||||
navigate(`/files/${props.file.id}/share`);
|
||||
props.setShowStatus(false);
|
||||
}}
|
||||
>
|
||||
Info
|
||||
Share
|
||||
</button>
|
||||
<button onClick={() => props.setShowStatus(false)}>Close</button>
|
||||
</dialog>
|
||||
|
||||
@@ -25,7 +25,7 @@ function FileEntry(props) {
|
||||
</td>
|
||||
<td
|
||||
className="clickable"
|
||||
onClick={() => navigate(`/folder/${props.file.folder_id}`)}
|
||||
onClick={() => navigate(`/folders/${props.file.folder_id}`)}
|
||||
>
|
||||
{props.file.foldername}
|
||||
</td>
|
||||
|
||||
@@ -1,69 +0,0 @@
|
||||
import { useParams, Link, useNavigate } from "react-router-dom";
|
||||
|
||||
function FileInfo() {
|
||||
let params = useParams();
|
||||
let navigate = useNavigate();
|
||||
return (
|
||||
<div className="page">
|
||||
<h3>File Information</h3>
|
||||
<span>
|
||||
<button>Download</button>
|
||||
<button
|
||||
onClick={() => {
|
||||
navigate("/file/" + params.id + '/share');
|
||||
}}
|
||||
>
|
||||
Share
|
||||
</button>
|
||||
<button
|
||||
onClick={() => {
|
||||
navigate("/file/" + params.id + '/review');
|
||||
}}
|
||||
>Review</button>
|
||||
</span>
|
||||
<table>
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Name</th>
|
||||
<th>Value</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody>
|
||||
<tr>
|
||||
<td>File Name</td>
|
||||
<td>{params.id}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>File Size</td>
|
||||
<td>123456</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>File Type</td>
|
||||
<td>media/aac</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Last Modified</td>
|
||||
<td>2020-01-01</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Import by</td>
|
||||
<td>
|
||||
<Link to="/profile/3">@admin</Link>
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Import Date</td>
|
||||
<td>2020-01-01</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Location</td>
|
||||
<td>/data/media/aac</td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
<button>Update</button>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
export default FileInfo;
|
||||
@@ -1,13 +1,58 @@
|
||||
import FilesTable from "./FilesTable";
|
||||
import searchFilesRespondExample from "../example-respond/search_files.json"
|
||||
import { useParams } from "react-router";
|
||||
import { useState, useEffect } from "react";
|
||||
import FilesTable from "./FilesTable";
|
||||
|
||||
function FilesInFolder(props) {
|
||||
let params = useParams();
|
||||
const [files, setFiles] = useState([]);
|
||||
const [isLoading, setIsLoading] = useState(false);
|
||||
const [offset, setOffset] = useState(0);
|
||||
const limit = 10;
|
||||
|
||||
useEffect(() => {
|
||||
setIsLoading(true);
|
||||
fetch("/api/v1/get_files_in_folder", {
|
||||
method: "POST",
|
||||
headers: { "Content-Type": "application/json" },
|
||||
body: JSON.stringify({
|
||||
folder_id: parseInt(params.id),
|
||||
offset: offset,
|
||||
limit: limit,
|
||||
}),
|
||||
})
|
||||
.then((response) => response.json())
|
||||
.then((data) => {
|
||||
setFiles(data.files ? data.files : []);
|
||||
})
|
||||
.catch((error) => alert(error))
|
||||
.finally(() => {
|
||||
setIsLoading(false);
|
||||
});
|
||||
}, [params.id, offset]);
|
||||
|
||||
function nextPage() {
|
||||
setOffset(offset + limit);
|
||||
}
|
||||
|
||||
function lastPage() {
|
||||
const offsetValue = offset - limit;
|
||||
if (offsetValue < 0) {
|
||||
return;
|
||||
}
|
||||
setOffset(offsetValue);
|
||||
}
|
||||
|
||||
return (
|
||||
<div>
|
||||
<h3>Files in folder id {params.id}</h3>
|
||||
<FilesTable setPlayingFile={props.setPlayingFile} files={searchFilesRespondExample.files} />
|
||||
<div className="page">
|
||||
<h3>Files in Folder</h3>
|
||||
<div className="search_toolbar">
|
||||
<button onClick={lastPage}>Last page</button>
|
||||
<button disabled>
|
||||
{isLoading ? "Loading..." : `${offset} - ${offset + files.length}`}
|
||||
</button>
|
||||
<button onClick={nextPage}>Next page</button>
|
||||
</div>
|
||||
<FilesTable setPlayingFile={props.setPlayingFile} files={files} />
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
@@ -1,6 +1,9 @@
|
||||
import FileEntry from "./FileEntry";
|
||||
|
||||
function FilesTable(props) {
|
||||
if (props.files.length === 0) {
|
||||
return null;
|
||||
}
|
||||
return (
|
||||
<table>
|
||||
<thead>
|
||||
|
||||
@@ -2,6 +2,9 @@ import { useNavigate } from "react-router";
|
||||
|
||||
function FoldersTable(props) {
|
||||
let navigate = useNavigate();
|
||||
if (props.folders.length === 0) {
|
||||
return null;
|
||||
}
|
||||
return (
|
||||
<table>
|
||||
<thead>
|
||||
@@ -14,12 +17,12 @@ function FoldersTable(props) {
|
||||
{props.folders.map((folder) => (
|
||||
<tr key={folder.id}>
|
||||
<td
|
||||
onClick={() => navigate(`/folder/${folder.id}`)}
|
||||
onClick={() => navigate(`/folders/${folder.id}`)}
|
||||
className="clickable"
|
||||
>
|
||||
{folder.foldername}
|
||||
</td>
|
||||
<td onClick={() => navigate(`/folder/${folder.id}`)}>
|
||||
<td onClick={() => navigate(`/folders/${folder.id}`)}>
|
||||
<button>View</button>
|
||||
</td>
|
||||
</tr>
|
||||
|
||||
@@ -1,13 +1,23 @@
|
||||
import { useEffect, useState } from "react";
|
||||
import FilesTable from "./FilesTable";
|
||||
import getRandomFilesRespondExample from "../example-respond/get_random_files.json"
|
||||
|
||||
function GetRandomFiles(props) {
|
||||
const [files, setFiles] = useState([]);
|
||||
const [isLoading, setIsLoading] = useState(false);
|
||||
|
||||
function refresh(setFiles) {
|
||||
setFiles(getRandomFilesRespondExample.files);
|
||||
setIsLoading(true);
|
||||
fetch("/api/v1/get_random_files")
|
||||
.then((response) => response.json())
|
||||
.then((data) => {
|
||||
setFiles(data.files);
|
||||
})
|
||||
.catch((error) => {
|
||||
alert("get_random_files error: " + error);
|
||||
})
|
||||
.finally(() => {
|
||||
setIsLoading(false);
|
||||
});
|
||||
}
|
||||
|
||||
useEffect(() => {
|
||||
|
||||
@@ -1,48 +0,0 @@
|
||||
import { useNavigate } from "react-router-dom";
|
||||
import { useState } from "react";
|
||||
|
||||
function Login(props) {
|
||||
let navigate = useNavigate();
|
||||
let [username, setUsername] = useState("");
|
||||
let [password, setPassword] = useState("");
|
||||
return (
|
||||
<div>
|
||||
<h1>Login</h1>
|
||||
<label htmlFor="username"></label>
|
||||
<input
|
||||
type="text"
|
||||
id="username"
|
||||
value={username}
|
||||
onChange={(e) => setUsername(e.target.value)}
|
||||
/>
|
||||
<label htmlFor="password">Password</label>
|
||||
<input
|
||||
type="password"
|
||||
id="password"
|
||||
value={password}
|
||||
onChange={(e) => setPassword(e.target.value)}
|
||||
/>
|
||||
<span>
|
||||
<button
|
||||
onClick={() => {
|
||||
if (!username || !password) {
|
||||
alert("Please enter username and password");
|
||||
return;
|
||||
}
|
||||
props.setUser({ id: 123, username: username, password: password });
|
||||
navigate("/");
|
||||
}}
|
||||
>
|
||||
Login
|
||||
</button>
|
||||
<button
|
||||
onClick={() => {
|
||||
navigate("/register");
|
||||
}}
|
||||
>Register</button>
|
||||
</span>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
export default Login;
|
||||
@@ -1,102 +1,49 @@
|
||||
import getFfmpegConfigListRespondExample from "../example-respond/get_ffmpeg_config_list.json";
|
||||
import { useState } from "react";
|
||||
|
||||
function Manage() {
|
||||
const [token, setToken] = useState("");
|
||||
const [walkPath, setWalkPath] = useState("");
|
||||
|
||||
function updateDatabase() {
|
||||
fetch("/api/v1/walk", {
|
||||
method: "POST",
|
||||
headers: {
|
||||
"Content-Type": "application/json",
|
||||
},
|
||||
body: JSON.stringify({
|
||||
token: token,
|
||||
root: walkPath,
|
||||
pattern: [".wav", ".mp3"],
|
||||
}),
|
||||
})
|
||||
.then((res) => res.json())
|
||||
.then((data) => {
|
||||
console.log(data);
|
||||
});
|
||||
}
|
||||
|
||||
return (
|
||||
<div className="page">
|
||||
<div>
|
||||
<h2>Manage</h2>
|
||||
<h3>Server status</h3>
|
||||
<table>
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Name</th>
|
||||
<th>Value</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody>
|
||||
<tr>
|
||||
<td>Server status</td>
|
||||
<td>
|
||||
<span className="status-ok">OK</span>
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Server uptime</td>
|
||||
<td>
|
||||
<span>1 day, 23 hours, 59 minutes and 59 seconds</span>
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Server load</td>
|
||||
<td>
|
||||
<span>0.00 / 0.00 / 0.00</span>
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Server memory usage</td>
|
||||
<td>
|
||||
<span>0.00 MB</span>
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Server disk usage</td>
|
||||
<td>
|
||||
<span>0.00 MB</span>
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Server uptime</td>
|
||||
<td>
|
||||
<span>1 day, 23 hours, 59 minutes and 59 seconds</span>
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Server load</td>
|
||||
<td>
|
||||
<span>0.00 / 0.00 / 0.00</span>
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Server memory usage</td>
|
||||
<td>
|
||||
<span>0.00 MB</span>
|
||||
</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Server disk usage</td>
|
||||
<td>
|
||||
<span>0.00 MB</span>
|
||||
</td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
<h3>Database operations</h3>
|
||||
<ul>
|
||||
<li>.mp3</li>
|
||||
<li>.flac</li>
|
||||
<li>.wav</li>
|
||||
<li>.ogg</li>
|
||||
<li>.aac</li>
|
||||
<li>.m4a</li>
|
||||
</ul>
|
||||
<input type="text" placeholder=".mp3" />
|
||||
<button>Add Pattern</button>
|
||||
<input type="text" placeholder="/path/to/root" />
|
||||
<button>Import</button>
|
||||
<h3>Ffmpeg Settings</h3>
|
||||
<ol>
|
||||
{getFfmpegConfigListRespondExample.ffmpeg_config_list.map(
|
||||
(item, index) => (
|
||||
<li>
|
||||
{item.name} {item.args}
|
||||
</li>
|
||||
)
|
||||
)}
|
||||
</ol>
|
||||
<span>
|
||||
<input type="text" placeholder="name" />
|
||||
<input type="text" placeholder="args" />
|
||||
<button>Add</button>
|
||||
</span>
|
||||
<input
|
||||
type="text"
|
||||
value={token}
|
||||
placeholder="token"
|
||||
onChange={(e) => setToken(e.target.value)}
|
||||
/>
|
||||
<input
|
||||
type="text"
|
||||
value={walkPath}
|
||||
placeholder="walk path"
|
||||
onChange={(e) => setWalkPath(e.target.value)}
|
||||
/>
|
||||
<button
|
||||
onClick={() => {
|
||||
updateDatabase();
|
||||
}}
|
||||
>
|
||||
Update Database
|
||||
</button>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
@@ -1,37 +0,0 @@
|
||||
import { useParams, useNavigate } from "react-router-dom";
|
||||
import ReviewEntry from "./ReviewEntry";
|
||||
import SearchFiles from "./SearchFiles";
|
||||
import getRandomFilesRespondExample from "../example-respond/get_random_files.json";
|
||||
|
||||
function Profile(props) {
|
||||
let params = useParams();
|
||||
let navigate = useNavigate();
|
||||
|
||||
return (
|
||||
<div>
|
||||
<h1>Profile of user {params.id}</h1>
|
||||
{props.user && props.user.id === parseInt(params.id) && (
|
||||
<button
|
||||
onClick={() => {
|
||||
props.setUser(null);
|
||||
navigate("/");
|
||||
}}
|
||||
>
|
||||
Logout
|
||||
</button>
|
||||
)}
|
||||
<p>
|
||||
Lorem ipsum dolor sit amet consectetur adipisicing elit. Quisquam
|
||||
doloremque, quidem quisquam, quisquam quisquam quisquam quisquam
|
||||
dignissimos.
|
||||
</p>
|
||||
<h3>Reviews</h3>
|
||||
<ReviewEntry />
|
||||
<ReviewEntry />
|
||||
<h3>Liked music</h3>
|
||||
<SearchFiles folder={{}} />
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
export default Profile;
|
||||
@@ -1,51 +0,0 @@
|
||||
import { useNavigate } from "react-router-dom";
|
||||
import { useState } from "react";
|
||||
|
||||
function Register(props) {
|
||||
let navigate = useNavigate();
|
||||
const [username, setUsername] = useState("");
|
||||
const [password, setPassword] = useState("");
|
||||
const [password2, setPassword2] = useState("");
|
||||
return (
|
||||
<div>
|
||||
<h1>Register</h1>
|
||||
<label htmlFor="username">Username:</label>
|
||||
<input
|
||||
type="text"
|
||||
id="username"
|
||||
value={username}
|
||||
onChange={(e) => setUsername(e.target.value)}
|
||||
/>
|
||||
<label htmlFor="password">Password:</label>
|
||||
<input
|
||||
type="password"
|
||||
id="password"
|
||||
value={password}
|
||||
onChange={(e) => setPassword(e.target.value)}
|
||||
/>
|
||||
<label htmlFor="password2">Confirm Password:</label>
|
||||
<input
|
||||
type="password"
|
||||
id="password2"
|
||||
value={password2}
|
||||
onChange={(e) => setPassword2(e.target.value)}
|
||||
/>
|
||||
<button
|
||||
onClick={() => {
|
||||
if (!username || !password || !password2) {
|
||||
alert("Please fill out all fields");
|
||||
} else if (password !== password2) {
|
||||
alert("Passwords do not match");
|
||||
} else {
|
||||
props.setUser({ id: 39, username: username, password: password });
|
||||
navigate("/");
|
||||
}
|
||||
}}
|
||||
>
|
||||
Register
|
||||
</button>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
export default Register;
|
||||
@@ -1,35 +0,0 @@
|
||||
import { useParams } from "react-router";
|
||||
import { Link } from "react-router-dom";
|
||||
import ReviewEntry from "./ReviewEntry";
|
||||
|
||||
function Review() {
|
||||
let params = useParams();
|
||||
|
||||
return (
|
||||
<div className="page">
|
||||
<h3>Review on music ID {params.id}</h3>
|
||||
<textarea
|
||||
className="review-text"
|
||||
placeholder="Write your review here"
|
||||
></textarea>
|
||||
<span>
|
||||
<button>Submit</button>
|
||||
<button>Add to fav</button>
|
||||
</span>
|
||||
<details open>
|
||||
<summary>Liked by</summary>
|
||||
<p>
|
||||
<Link to="/profile/1">@User 1</Link>
|
||||
<Link to="/profile/2">@User 2</Link>
|
||||
<Link to="/profile/3">@User 3</Link>
|
||||
<Link to="/profile/4">@User 4</Link>
|
||||
</p>
|
||||
</details>
|
||||
<ReviewEntry />
|
||||
<ReviewEntry />
|
||||
<ReviewEntry />
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
export default Review;
|
||||
@@ -1,18 +0,0 @@
|
||||
import { Link} from "react-router-dom";
|
||||
|
||||
function ReviewEntry() {
|
||||
return (
|
||||
<p>
|
||||
<h5>
|
||||
<Link to="/profile/2">@rin</Link> comment music ID 39 at
|
||||
2019-01-01 12:23:45
|
||||
</h5>
|
||||
Agree with <Link to="/profile/1">@hmsy</Link>. I also like how well the
|
||||
musician plays the guitar. They are all very good. They really make the
|
||||
song sound better. I like the way the bass plays and the way the guitar
|
||||
sounds. I like the way the drums sound.
|
||||
</p>
|
||||
);
|
||||
}
|
||||
|
||||
export default ReviewEntry;
|
||||
@@ -1,6 +1,5 @@
|
||||
import { useState, useEffect } from "react";
|
||||
import FilesTable from "./FilesTable";
|
||||
import searchFilesRespondExample from "../example-respond/search_files.json"
|
||||
|
||||
function SearchFiles(props) {
|
||||
const [files, setFiles] = useState([]);
|
||||
@@ -10,7 +9,27 @@ function SearchFiles(props) {
|
||||
const limit = 10;
|
||||
|
||||
function searchFiles() {
|
||||
setFiles(searchFilesRespondExample.files);
|
||||
setIsLoading(true);
|
||||
fetch("/api/v1/search_files", {
|
||||
method: "POST",
|
||||
headers: { "Content-Type": "application/json" },
|
||||
body: JSON.stringify({
|
||||
filename: filename,
|
||||
limit: limit,
|
||||
offset: offset,
|
||||
}),
|
||||
})
|
||||
.then((response) => response.json())
|
||||
.then((data) => {
|
||||
const files = data.files ? data.files : [];
|
||||
setFiles(files);
|
||||
})
|
||||
.catch((error) => {
|
||||
alert("search_files error: " + error);
|
||||
})
|
||||
.finally(() => {
|
||||
setIsLoading(false);
|
||||
});
|
||||
}
|
||||
|
||||
function nextPage() {
|
||||
@@ -25,13 +44,12 @@ function SearchFiles(props) {
|
||||
setOffset(offsetValue);
|
||||
}
|
||||
|
||||
useEffect(() => searchFiles(), [offset, props.folder]); // eslint-disable-line react-hooks/exhaustive-deps
|
||||
useEffect(() => searchFiles(), [offset]); // eslint-disable-line react-hooks/exhaustive-deps
|
||||
|
||||
return (
|
||||
<div className="page">
|
||||
<h3>Search Files</h3>
|
||||
<div className="search_toolbar">
|
||||
{!props.folder && (
|
||||
<input
|
||||
onChange={(event) => setFilename(event.target.value)}
|
||||
onKeyDown={(event) => {
|
||||
@@ -42,18 +60,13 @@ function SearchFiles(props) {
|
||||
type="text"
|
||||
placeholder="Enter filename"
|
||||
/>
|
||||
)}
|
||||
<button
|
||||
disabled={!!props.folder}
|
||||
onClick={() => {
|
||||
searchFiles();
|
||||
}}
|
||||
>
|
||||
{isLoading ? "Loading..." : "Search"}
|
||||
</button>
|
||||
{props.folder && props.folder.foldername && (
|
||||
<button onClick={searchFiles}>{props.folder.foldername}</button>
|
||||
)}
|
||||
<button onClick={lastPage}>Last page</button>
|
||||
<button disabled>
|
||||
{offset} - {offset + files.length}
|
||||
|
||||
@@ -1,19 +1,37 @@
|
||||
import { useEffect, useState } from "react";
|
||||
import { useParams } from "react-router";
|
||||
import FoldersTable from "./FoldersTable";
|
||||
import SearchFiles from "./SearchFiles";
|
||||
import searchFoldersRespondExample from "../example-respond/search_folders.json";
|
||||
|
||||
function SearchFolders(props) {
|
||||
function SearchFolders() {
|
||||
const [foldername, setFoldername] = useState("");
|
||||
const [folders, setFolders] = useState([]);
|
||||
const [folder, setFolder] = useState({});
|
||||
const [offset, setOffset] = useState(0);
|
||||
const [isLoading, setIsLoading] = useState(false);
|
||||
const limit = 10;
|
||||
|
||||
function searchFolder() {
|
||||
setFolders(searchFoldersRespondExample.folders);
|
||||
if (foldername === "") {
|
||||
return;
|
||||
}
|
||||
setIsLoading(true);
|
||||
fetch("/api/v1/search_folders", {
|
||||
method: "POST",
|
||||
headers: { "Content-Type": "application/json" },
|
||||
body: JSON.stringify({
|
||||
foldername: foldername,
|
||||
limit: limit,
|
||||
offset: offset,
|
||||
}),
|
||||
})
|
||||
.then((response) => response.json())
|
||||
.then((data) => {
|
||||
setFolders(data.folders ? data.folders : []);
|
||||
})
|
||||
.catch((error) => {
|
||||
alert("search_folders error: " + error);
|
||||
})
|
||||
.finally(() => {
|
||||
setIsLoading(false);
|
||||
});
|
||||
}
|
||||
|
||||
function nextPage() {
|
||||
@@ -28,17 +46,7 @@ function SearchFolders(props) {
|
||||
setOffset(offsetValue);
|
||||
}
|
||||
|
||||
function viewFolder(folder) {
|
||||
setFolder(folder);
|
||||
}
|
||||
|
||||
let params = useParams();
|
||||
useEffect(() => searchFolder(), [offset]); // eslint-disable-line react-hooks/exhaustive-deps
|
||||
useEffect(() => {
|
||||
if (params.id !== undefined) {
|
||||
setFolder({ id: parseInt(params.id) });
|
||||
}
|
||||
}, [params.id]);
|
||||
|
||||
return (
|
||||
<div className="page">
|
||||
@@ -63,8 +71,7 @@ function SearchFolders(props) {
|
||||
</button>
|
||||
<button onClick={nextPage}>Next page</button>
|
||||
</div>
|
||||
<FoldersTable viewFolder={viewFolder} folders={folders} />
|
||||
<SearchFiles setPlayingFile={props.setPlayingFile} folder={folder} />
|
||||
<FoldersTable folders={folders} />
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
@@ -1,13 +1,25 @@
|
||||
import { useEffect, useState } from "react";
|
||||
import { useParams } from "react-router";
|
||||
import FilesTable from "./FilesTable";
|
||||
import GetFileInfoRespondExample from "../example-respond/get_file_info.json";
|
||||
|
||||
function Share(props) {
|
||||
let params = useParams();
|
||||
const [file, setFile] = useState([]);
|
||||
useEffect(() => {
|
||||
setFile([GetFileInfoRespondExample]);
|
||||
fetch("/api/v1/get_file_info", {
|
||||
method: "POST",
|
||||
headers: { "Content-Type": "application/json" },
|
||||
body: JSON.stringify({
|
||||
id: parseInt(params.id),
|
||||
}),
|
||||
})
|
||||
.then((response) => response.json())
|
||||
.then((data) => {
|
||||
setFile([data]);
|
||||
})
|
||||
.catch((error) => {
|
||||
alert("get_file_info error: " + error);
|
||||
});
|
||||
}, [params]);
|
||||
return (
|
||||
<div className="page">
|
||||
|
||||
@@ -1,23 +0,0 @@
|
||||
import { useNavigate } from "react-router";
|
||||
|
||||
function User(props) {
|
||||
// props.user
|
||||
// props.setUser
|
||||
let navigate = useNavigate();
|
||||
return (
|
||||
<div
|
||||
className="avatar"
|
||||
onClick={() => {
|
||||
if (props.user) {
|
||||
navigate("/profile/" + props.user.id);
|
||||
} else {
|
||||
navigate("/login");
|
||||
}
|
||||
}}
|
||||
>
|
||||
{props.user ? props.user.username : "Login"}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
export default User;
|
@@ -1,12 +0,0 @@
{
  "ffmpeg_config_list": [
    { "name": "OPUS 128k", "args": "-c:a libopus -ab 128k" },
    { "name": "OPUS 96k", "args": "-c:a libopus -ab 96k" },
    { "name": "OPUS 256k", "args": "-c:a libopus -ab 256k" },
    { "name": "OPUS 320k", "args": "-c:a libopus -ab 320k" },
    { "name": "OPUS 512k", "args": "-c:a libopus -ab 512k" },
    { "name": "AAC 128k", "args": "-c:a aac -ab 128k" },
    { "name": "AAC 256k", "args": "-c:a aac -ab 256k" },
    { "name": "全损音质 32k", "args": "-c:a libopus -ab 32k" }
  ]
}
@@ -1,7 +0,0 @@
{
  "id": 9856,
  "folder_id": 898,
  "foldername": "[2021.05.12] TVアニメ「シャドーハウス」EDテーマ「ないない」/ReoNa [スペシャルエディション] [FLAC 96kHz/24bit]",
  "filename": "03. 生きてるだけでえらいよ.flac",
  "filesize": 122761032
}
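
The `/api/v1/get_file_info` contract used by `Share` above returns a payload of this shape; below is a minimal standalone sketch, where the `printFileInfo` helper and the MiB formatting are illustrative only.

```js
// Illustrative only: fetch a payload like the example above and log a
// human-readable size (122761032 bytes is roughly 117.1 MiB).
async function printFileInfo(id) {
  const response = await fetch("/api/v1/get_file_info", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ id: id }),
  });
  const info = await response.json();
  const mib = (info.filesize / 1024 / 1024).toFixed(1); // bytes -> MiB
  console.log(`${info.filename} (${mib} MiB)`);
}
```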
@@ -1,74 +0,0 @@
{
  "files": [
    {
      "id": 9727,
      "folder_id": 958,
      "foldername": "garnet (narry) — emerald [GARN-0002] (flac)",
      "filename": "06. narry — malachite.flac",
      "filesize": 28228112
    },
    {
      "id": 4785,
      "folder_id": 457,
      "foldername": "Winter (FLAC)",
      "filename": "08 - 恋.flac",
      "filesize": 33576086
    },
    {
      "id": 13943,
      "folder_id": 1368,
      "foldername": "[mikudb] 融合YELLOWS",
      "filename": "10. カレーライスのうた.mp3",
      "filesize": 2524925
    },
    {
      "id": 21743,
      "folder_id": 2207,
      "foldername": "Tsumanne\(^o^)/",
      "filename": "08.スーパートルコ行進曲 - オワタ\(^o^)/.mp3",
      "filesize": 7677985
    },
    {
      "id": 35918,
      "folder_id": 3758,
      "foldername": "2008 - Higurashi 2 - Ano hi, Ano Basho, Subete ni ' Arigatou",
      "filename": "06 - Thanks (Bashee Arenge Version).flac",
      "filesize": 39545977
    },
    {
      "id": 32394,
      "folder_id": 3341,
      "foldername": "[彩音 ~xi-on~] Eyes",
      "filename": "08 - 少女さとり ~ 3rd Eye.mp3",
      "filesize": 7538525
    },
    {
      "id": 16934,
      "folder_id": 1671,
      "foldername": "Explorism",
      "filename": "14.Please, My Producer (feat. SAK).mp3",
      "filesize": 6099717
    },
    {
      "id": 1381,
      "folder_id": 131,
      "foldername": "Garakuta Live (FLAC)",
      "filename": "(09) [しけもく] 愛染エピローグ.flac",
      "filesize": 25173829
    },
    {
      "id": 18066,
      "folder_id": 1791,
      "foldername": "Jailbreak from the Sunday Morning!!",
      "filename": "05. Good bye!! Melancory Sunday Morning!!.mp3",
      "filesize": 10197778
    },
    {
      "id": 41261,
      "folder_id": 4305,
      "foldername": "[KSLA-0124~6] Modification of Key Sounds Label [CD-FLAC]",
      "filename": "3-01. Trigger (MUZIK SERVANT vs Freezer Remix).flac",
      "filesize": 42772516
    }
  ]
}
@@ -1,25 +0,0 @@
{
  "files": [
    {
      "id": 26555,
      "folder_id": 2579,
      "foldername": "[mikudb] SEB presents SUPER HATSUNE BEAT Vol. 1",
      "filename": "02. White Letter (ゆうゆP Euro Arrange).mp3",
      "filesize": 4252006
    },
    {
      "id": 40891,
      "folder_id": 4121,
      "foldername": "初音ミクベスト ~memories~",
      "filename": "10.White letter.mp3",
      "filesize": 6723982
    },
    {
      "id": 43289,
      "folder_id": 4384,
      "foldername": "Hatsune Miku Best ~memories~",
      "filename": "10.White Letter.flac",
      "filesize": 20817678
    }
  ]
}
@@ -1,32 +0,0 @@
{
  "folders": [
    {
      "id": 2037,
      "foldername": "P∴Rhythmatiq — 七色リミックス [OSLA-0005] (flac+scans)"
    },
    {
      "id": 2130,
      "foldername": " P∴Rhythmatiq — P∴Rhythmatiq act:09 [PRTQ-0017] (flac+scans)"
    },
    {
      "id": 2176,
      "foldername": "P∴Rhythmatiq — P∴Rhythmatiq act:02 [PRTQ-0002] (flac)"
    },
    {
      "id": 2184,
      "foldername": "P∴Rhythmatiq — P∴Rhythmatiq act:04 [PRTQ-0005] (flac)"
    },
    {
      "id": 2190,
      "foldername": "P∴Rhythmatiq — P∴Rhythmatiq Rock!! [PQPC-0001] (flac)"
    },
    {
      "id": 2360,
      "foldername": "P∴Rhythmatiq — P∴Rhythmatiq act:11 [PRTQ-0025] (flac)"
    },
    { "id": 3443, "foldername": "P∴Rhythmatiq EXTRA" },
    { "id": 3444, "foldername": "P∴Rhythmatiq He:arts" },
    { "id": 3445, "foldername": "P∴Rhythmatiq Re act" },
    { "id": 3446, "foldername": "P∴Rhythmatiq Rock!!" }
  ]
}
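
A response of this shape follows from the request body built in `SearchFolders` earlier in this diff; here is a minimal standalone sketch, assuming the search term "P∴Rhythmatiq" (inferred from the sample data above).

```js
// Illustrative request that would return a folder list like the sample above.
fetch("/api/v1/search_folders", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ foldername: "P∴Rhythmatiq", limit: 10, offset: 0 }),
})
  .then((response) => response.json())
  .then((data) => console.log(data.folders.map((f) => f.foldername)));
```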