43 Commits

SHA1 Message Date
5b262e6651 fixed fatal linux error with listenaddr incorrectly duplicated 2019-03-22 16:22:48 -04:00
496e255884 updating dependencies, adding go mod for versioning 2019-03-21 22:09:58 -04:00
0560c2f3fe Fixed error msg about 64-bit port integer (did not actually cause fatal errors), tested with updated go bin 2018-12-27 15:24:07 -05:00
01a976ab2f updating logrus configurations 2018-12-08 21:58:29 -05:00
d6341c9844 finished core rewrite for stability, just needs extensive testing, still need queue rewrite 2018-12-07 19:50:48 -05:00
1fac8757d0 Pulled in latest version of libraries, added Socks5 config 2018-11-15 17:19:15 -05:00
aba7382113 Fixing Queue issues, start/stop torrent issues 2018-09-13 19:34:30 -04:00
a5e9b6745f Moving Queue and Ratio checks to cron (fixes failure to stop on ratio) 2018-09-10 15:18:30 -04:00
224e7892ef Updating torrent library to latest, fixing breaking changes to torrent library api 2018-09-01 20:24:03 -04:00
6e5ba2c755 Ready for new release with new engine, will start bugfixes 2018-06-08 18:19:51 -04:00
cbfcba4cbc Adding a few settings to webui, cleanup of unneeded func 2018-05-27 17:45:29 -04:00
d15bb9752a Finished Engine re-write, awaiting testing 2018-05-27 17:34:14 -04:00
35a5ac37eb Engine rewrite about 80% done, but a ton of bugs and a few new features to add, almost no testing done 2018-05-17 13:52:47 -04:00
4909429390 starting to redo the core to do queuing and better downloading 2018-04-18 22:41:32 -04:00
0fdc926cc4 adding ability to generate API keys 2018-04-07 13:07:47 -04:00
3280360d47 fixing sorting in webui 2018-04-05 16:30:44 -04:00
f69ec5b9f2 Changing permissions to walk the entire structure 2018-04-03 21:39:22 -04:00
aee3516682 fix folder permissions in donetorrentactions 2018-04-02 21:21:41 -04:00
a7881a14c7 fixing path issue with starting torrent 2018-03-30 20:05:18 -04:00
128ec774bd changing how the start API command works to start torrents 2018-03-27 15:39:02 -04:00
bc612bf5e4 removing symlink option only copy for now 2018-03-26 21:06:14 -04:00
3f1f9e7104 separate thread for torrent list 2018-03-25 23:07:22 -04:00
eeb6e102f1 fixing log not writing to file 2018-03-25 21:23:15 -04:00
0a0f0cd577 fixing notification issue, parallelizing startTorrent, verifying torrent after move 2018-03-25 09:34:32 -04:00
10399cc6e5 updating readme with new documentation link 2018-03-23 15:17:35 -04:00
9363649df0 Merge branch 'master' of https://github.com/deranjer/goTorrent 2018-03-22 22:48:36 -04:00
a804b401a7 Update README.md (Updating roadmap..) 2018-03-22 22:46:47 -04:00
3b2c392bdf Changed to force manual IP address entry 2018-03-20 21:54:38 -04:00
fa46ba6025 rewriting how file prio works, adding token generation to backend, minor fixes 2018-03-19 21:22:57 -04:00
a56a507ca2 some modal changes, adding memory leak to fix stop/drop issue 2018-03-05 22:48:16 -05:00
ca1ed925d3 a few js changes for react upgrades 2018-03-04 21:23:57 -05:00
34e5f5139a Completely updated React, fixed #11, (hopefully) 2018-03-04 19:11:49 -05:00
6e0afd6e2a Fixing some API issues, adding a few API responses 2018-03-01 15:31:11 -05:00
fb71ca9b4e Fixing some API calls to accept optional payload 2018-02-24 12:25:09 -05:00
4015a48454 Getting ready to release 0.3.0, changing to new documentation system 2018-02-20 22:11:11 -05:00
840a965877 Added Settings Webui (view only), rewrite of API, Fixes #14, Fixes #2, now Testing 2018-02-20 21:51:49 -05:00
d4966f597b Fixes #15, started separating Settings into their own package 2018-02-17 11:52:38 -05:00
ba0f076c66 cleaning up an issue with client config generation 2018-02-16 20:41:09 -05:00
3978be8a40 Adding ReverseProxy settings File 2018-02-15 22:55:47 -05:00
c5b86597cb File prio code added, API rewrite completed, some core features rewritten for clarity 2018-02-15 22:49:11 -05:00
b843cfc11b adding frontend authentication, starting file priority code 2018-02-10 09:53:02 -05:00
42f4ecc81b Reverse Proxy with SSL support, Generated client Configs, JWT client to server auth, closes #13 2018-02-07 21:42:35 -05:00
d6288f4aaa Reverse Proxy with SSL support, Generated client Configs, JWT client to server auth, closes #13 2018-02-07 21:41:00 -05:00
13284 changed files with 274116 additions and 519084 deletions

9
.gitignore vendored
View File

@@ -2,8 +2,10 @@ downloads/
downloading/
downloaded/
uploadedTorrents/
boltBrowser/
storage.db.lock
storage.db
storage.db.old
.torrent.bolt.db.lock
.torrent.bolt.db
.idea/torrent-project.iml
@@ -17,5 +19,10 @@ boltbrowser.win64.exe
logs/server.log
.goreleaser.yml
config.toml.backup
config.1.toml
config.toml.old
/public/static/js/kickwebsocket.js.backup
dist
/public/static/js/kickwebsocket-generated.js
clientAuth.txt
dist
debScripts/

6
.vscode/tasks.json vendored
View File

@@ -1,7 +1,7 @@
{
// See https://go.microsoft.com/fwlink/?LinkId=733558
// for the documentation about the tasks.json format
"version": "2.0.0",
"version": "2.0.0",
"tasks": [
{
"taskName": "Run Program",
@@ -12,9 +12,9 @@
]
},
{
"taskName": "Build GopherJS",
"taskName": "goReleaser Snapshot",
"type": "shell",
"command": "C:/Users/deranjer/go/bin/gopherjs.exe build C:/Users/deranjer/GoglandProjects/torrent-project/public/static/js/frontend-websocket.go",
"command": "C:/Users/deranjer/go/bin/goreleaser.exe -rm-dist -snapshot",
"problemMatcher": [
"$go"
]

3
Dockerfile Normal file
View File

@@ -0,0 +1,3 @@
FROM scratch
COPY goTorrent /
ENTRYPOINT [ "/goTorrent" ]

130
README.md
View File

@@ -38,6 +38,16 @@ Image of the frontend UI
- [X] Global Rate Limiting for Upload/Download Speed
- [X] Add torrents from watch folder (cron job every 5 minutes)
- [X] Authentication from client to server (done via JWT, will add functionality for 3rd party clients later)
- [X] Reverse Proxy Support with SSL upgrade added (with provided config for nginx)
- [X] Mostly generated client config from toml.config on first run
- [X] Ability to view TOML settings from WebUI (and perhaps change a few as well)
- [X] Ability to set priority for individual files (needs more testing!)
- [ ] Unit testing completed for a large portion of the package
@@ -45,131 +55,17 @@ Image of the frontend UI
- [ ] Put the "Move torrent after download" into own goroutine with checks so the WebUI doesn't freeze when moving torrent
- [ ] Ability to set priority for individual files (just added to anacrolix/torrent so coming soon, already added to my UI)
- [ ] Ability to view TOML settings from WebUI (and perhaps change a few as well)
- [ ] Authentication from client to server
- Late 2018
- [ ] Define the websocket API for users to write their own clients/extensions
- [X] Define the websocket API for users to write their own clients/extensions
- [ ] React-native Android app (I don't own any Mac products so there will be no iPhone version)
# Installation:
# Documentation
## Linux (tested on Debian)
You can watch a YouTube video of me setting it up:
<a href="http://www.youtube.com/watch?feature=player_embedded&v=G0gO_cm_Oks
" target="_blank"><img src="http://img.youtube.com/vi/G0gO_cm_Oks/0.jpg"
alt="goTorrent Alpha Setup Video" width="240" height="180" border="10" /></a>
### Configuring the backend
Download the latest release from the releases tab; it will be in tar.gz format.
Create a directory for goTorrent to run from:
sudo mkdir /opt/goTorrent
Put the tar.gz release into the folder, and extract it.
tar -zxvf goTorrent_release_64-git.tar.gz
You can then remove the tar.gz if you wish. You should have something similar to the following files:
drwxr-xr-x 5 root root 9 Jan 21 14:56 .
drwxr-xr-x 5 root root 5 Jan 21 14:54 ..
-rw-rw-rw- 1 root root 1086 Dec 1 01:42 LICENSE
-rw-rw-rw- 1 root root 69 Dec 1 01:01 README.md
-rw-rw-rw- 1 root root 4466 Jan 21 03:48 config.toml
drwxr-xr-x 3 root root 3 Jan 21 14:55 dist-specific-files
-rw-rw-rw- 1 root root 12503552 Jan 21 03:53 goTorrent
drwxr-xr-x 3 root root 3 Jan 21 14:55 public
drwxr-xr-x 2 root root 3 Jan 21 14:55 templates
The `config.toml` file contains all of the settings for the server part of the application. Most of the important settings are at the top of the file, so open it with your preferred text editor.
[serverConfig]
ServerPort = ":8000" #leave format as is it expects a string with colon
ServerAddr = "" #blank will bind to default IP address, usually fine to leave be
LogLevel = "Warn" # Options = Debug, Info, Warn, Error, Fatal, Panic
LogOutput = "file" #Options = file, stdout #file will print it to logs/server.log
SeedRatioStop = 1.50 #automatically stops the torrent after it reaches this seeding ratio
#Relative or absolute path accepted, the server will convert any relative path to an absolute path.
DefaultMoveFolder = 'downloaded' #default path that a finished torrent is symlinked to after completion. Torrents added via RSS will default here
TorrentWatchFolder = 'torrentUpload' #folder path that is watched for .torrent files and adds them automatically every 5 minutes
#Limits your upload and download speed globally, all are averages and not burst protected (usually burst on start).
#Low = ~.05MB/s, Medium = ~.5MB/s, High = ~1.5MB/s
UploadRateLimit = "Unlimited" #Options are "Low", "Medium", "High", "Unlimited" #Unlimited is default
DownloadRateLimit = "Unlimited"
[notifications]
PushBulletToken = "" #add your pushbullet api token here to notify of torrent completion to pushbullet
Usually you don't need to change anything in this file; goTorrent will bind to your default IP address. You can change the port if you wish.
Next, we need to make sure that the executable runs, so run the following:
chmod +x goTorrent
This will make the program executable.
### Connecting the Frontend to the Backend
We need to connect the React frontend to the Go backend; for this we only need to edit one JS file.
nano public/static/js/kickwebsocket.js
var ws = new WebSocket("ws://192.168.1.141:8000/websocket"); //creating websocket
Just change the IP address after `ws://` to your server's IP address, and change the port if you changed it in the `config.toml` file.
Then save that file and return to `/opt/goTorrent`.
Now we can test the server. For testing I recommend going into the `config.toml` file and changing the `LogOutput` to `stdout`, and the `LogLevel` to `Info`.
Then start the server:
./goTorrent
If you have `LogLevel` set to `Info`, you should see the confirmation that the client config has been generated.
You can then open your browser and connect to IP:Port (http) and you should see the main page. You will see an error for retrieving RSS feeds in stdout, but this is expected for first load.
You can press `F12` if using Chrome to open the console and click around the UI to see the logging available for the frontend.
### Running goTorrent as a Service
If you are on a Linux system that uses systemd, the `dist-specific-files/Linux-systemd/` folder contains a `goTorrent.service` file that can be used to set up goTorrent as a systemd service. A quick overview of what is needed (a sketch of such a unit file follows these steps):
1. Edit the systemd file to specify your specific implementation
2. Copy the file to your systemd folder, i.e. `/etc/systemd/system`
3. Enable the service `systemctl enable goTorrent.service`
4. If using a new user, create that user and assign permissions:
a. `useradd goTorrent`
b. `sudo chown -R goTorrent:goTorrent /opt/goTorrent`
c. If you want to test server: `su goTorrent` then run the executable
5. Set your `config.toml` file to the values you want.
6. Start your server: `systemctl start goTorrent`
7. Check for errors: `systemctl status goTorrent`. You can also check `logs/server.log`.
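For reference, a minimal unit file for the layout above generally looks like the sketch below. The shipped `goTorrent.service` in `dist-specific-files/Linux-systemd/` may differ, so treat the user and paths here as placeholders matching the steps above rather than the project's exact file.

```ini
[Unit]
Description=goTorrent torrent server
After=network.target

[Service]
Type=simple
User=goTorrent
WorkingDirectory=/opt/goTorrent
ExecStart=/opt/goTorrent/goTorrent
Restart=on-failure

[Install]
WantedBy=multi-user.target
```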
### Windows
Please see the Linux instructions as they are similar. For running goTorrent as a service I haven't tried any of the programs that claim to do that, but perhaps try [NSSM](http://nssm.cc/download)
All the documentation is available [here](https://deranjer.github.io/goTorrentDocs/)
# Special Thanks

View File

@@ -1,26 +1,46 @@
[serverConfig]
ServerPort = ":8000" #leave format as is it expects a string with colon
ServerAddr = "" #blank will bind to default IP address, usually fine to leave be
LogLevel = "Warn" # Options = Debug, Info, Warn, Error, Fatal, Panic
ServerPort = "8000" #Required to input as string
ServerAddr = "192.168.1.8" #Put in the IP address you want to bind to as string
LogLevel = "Debug" # Options = Debug, Info, Warn, Error, Fatal, Panic
LogOutput = "file" #Options = file, stdout #file will print it to logs/server.log
SeedRatioStop = 1.50 #automatically stops the torrent after it reaches this seeding ratio
#Relative or absolute path accepted, the server will convert any relative path to an absolute path.
DefaultMoveFolder = 'downloaded' #default path that a finished torrent is symlinked to after completion. Torrents added via RSS will default here
DefaultMoveFolder = 'downloads' #default path that a finished torrent is symlinked to after completion. Torrents added via RSS will default here
TorrentWatchFolder = 'torrentUpload' #folder path that is watched for .torrent files and adds them automatically every 5 minutes
#Limits your upload and download speed globally, all are averages and not burst protected (usually burst on start).
#Low = ~.05MB/s, Medium = ~.5MB/s, High = ~1.5MB/s
UploadRateLimit = "Unlimited" #Options are "Low", "Medium", "High", "Unlimited" #Unlimited is default
DownloadRateLimit = "Unlimited"
#Maximum number of allowed active torrents, the rest will be queued
MaxActiveTorrents = 5
[goTorrentWebUI]
#Basic goTorrentWebUI authentication (not terribly secure, implemented in JS, password is hashed to SHA256, not salted, basically don't depend on this if you require very good security)
WebUIAuth = false # bool, if false no authentication is required for the webUI
WebUIUser = "admin"
WebUIPassword = "Password1"
[notifications]
PushBulletToken = "" #add your pushbullet api token here to notify of torrent completion to pushbullet
[reverseProxy]
#This is for setting up goTorrent behind a reverse Proxy (with SSL, reverse proxy with no SSL will require editing the WSS connection to a WS connection manually)
ProxyEnabled = false #bool, either false or true
#URL is CASE SENSITIVE
BaseURL = "domain.com/subroute/" # MUST be in the format (if you have a subdomain, and must have trailing slash) "yoursubdomain.domain.org/subroute/"
[socksProxy]
SocksProxyEnabled = false #bool, either false or true
# Sets usage of Socks5 Proxy. Authentication should be included in the url if needed.
# Examples: socks5://demo:demo@192.168.99.100:1080
# http://proxy.domain.com:3128
SocksProxyURL = ""
[EncryptionPolicy]
@@ -28,7 +48,6 @@
ForceEncryption = false
PreferNoEncryption = true
[torrentClientConfig]
DownloadDir = 'downloading' #the full OR relative path where the torrent server stores in-progress torrents
@@ -37,6 +56,9 @@
# Never send chunks to peers.
NoUpload = false #boolean
#User-provided Client peer ID. If not present, one is generated automatically.
PeerID = "" #string
#The address to listen for new uTP and TCP bittorrent protocol connections. DHT shares a UDP socket with uTP unless configured otherwise.
ListenAddr = "" #Leave Blank for default, syntax "HOST:PORT"
@@ -48,9 +70,6 @@
# Don't create a DHT.
NoDHT = false #boolean
#User-provided Client peer ID. If not present, one is generated automatically.
PeerID = "" #string
#For the bittorrent protocol.
DisableUTP = false #bool

122
config.toml.bk Normal file
View File

@@ -0,0 +1,122 @@
[serverConfig]
ServerPort = ":8000" #leave format as is it expects a string with colon
ServerAddr = "192.168.1.8" #Put in the IP address you want to bind to
LogLevel = "Info" # Options = Debug, Info, Warn, Error, Fatal, Panic
LogOutput = "stdout" #Options = file, stdout #file will print it to logs/server.log
SeedRatioStop = 1.50 #automatically stops the torrent after it reaches this seeding ratio
#Relative or absolute path accepted, the server will convert any relative path to an absolute path.
DefaultMoveFolder = 'Z:\downloads' #default path that a finished torrent is symlinked to after completion. Torrents added via RSS will default here
TorrentWatchFolder = 'torrentUpload' #folder path that is watched for .torrent files and adds them automatically every 5 minutes
#Limits your upload and download speed globally, all are averages and not burst protected (usually burst on start).
#Low = ~.05MB/s, Medium = ~.5MB/s, High = ~1.5MB/s
UploadRateLimit = "Unlimited" #Options are "Low", "Medium", "High", "Unlimited" #Unlimited is default
DownloadRateLimit = "Unlimited"
[goTorrentWebUI]
#Basic goTorrentWebUI authentication (not terribly secure, implemented in JS, password is hashed to SHA256, not salted, basically don't depend on this if you require very good security)
WebUIAuth = false # bool, if false no authentication is required for the webUI
WebUIUser = "admin"
WebUIPassword = "Password1"
[notifications]
PushBulletToken = "o.8sUHemPkTCaty3u7KnyvEBN19EkeT63g" #add your pushbullet api token here to notify of torrent completion to pushbullet
[reverseProxy]
#This is for setting up goTorrent behind a reverse Proxy (with SSL, reverse proxy with no SSL will require editing the WSS connection to a WS connection manually)
ProxyEnabled = false #bool, either false or true
#URL is CASE SENSITIVE
BaseURL = "derajnet.duckdns.org/gopher/" # MUST be in the format (if you have a subdomain, and must have trailing slash) "yoursubdomain.domain.org/subroute/"
[EncryptionPolicy]
DisableEncryption = false
ForceEncryption = false
PreferNoEncryption = false
[torrentClientConfig]
DownloadDir = 'downloading' #the full OR relative path where the torrent server stores in-progress torrents
Seed = false #boolean #seed after download
# Never send chunks to peers.
NoUpload = false #boolean
#User-provided Client peer ID. If not present, one is generated automatically.
PeerID = "" #string
#The address to listen for new uTP and TCP bittorrent protocol connections. DHT shares a UDP socket with uTP unless configured otherwise.
ListenAddr = "" #Leave Blank for default, syntax "HOST:PORT"
#Don't announce to trackers. This only leaves DHT to discover peers.
DisableTrackers = false #boolean
DisablePEX = false # boolean
# Don't create a DHT.
NoDHT = false #boolean
#For the bittorrent protocol.
DisableUTP = false #bool
#For the bittorrent protocol.
DisableTCP = false #bool
#Called to instantiate storage for each added torrent. Builtin backends
# are in the storage package. If not set, the "file" implementation is used.
DefaultStorage = "storage.ClientImpl"
#encryption policy
IPBlocklist = "" #of type iplist.Ranger
DisableIPv6 = false #boolean
Debug = false #boolean
#HTTP *http.Client
HTTPUserAgent = "" # HTTPUserAgent changes default UserAgent for HTTP requests
ExtendedHandshakeClientVersion = ""
Bep20 = ""
# Overrides the default DHT configuration, see dhtServerConfig #advanced.. so be careful
DHTConfig = "" # default is "dht.ServerConfig"
[dhtServerConfig]
# Set NodeId Manually. Caller must ensure that if NodeId does not conform to DHT Security Extensions, that NoSecurity is also set.
NodeId = "" #[20]byte
Conn = "" # https:#godoc.org/net#PacketConn #not implemented
# Don't respond to queries from other nodes.
Passive = false # boolean
# the default addresses are "router.utorrent.com:6881","router.bittorrent.com:6881","dht.transmissionbt.com:6881","dht.aelitis.com:6881",
#https:#github.com/anacrolix/dht/blob/master/dht.go
StartingNodes = "dht.GlobalBootstrapAddrs"
#Disable the DHT security extension: http:#www.libtorrent.org/dht_sec.html.
NoSecurity = false
#Initial IP blocklist to use. Applied before serving and bootstrapping begins.
IPBlocklist = "" #of type iplist.Ranger
#Used to secure the server's ID. Defaults to the Conn's LocalAddr(). Set to the IP that remote nodes will see,
#as that IP is what they'll use to validate our ID.
PublicIP = "" #net.IP
#Hook received queries. Return true if you don't want to propagate to the default handlers.
OnQuery = "func(query *krpc.Msg, source net.Addr) (propagate bool)"
#Called when a peer successfully announces to us.
OnAnnouncePeer = "func(infoHash metainfo.Hash, peer Peer)"
#How long to wait before re-sending queries that haven't received a response. Defaults to a random value between 4.5 and 5.5s.
QueryResendDelay = "func() time.Duration"

View File

@@ -0,0 +1,12 @@
location ^~ /gotorrent/ {
proxy_pass http://192.168.1.100:8000/;
proxy_redirect http:// https://;
proxy_pass_header Server;
proxy_set_header Host $http_host;
proxy_set_header X-Real-IP $http_address;
proxy_set_header X-Scheme $scheme;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "Upgrade";
}

View File

@@ -5,21 +5,26 @@ import (
"github.com/anacrolix/torrent"
"github.com/anacrolix/torrent/metainfo"
Settings "github.com/deranjer/goTorrent/settings"
Storage "github.com/deranjer/goTorrent/storage"
)
//All the message types are first, first the server handling messages from the client
//Message contains the JSON messages from the client, we first unmarshal to get the messagetype, then pass it on to each module
type Message struct {
MessageType string
MessageDetail string `json:",omitempty"`
MessageDetailTwo string `json:",omitempty"`
MessageDetailThree string `json:",omitempty"`
Payload []string
MessageType string
Payload interface{}
}
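As an aside on how a type-tag-plus-generic-payload message like this is typically consumed: the reader unmarshals once to get `MessageType`, then each handler asserts the `Payload` shape it expects. A minimal sketch follows; the message type string, payload contents, and handler are illustrative assumptions, not goTorrent's actual websocket API.

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// Message mirrors the simplified struct above: a type tag plus a free-form payload.
type Message struct {
	MessageType string
	Payload     interface{}
}

func main() {
	// Example client frame; the type name and payload are made up for illustration.
	raw := []byte(`{"MessageType":"magnetLinkSubmit","Payload":["magnet:?xt=urn:btih:c12fe1c06bba254a9dc9f519b335aa7c1367a88a"]}`)

	var msg Message
	if err := json.Unmarshal(raw, &msg); err != nil {
		log.Fatal(err)
	}

	// Dispatch on MessageType, then assert the payload shape the handler expects.
	switch msg.MessageType {
	case "magnetLinkSubmit":
		links, ok := msg.Payload.([]interface{}) // JSON arrays decode into []interface{}
		if !ok {
			log.Fatal("unexpected payload shape")
		}
		fmt.Println("received", len(links), "magnet link(s)")
	default:
		fmt.Println("unhandled message type:", msg.MessageType)
	}
}
```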
//Next are the messages the server sends to the client
//AuthResponse is sent when the client fails to perform authentication correctly
type AuthResponse struct {
MessageType string
Payload string
}
//ServerPushMessage is information (usually logs and status messages) that the server pushes to the client
type ServerPushMessage struct {
MessageType string
@@ -40,8 +45,17 @@ type RSSFeedsNames struct {
RSSFeedURL string
}
//SingleRSSFeedMessage contains the torrents/name/etc of a single torrent feed
type SingleRSSFeedMessage struct { //TODO had issues with getting this to work with Storage or Engine
MessageType string
URL string //the URL of the individual RSS feed
Name string
TotalTorrents int
Torrents []Storage.SingleRSSTorrent //name of the torrents
}
//TorrentList struct contains the torrent list that is sent to the client
type TorrentList struct { //helps create the JSON structure that react expects to recieve
type TorrentList struct { //helps create the JSON structure that react expects to receive
MessageType string `json:"MessageType"`
Totaltorrents int `json:"total"`
ClientDBstruct []ClientDB `json:"data"`
@@ -56,48 +70,53 @@ type TorrentFileList struct {
//PeerFileList returns a slice of peers
type PeerFileList struct {
MessageType string `json:"MessageType"`
TotalPeers int `json:"TotalPeers"`
PeerList []torrent.Peer `json:"PeerList"`
MessageType string
TotalPeers int
PeerList []torrent.Peer
}
//TorrentFile describes a single file that a torrent client is downloading for a single torrent
type TorrentFile struct {
TorrentHashString string //Used to tie the file to a torrent //TODO not sure if neededs
FileName string
FilePath string
FileSize string
FilePercent string
FilePriority string
TorrentHashString string //Used to tie the file to a torrent //TODO not sure if needed
FileName string //The name of the file
FilePath string //The relative filepath to the file
FileSize string //Humanized file size display
FilePercent string //String value of percent of individual file percent done
FilePriority string //Currently "High", "Normal", or "Cancel"
}
type SettingsFile struct {
MessageType string
Config Settings.FullClientSettings
}
//ClientDB struct contains the struct that is used to compose the torrentlist
type ClientDB struct { //TODO maybe separate out the internal bits into another client struct
TorrentHashString string `json:"TorrentHashString"` //Passed to client for displaying hash and is used to uniquly identify all torrents
TorrentName string `json:"TorrentName"`
DownloadedSize string `json:"DownloadedSize"` //how much the client has downloaded total
Size string `json:"Size"` //total size of the torrent
DownloadSpeed string `json:"DownloadSpeed"` //the dl speed of the torrent
Status string `json:"Status"` //Passed to client for display
PercentDone string `json:"PercentDone"` //Passed to client to show percent done
ActivePeers string `json:"ActivePeers"` //passed to client
UploadSpeed string `json:"UploadSpeed"` //passed to client to show Uploadspeed
StoragePath string `json:"StoragePath"` //Passed to client (and stored in stormdb)
TorrentHashString string //Passed to client for displaying hash and is used to uniquely identify all torrents
TorrentName string //String of the name of the torrent
DownloadedSize string //how much the client has downloaded total
Size string //total size of the torrent
DownloadSpeed string //the dl speed of the torrent
Status string //Passed to client for display
PercentDone string //Passed to client to show percent done
ActivePeers string //passed to client
UploadSpeed string //passed to client to show Uploadspeed
StoragePath string //Passed to client (and stored in stormdb)
DateAdded string //Passed to client (and stored in stormdb)
ETA string `json:"ETA"` //Passed to client
ETA string //Passed to client
TorrentLabel string //Passed to client and stored in stormdb
SourceType string `json:"SourceType"` //Stores whether the torrent came from a torrent file or a magnet link
SourceType string //Stores whether the torrent came from a torrent file or a magnet link
KnownSwarm []torrent.Peer //Passed to client for Peer Tab
UploadRatio string //Passed to client, stores the string for uploadratio stored in stormdb
TotalUploadedSize string //Humanized version of TotalUploadedBytes to pass to the client
TotalUploadedBytes int64 //includes bytes that happened before reboot (from stormdb)
TotalUploadedBytes int64 `json:"-"` //includes bytes that happened before reboot (from stormdb)
downloadSpeedInt int64 //Internal used for calculating dl speed
BytesCompleted int64 //Internal used for calculating the dl speed
DataBytesWritten int64 //Internal used for calculating dl speed
DataBytesRead int64 //Internal used for calculating dl speed
UpdatedAt time.Time //Internal used for calculating speeds of upload and download
TorrentHash metainfo.Hash //Used to create string for TorrentHashString... not sure why I have it... make that a TODO I guess
NumberofFiles int
NumberofPieces int
MaxConnections int //Used to stop the torrent by limiting the max allowed connections
BytesCompleted int64 `json:"-"` //Internal used for calculating the dl speed
DataBytesWritten int64 `json:"-"` //Internal used for calculating dl speed
DataBytesRead int64 `json:"-"` //Internal used for calculating dl speed
UpdatedAt time.Time `json:"-"` //Internal used for calculating speeds of upload and download
TorrentHash metainfo.Hash `json:"-"` //Used to create string for TorrentHashString... not sure why I have it... make that a TODO I guess
NumberofFiles int //Number of files in the torrent
NumberofPieces int //Total number of pieces in the torrent (Not currently used)
MaxConnections int //Used to stop the torrent by limiting the max allowed connections
}
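A note on the `json:"-"` tags introduced above: they keep the internal bookkeeping fields out of the JSON pushed to the WebUI, while the exported fields still marshal under their own names. A tiny illustration with generic field names (not the project's struct):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// torrentRow is a stand-in struct: fields tagged `json:"-"` are skipped by encoding/json.
type torrentRow struct {
	TorrentName    string
	PercentDone    string
	BytesCompleted int64 `json:"-"` // internal only, never sent to the client
}

func main() {
	out, _ := json.Marshal(torrentRow{TorrentName: "example", PercentDone: "0.42", BytesCompleted: 1024})
	fmt.Println(string(out)) // {"TorrentName":"example","PercentDone":"0.42"}
}
```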

View File

@@ -7,6 +7,7 @@ import (
"github.com/anacrolix/torrent"
"github.com/asdine/storm"
Settings "github.com/deranjer/goTorrent/settings"
Storage "github.com/deranjer/goTorrent/storage"
"github.com/mmcdole/gofeed"
"github.com/robfig/cron"
@@ -21,7 +22,7 @@ func InitializeCronEngine() *cron.Cron {
}
//CheckTorrentWatchFolder adds torrents from a watch folder //TODO see if you can use filepath.Abs instead of changing directory
func CheckTorrentWatchFolder(c *cron.Cron, db *storm.DB, tclient *torrent.Client, torrentLocalStorage Storage.TorrentLocal, config FullClientSettings) {
func CheckTorrentWatchFolder(c *cron.Cron, db *storm.DB, tclient *torrent.Client, torrentLocalStorage Storage.TorrentLocal, config Settings.FullClientSettings, torrentQueues Storage.TorrentQueues) {
c.AddFunc("@every 5m", func() {
Logger.WithFields(logrus.Fields{"Watch Folder": config.TorrentWatchFolder}).Info("Running the watch folder cron job")
torrentFiles, err := ioutil.ReadDir(config.TorrentWatchFolder)
@@ -49,15 +50,59 @@ func CheckTorrentWatchFolder(c *cron.Cron, db *storm.DB, tclient *torrent.Client
os.Remove(fullFilePathAbs) //delete the torrent after adding it and copying it over
Logger.WithFields(logrus.Fields{"Source Folder": fullFilePathAbs, "Destination Folder": fullNewFilePathAbs, "Torrent": file.Name()}).Info("Added torrent from watch folder, and moved torrent file")
StartTorrent(clientTorrent, torrentLocalStorage, db, "file", fullNewFilePathAbs, config.DefaultMoveFolder, "default", config)
AddTorrent(clientTorrent, torrentLocalStorage, db, "file", fullNewFilePathAbs, config.DefaultMoveFolder, "default", config)
}
}
})
}
//CheckTorrentsCron runs a upload ratio check, a queue check (essentially anything that should not be frontend dependent)
func CheckTorrentsCron(c *cron.Cron, db *storm.DB, tclient *torrent.Client, config Settings.FullClientSettings) {
c.AddFunc("@every 30s", func() {
Logger.Debug("Running a torrent Ratio and Queue Check")
torrentLocalArray := Storage.FetchAllStoredTorrents(db)
torrentQueues := Storage.FetchQueues(db)
for _, singleTorrentFromStorage := range torrentLocalArray {
var singleTorrent *torrent.Torrent
for _, liveTorrent := range tclient.Torrents() { //matching the torrent from storage to the live torrent
if singleTorrentFromStorage.Hash == liveTorrent.InfoHash().String() {
singleTorrent = liveTorrent
}
}
calculatedCompletedSize := CalculateCompletedSize(singleTorrentFromStorage, singleTorrent)
bytesCompleted := CalculateCompletedSize(singleTorrentFromStorage, singleTorrent)
if float64(singleTorrentFromStorage.UploadedBytes)/float64(bytesCompleted) >= config.SeedRatioStop && singleTorrentFromStorage.TorrentUploadLimit == true { //If storage shows torrent stopped or if it is over the seeding ratio AND is under the global limit
Logger.WithFields(logrus.Fields{"Action: Stopping torrent due to seed Ratio": singleTorrentFromStorage.TorrentName}).Info()
StopTorrent(singleTorrent, singleTorrentFromStorage, db)
}
if len(torrentQueues.ActiveTorrents) < config.MaxActiveTorrents && singleTorrentFromStorage.TorrentStatus == "Queued" {
Logger.WithFields(logrus.Fields{"Action: Adding Torrent to Active Queue": singleTorrentFromStorage.TorrentName}).Info()
AddTorrentToActive(singleTorrentFromStorage, singleTorrent, db)
}
if (calculatedCompletedSize == singleTorrentFromStorage.TorrentSize) && (singleTorrentFromStorage.TorrentMoved == false) { //if we are done downloading and haven't moved torrent yet
Logger.WithFields(logrus.Fields{"singleTorrent": singleTorrentFromStorage.TorrentName}).Info("Torrent Completed, moving...")
tStorage := Storage.FetchTorrentFromStorage(db, singleTorrent.InfoHash().String()) //Todo... find a better way to do this in the go-routine currently just to make sure it doesn't trigger multiple times
tStorage.TorrentMoved = true
Storage.UpdateStorageTick(db, tStorage)
go func() { //moving torrent in separate go-routine then verifying that the data is still there and correct
err := MoveAndLeaveSymlink(config, singleTorrent.InfoHash().String(), db, false, "") //can take some time to move file so running this in another thread TODO make this a goroutine and skip this block if the routine is still running
if err != nil { //If we fail, print the error and attempt a retry
Logger.WithFields(logrus.Fields{"singleTorrent": singleTorrentFromStorage.TorrentName, "error": err}).Error("Failed to move Torrent!")
VerifyData(singleTorrent)
tStorage.TorrentMoved = false
Storage.UpdateStorageTick(db, tStorage)
}
}()
}
}
ValidateQueues(db, config, tclient) //Ensure we don't have too many in activeQueue
})
}
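For orientation, these cron jobs are registered once at startup. The sketch below shows one way that wiring might look, under stated assumptions: the import path, the zero-value config, and the main-package scaffolding are illustrative and are not the project's actual main.go (which also injects engine.Logger and other globals first).

```go
package main

import (
	"log"

	"github.com/anacrolix/torrent"
	"github.com/asdine/storm"

	"github.com/deranjer/goTorrent/engine" // assumed import path for this package
	Settings "github.com/deranjer/goTorrent/settings"
	Storage "github.com/deranjer/goTorrent/storage"
)

func main() {
	db, err := storm.Open("storage.db") // the stormdb file used throughout this diff
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	tclient, err := torrent.NewClient(nil) // nil -> default anacrolix/torrent client config
	if err != nil {
		log.Fatal(err)
	}
	defer tclient.Close()

	var config Settings.FullClientSettings       // normally parsed from config.toml
	var torrentLocalStorage Storage.TorrentLocal // template struct handed to the cron jobs
	torrentQueues := Storage.FetchQueues(db)

	c := engine.InitializeCronEngine()                                                         // starts the cron scheduler
	engine.CheckTorrentsCron(c, db, tclient, config)                                           // ratio + queue check every 30s
	engine.CheckTorrentWatchFolder(c, db, tclient, torrentLocalStorage, config, torrentQueues) // watch folder every 5m
	engine.RefreshRSSCron(c, db, tclient, torrentLocalStorage, config, torrentQueues)          // RSS refresh hourly

	select {} // block forever; the real server runs its HTTP/websocket loop here instead
}
```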
//RefreshRSSCron refreshes all of the RSS feeds on an hourly basis
func RefreshRSSCron(c *cron.Cron, db *storm.DB, tclient *torrent.Client, torrentLocalStorage Storage.TorrentLocal, config FullClientSettings) {
func RefreshRSSCron(c *cron.Cron, db *storm.DB, tclient *torrent.Client, torrentLocalStorage Storage.TorrentLocal, config Settings.FullClientSettings, torrentQueues Storage.TorrentQueues) {
c.AddFunc("@hourly", func() {
torrentHashHistory := Storage.FetchHashHistory(db)
RSSFeedStore := Storage.FetchRSSFeeds(db)
@@ -85,7 +130,7 @@ func RefreshRSSCron(c *cron.Cron, db *storm.DB, tclient *torrent.Client, torrent
Logger.WithFields(logrus.Fields{"err": err, "Torrent": RSSTorrent.Title}).Warn("Unable to add torrent to torrent client!")
break //break out of the loop entirely for this message since we hit an error
}
StartTorrent(clientTorrent, torrentLocalStorage, db, "magnet", "", config.DefaultMoveFolder, "RSS", config) //TODO let user specify torrent default storage location and let change on fly
AddTorrent(clientTorrent, torrentLocalStorage, db, "magnet", "", config.DefaultMoveFolder, "RSS", config) //TODO let user specify torrent default storage location and let change on fly
singleFeed.Torrents = append(singleFeed.Torrents, singleRSSTorrent)
}

View File

@@ -3,9 +3,9 @@ package engine
import (
"os"
"path/filepath"
"runtime"
"github.com/asdine/storm"
Settings "github.com/deranjer/goTorrent/settings"
Storage "github.com/deranjer/goTorrent/storage"
pushbullet "github.com/mitsuse/pushbullet-go"
"github.com/mitsuse/pushbullet-go/requests"
@@ -14,7 +14,7 @@ import (
)
//MoveAndLeaveSymlink takes the file from the default download dir and moves it to the user specified directory and then leaves a symlink behind.
func MoveAndLeaveSymlink(config FullClientSettings, tHash string, db *storm.DB, moveDone bool, oldPath string) { //moveDone and oldPath are for moving a completed torrent
func MoveAndLeaveSymlink(config Settings.FullClientSettings, tHash string, db *storm.DB, moveDone bool, oldPath string) error { //moveDone and oldPath are for moving a completed torrent
tStorage := Storage.FetchTorrentFromStorage(db, tHash)
Logger.WithFields(logrus.Fields{"Torrent Name": tStorage.TorrentName}).Info("Move and Create symlink started for torrent")
var oldFilePath string
@@ -24,6 +24,8 @@ func MoveAndLeaveSymlink(config FullClientSettings, tHash string, db *storm.DB,
oldFilePath, err = filepath.Abs(oldFilePathTemp)
if err != nil {
Logger.WithFields(logrus.Fields{"Torrent Name": tStorage.TorrentName, "Filepath": oldFilePath}).Error("Cannot create absolute file path!")
moveDone = false
return err
}
} else {
oldFilePathTemp := filepath.Join(config.TorrentConfig.DataDir, tStorage.TorrentName)
@@ -31,41 +33,57 @@ func MoveAndLeaveSymlink(config FullClientSettings, tHash string, db *storm.DB,
oldFilePath, err = filepath.Abs(oldFilePathTemp)
if err != nil {
Logger.WithFields(logrus.Fields{"Torrent Name": tStorage.TorrentName, "Filepath": oldFilePath}).Error("Cannot create absolute file path!")
moveDone = false
return err
}
}
newFilePathTemp := filepath.Join(tStorage.StoragePath, tStorage.TorrentName)
newFilePath, err := filepath.Abs(newFilePathTemp)
if err != nil {
Logger.WithFields(logrus.Fields{"Torrent Name": tStorage.TorrentName, "Filepath": newFilePath}).Error("Cannot create absolute file path for new file path!")
moveDone = false
return err
}
_, err = os.Stat(tStorage.StoragePath)
if os.IsNotExist(err) {
err := os.MkdirAll(tStorage.StoragePath, 0755)
err := os.MkdirAll(tStorage.StoragePath, 0777)
if err != nil {
Logger.WithFields(logrus.Fields{"New File Path": newFilePath, "error": err}).Error("Cannot create new directory")
moveDone = false
return err
}
}
oldFileInfo, err := os.Stat(oldFilePath)
if err != nil {
Logger.WithFields(logrus.Fields{"Old File info": oldFileInfo, "Old File Path": oldFilePath, "error": err}).Error("Cannot find the old file to copy/symlink!")
return
moveDone = false
return err
}
if oldFilePath != newFilePath {
newFilePathDir := filepath.Dir(newFilePath)
os.Mkdir(newFilePathDir, 0755)
os.Mkdir(newFilePathDir, 0777)
err := folderCopy.Copy(oldFilePath, newFilePath) //copy the folder to the new location
if err != nil {
Logger.WithFields(logrus.Fields{"Old File Path": oldFilePath, "New File Path": newFilePath, "error": err}).Error("Error Copying Folder!")
return err
}
os.Chmod(newFilePath, 0777)
if runtime.GOOS != "windows" { //TODO the windows symlink is broken on windows 10 creator edition, so on the other platforms create symlink (windows will copy) until Go1.11
err = filepath.Walk(newFilePath, func(path string, info os.FileInfo, err error) error { //Walking the file path to change the permissions
if err != nil {
Logger.WithFields(logrus.Fields{"file": path, "error": err}).Error("Potentially non-critical error, continuing..")
}
os.Chmod(path, 0777)
return nil
})
/* if runtime.GOOS != "windows" { //TODO the windows symlink is broken on windows 10 creator edition, so on the other platforms create symlink (windows will copy) until Go1.11
os.RemoveAll(oldFilePath)
err = os.Symlink(newFilePath, oldFilePath)
if err != nil {
Logger.WithFields(logrus.Fields{"Old File Path": oldFilePath, "New File Path": newFilePath, "error": err}).Error("Error creating symlink")
moveDone = false
return err
}
}
} */
if moveDone == false {
tStorage.TorrentMoved = true //TODO error handling instead of just saying torrent was moved when it was not
notifyUser(tStorage, config, db) //Only notify if we haven't moved yet, don't want to push notify user every time user uses change storage button
@@ -74,10 +92,10 @@ func MoveAndLeaveSymlink(config FullClientSettings, tHash string, db *storm.DB,
tStorage.StoragePath = filepath.Dir(newFilePath)
Storage.UpdateStorageTick(db, tStorage)
}
return nil
}
func notifyUser(tStorage Storage.TorrentLocal, config FullClientSettings, db *storm.DB) {
func notifyUser(tStorage Storage.TorrentLocal, config Settings.FullClientSettings, db *storm.DB) {
Logger.WithFields(logrus.Fields{"New File Path": tStorage.StoragePath, "Torrent Name": tStorage.TorrentName}).Info("Attempting to notify user..")
tStorage.TorrentMoved = true
//Storage.AddTorrentLocalStorage(db, tStorage) //Updating the fact that we moved the torrent

View File

@@ -1,46 +0,0 @@
package engine
import (
"testing"
"github.com/asdine/storm"
Storage "github.com/deranjer/goTorrent/storage"
)
func TestMoveAndLeaveSymlink(t *testing.T) {
type args struct {
config FullClientSettings
tStorage Storage.TorrentLocal
db *storm.DB
}
tests := []struct {
name string
args args
}{
// TODO: Add test cases.
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
MoveAndLeaveSymlink(tt.args.config, tt.args.tStorage, tt.args.db)
})
}
}
func Test_notifyUser(t *testing.T) {
type args struct {
tStorage Storage.TorrentLocal
config FullClientSettings
db *storm.DB
}
tests := []struct {
name string
args args
}{
// TODO: Add test cases.
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
notifyUser(tt.args.tStorage, tt.args.config, tt.args.db)
})
}
}

View File

@@ -11,6 +11,7 @@ import (
"github.com/anacrolix/torrent"
"github.com/anacrolix/torrent/metainfo"
"github.com/asdine/storm"
Settings "github.com/deranjer/goTorrent/settings"
Storage "github.com/deranjer/goTorrent/storage"
"github.com/gorilla/websocket"
"github.com/mmcdole/gofeed"
@@ -20,6 +21,9 @@ import (
//Logger is the injected variable for global logger
var Logger *logrus.Logger
//Config is the injected variable for the torrent config
var Config Settings.FullClientSettings
//Conn is the injected variable for the websocket connection
var Conn *websocket.Conn
@@ -84,7 +88,6 @@ func timeOutInfo(clientTorrent *torrent.Torrent, seconds time.Duration) (deleted
select {
case <-clientTorrent.GotInfo(): //attempting to retrieve info for torrent
Logger.WithFields(logrus.Fields{"clientTorrentName": clientTorrent.Name()}).Debug("Received torrent info for torrent")
clientTorrent.DownloadAll()
return false
case <-timeout: // getting info for torrent has timed out so purging the torrent
Logger.WithFields(logrus.Fields{"clientTorrentName": clientTorrent.Name()}).Error("Forced to drop torrent from timeout waiting for info")
@@ -126,18 +129,19 @@ func readTorrentFileFromDB(element *Storage.TorrentLocal, tclient *torrent.Clien
return singleTorrent, nil
}
//StartTorrent creates the storage.db entry and starts A NEW TORRENT and adds to the running torrent array
func StartTorrent(clientTorrent *torrent.Torrent, torrentLocalStorage Storage.TorrentLocal, torrentDbStorage *storm.DB, torrentType, torrentFilePathAbs, torrentStoragePath, labelValue string, config FullClientSettings) {
//AddTorrent creates the storage.db entry and starts A NEW TORRENT and adds to the running torrent array
func AddTorrent(clientTorrent *torrent.Torrent, torrentLocalStorage Storage.TorrentLocal, db *storm.DB, torrentType, torrentFilePathAbs, torrentStoragePath, labelValue string, config Settings.FullClientSettings) {
timedOut := timeOutInfo(clientTorrent, 45) //seeing if adding the torrent times out (giving 45 seconds)
if timedOut { //if we fail to add the torrent return
return
}
var TempHash metainfo.Hash
TempHash = clientTorrent.InfoHash()
allStoredTorrents := Storage.FetchAllStoredTorrents(torrentDbStorage)
fmt.Println("GOT INFOHASH", TempHash.String())
allStoredTorrents := Storage.FetchAllStoredTorrents(db)
for _, runningTorrentHashes := range allStoredTorrents {
if runningTorrentHashes.Hash == TempHash.String() {
Logger.WithFields(logrus.Fields{"Hash": TempHash.String()}).Error("Torrent has duplicate hash to already running torrent... will not add to storage")
Logger.WithFields(logrus.Fields{"Hash": TempHash.String()}).Info("Torrent has duplicate hash to already running torrent... will not add to storage")
return
}
}
@@ -161,45 +165,39 @@ func StartTorrent(clientTorrent *torrent.Torrent, torrentLocalStorage Storage.To
}
torrentLocalStorage.TorrentFile = torrentfile //storing the entire file in to database
}
Logger.WithFields(logrus.Fields{"Storage Path": torrentStoragePath, "Torrent Name": clientTorrent.Name()}).Info("Adding Torrent with following storage path")
Logger.WithFields(logrus.Fields{"Storage Path": torrentStoragePath, "Torrent Name": clientTorrent.Name()}).Info("Adding Torrent with following storage path, to active Queue")
torrentFiles := clientTorrent.Files() //storing all of the files in the database along with the priority
var TorrentFilePriorityArray = []Storage.TorrentFilePriority{}
for _, singleFile := range torrentFiles { //creating the database setup for the file array
var torrentFilePriority = Storage.TorrentFilePriority{}
torrentFilePriority.TorrentFilePath = singleFile.DisplayPath()
torrentFilePriority.TorrentFilePriority = "Normal"
torrentFilePriority.TorrentFileSize = singleFile.Length()
TorrentFilePriorityArray = append(TorrentFilePriorityArray, torrentFilePriority)
}
torrentLocalStorage.TorrentFilePriority = TorrentFilePriorityArray
Storage.AddTorrentLocalStorage(torrentDbStorage, torrentLocalStorage) //writing all of the data to the database
clientTorrent.DownloadAll() //starting the download
CreateServerPushMessage(ServerPushMessage{MessageType: "serverPushMessage", MessageLevel: "success", Payload: "Torrent added!"}, Conn)
//torrentQueues := Storage.FetchQueues(db)
AddTorrentToActive(&torrentLocalStorage, clientTorrent, db)
Storage.AddTorrentLocalStorage(db, torrentLocalStorage) //writing all of the data to the database
}
//CreateRunningTorrentArray creates the entire torrent list to pass to client
func CreateRunningTorrentArray(tclient *torrent.Client, TorrentLocalArray []*Storage.TorrentLocal, PreviousTorrentArray []ClientDB, config FullClientSettings, db *storm.DB) (RunningTorrentArray []ClientDB) {
//CreateInitialTorrentArray adds all the torrents on program start from the database
func CreateInitialTorrentArray(tclient *torrent.Client, TorrentLocalArray []*Storage.TorrentLocal, db *storm.DB, config Settings.FullClientSettings) {
for _, singleTorrentFromStorage := range TorrentLocalArray {
var singleTorrent *torrent.Torrent
var TempHash metainfo.Hash
tickUpdateStruct := Storage.TorrentLocal{} //we are shoving the tick updates into a torrentlocal struct to pass to storage happens at the end of the routine
fullClientDB := new(ClientDB)
//singleTorrentStorageInfo := Storage.FetchTorrentFromStorage(db, TempHash.String()) //pulling the single torrent info from storage ()
var err error
if singleTorrentFromStorage.TorrentType == "file" { //if it is a file pull it from the uploaded torrent folder
var err error
singleTorrent, err = readTorrentFileFromDB(singleTorrentFromStorage, tclient, db)
if err != nil {
continue
}
fullClientDB.SourceType = "Torrent File"
} else {
singleTorrentFromStorageMagnet := "magnet:?xt=urn:btih:" + singleTorrentFromStorage.Hash //For magnet links just need to prepend the magnet part to the hash to readd
singleTorrent, _ = tclient.AddMagnet(singleTorrentFromStorageMagnet)
fullClientDB.SourceType = "Magnet Link"
singleTorrent, err = tclient.AddMagnet(singleTorrentFromStorageMagnet)
if err != nil {
continue
}
}
if len(singleTorrentFromStorage.InfoBytes) == 0 { //TODO.. kind of a fringe scenario.. not sure if needed since the db should always have the infobytes
timeOut := timeOutInfo(singleTorrent, 45)
@@ -210,61 +208,150 @@ func CreateRunningTorrentArray(tclient *torrent.Client, TorrentLocalArray []*Sto
singleTorrentFromStorage.InfoBytes = singleTorrent.Metainfo().InfoBytes
}
err := singleTorrent.SetInfoBytes(singleTorrentFromStorage.InfoBytes) //setting the infobytes back into the torrent
err = singleTorrent.SetInfoBytes(singleTorrentFromStorage.InfoBytes) //setting the infobytes back into the torrent
if err != nil {
Logger.WithFields(logrus.Fields{"torrentFile": singleTorrent.Name(), "error": err}).Error("Unable to add infobytes to the torrent!")
}
//Logger.WithFields(logrus.Fields{"singleTorrent": singleTorrentFromStorage.TorrentName}).Info("Generating infohash")
TempHash = singleTorrent.InfoHash()
if (singleTorrent.BytesCompleted() == singleTorrentFromStorage.TorrentSize) && (singleTorrentFromStorage.TorrentMoved == false) { //if we are done downloading and haven't moved torrent yet
Logger.WithFields(logrus.Fields{"singleTorrent": singleTorrentFromStorage.TorrentName}).Info("Torrent Completed, moving...")
MoveAndLeaveSymlink(config, singleTorrent.InfoHash().String(), db, false, "") //can take some time to move file so running this in another thread TODO make this a goroutine and skip this block if the routine is still running
torrentQueues := Storage.FetchQueues(db)
if singleTorrentFromStorage.TorrentStatus == "Stopped" {
singleTorrent.SetMaxEstablishedConns(0)
continue
}
if singleTorrentFromStorage.TorrentStatus == "ForceStart" {
AddTorrentToForceStart(singleTorrentFromStorage, singleTorrent, db)
}
if len(torrentQueues.ActiveTorrents) == 0 && len(torrentQueues.QueuedTorrents) == 0 { // If empty, run through all the torrents and assign them
if len(torrentQueues.ActiveTorrents) < Config.MaxActiveTorrents {
if singleTorrentFromStorage.TorrentStatus == "Completed" || singleTorrentFromStorage.TorrentStatus == "Seeding" {
Logger.WithFields(logrus.Fields{"Torrent Name": singleTorrentFromStorage.TorrentName}).Info("Completed Torrents have lower priority, adding to Queued")
AddTorrentToQueue(singleTorrentFromStorage, singleTorrent, db)
} else {
Logger.WithFields(logrus.Fields{"Torrent Name": singleTorrentFromStorage.TorrentName}).Info("Adding Torrent to Active Queue (Initial Torrent Load)")
AddTorrentToActive(singleTorrentFromStorage, singleTorrent, db)
}
} else {
Logger.WithFields(logrus.Fields{"Torrent Name": singleTorrentFromStorage.TorrentName}).Info("Last resort for torrent, adding to Queued")
AddTorrentToQueue(singleTorrentFromStorage, singleTorrent, db)
}
} else { //If we already have a queue set up then assign torrents to queue
if singleTorrentFromStorage.TorrentStatus == "Queued" {
AddTorrentToQueue(singleTorrentFromStorage, singleTorrent, db)
} else {
if len(torrentQueues.ActiveTorrents) < Config.MaxActiveTorrents {
Logger.WithFields(logrus.Fields{"Torrent Name": singleTorrentFromStorage.TorrentName}).Info("Adding Torrent to Active Queue (Initial Torrent Load Second)")
AddTorrentToActive(singleTorrentFromStorage, singleTorrent, db)
} else {
AddTorrentToQueue(singleTorrentFromStorage, singleTorrent, db)
}
}
RemoveDuplicatesFromQueues(db)
}
Storage.UpdateStorageTick(db, *singleTorrentFromStorage)
}
torrentQueues := Storage.FetchQueues(db)
if len(torrentQueues.ActiveTorrents) < config.MaxActiveTorrents && len(torrentQueues.QueuedTorrents) > 0 { //after all the torrents are added, see if out active torrent list isn't full, then add from the queue
Logger.WithFields(logrus.Fields{"Max Active: ": config.MaxActiveTorrents, "Current : ": torrentQueues.ActiveTorrents}).Info("Adding Torrents from queue to active to fill...")
maxCanSend := config.MaxActiveTorrents - len(torrentQueues.ActiveTorrents)
if maxCanSend > len(torrentQueues.QueuedTorrents) {
maxCanSend = len(torrentQueues.QueuedTorrents)
}
torrentsToStart := make([]string, maxCanSend)
copy(torrentsToStart, torrentQueues.QueuedTorrents[len(torrentsToStart)-1:])
for _, torrentStart := range torrentsToStart {
for _, singleTorrent := range tclient.Torrents() {
if singleTorrent.InfoHash().String() == torrentStart {
singleTorrentFromStorage := Storage.FetchTorrentFromStorage(db, torrentStart)
AddTorrentToActive(&singleTorrentFromStorage, singleTorrent, db)
}
}
}
}
SetFilePriority(tclient, db) //Setting the desired file priority from storage
Logger.WithFields(logrus.Fields{"Max Active: ": config.MaxActiveTorrents, "Current : ": torrentQueues.ActiveTorrents}).Debug("Queue after all initial torrents have been added")
}
//CreateRunningTorrentArray creates the entire torrent list to pass to client
func CreateRunningTorrentArray(tclient *torrent.Client, TorrentLocalArray []*Storage.TorrentLocal, PreviousTorrentArray []ClientDB, config Settings.FullClientSettings, db *storm.DB) (RunningTorrentArray []ClientDB) {
torrentQueues := Storage.FetchQueues(db)
Logger.WithFields(logrus.Fields{"Max Active: ": config.MaxActiveTorrents, "TorrentQueues": torrentQueues}).Debug("Current TorrentQueues")
for _, singleTorrentFromStorage := range TorrentLocalArray {
torrentQueues := Storage.FetchQueues(db)
var singleTorrent *torrent.Torrent
for _, liveTorrent := range tclient.Torrents() { //matching the torrent from storage to the live torrent
if singleTorrentFromStorage.Hash == liveTorrent.InfoHash().String() {
singleTorrent = liveTorrent
}
}
tickUpdateStruct := Storage.TorrentLocal{} //we are shoving the tick updates into a torrentlocal struct to pass to storage happens at the end of the routine
fullClientDB := new(ClientDB)
//Handling deleted torrents here
if singleTorrentFromStorage.TorrentStatus == "Dropped" {
Logger.WithFields(logrus.Fields{"selection": singleTorrentFromStorage.TorrentName}).Info("Deleting just the torrent")
DeleteTorrentFromQueues(singleTorrentFromStorage.Hash, db)
singleTorrent.Drop()
Storage.DelTorrentLocalStorage(db, singleTorrentFromStorage.Hash)
}
if singleTorrentFromStorage.TorrentStatus == "DroppedData" {
Logger.WithFields(logrus.Fields{"selection": singleTorrentFromStorage.TorrentName}).Info("Deleting torrent and data")
singleTorrent.Drop()
DeleteTorrentFromQueues(singleTorrentFromStorage.Hash, db)
Storage.DelTorrentLocalStorageAndFiles(db, singleTorrentFromStorage.Hash, Config.TorrentConfig.DataDir)
}
if singleTorrentFromStorage.TorrentType == "file" { //if it is a file pull it from the uploaded torrent folder
fullClientDB.SourceType = "Torrent File"
} else {
fullClientDB.SourceType = "Magnet Link"
}
var TempHash metainfo.Hash
TempHash = singleTorrent.InfoHash()
calculatedTotalSize := CalculateDownloadSize(singleTorrentFromStorage, singleTorrent)
calculatedCompletedSize := CalculateCompletedSize(singleTorrentFromStorage, singleTorrent)
fullStruct := singleTorrent.Stats()
activePeersString := strconv.Itoa(fullStruct.ActivePeers) //converting to strings
totalPeersString := fmt.Sprintf("%v", fullStruct.TotalPeers)
fullClientDB.StoragePath = singleTorrentFromStorage.StoragePath
downloadedSizeHumanized := HumanizeBytes(float32(singleTorrent.BytesCompleted())) //convert size to GB if needed
totalSizeHumanized := HumanizeBytes(float32(singleTorrentFromStorage.TorrentSize))
downloadedSizeHumanized := HumanizeBytes(float32(calculatedCompletedSize)) //convert size to GB if needed
totalSizeHumanized := HumanizeBytes(float32(calculatedTotalSize))
fullClientDB.DownloadedSize = downloadedSizeHumanized
fullClientDB.Size = totalSizeHumanized
PercentDone := fmt.Sprintf("%.2f", float32(singleTorrent.BytesCompleted())/float32(singleTorrentFromStorage.TorrentSize))
PercentDone := fmt.Sprintf("%.2f", float32(calculatedCompletedSize)/float32(calculatedTotalSize))
fullClientDB.TorrentHash = TempHash
fullClientDB.PercentDone = PercentDone
fullClientDB.DataBytesRead = fullStruct.ConnStats.BytesReadData //used for calculations not passed to client calculating up/down speed
fullClientDB.DataBytesWritten = fullStruct.ConnStats.BytesWrittenData //used for calculations not passed to client calculating up/down speed
fullClientDB.DataBytesRead = fullStruct.ConnStats.BytesReadData.Int64() //used for calculations not passed to client calculating up/down speed
fullClientDB.DataBytesWritten = fullStruct.ConnStats.BytesWrittenData.Int64() //used for calculations not passed to client calculating up/down speed
fullClientDB.ActivePeers = activePeersString + " / (" + totalPeersString + ")"
fullClientDB.TorrentHashString = TempHash.String()
fullClientDB.TorrentName = singleTorrentFromStorage.TorrentName
fullClientDB.DateAdded = singleTorrentFromStorage.DateAdded
fullClientDB.TorrentLabel = singleTorrentFromStorage.Label
fullClientDB.BytesCompleted = singleTorrent.BytesCompleted()
fullClientDB.BytesCompleted = calculatedCompletedSize
fullClientDB.NumberofFiles = len(singleTorrent.Files())
if len(PreviousTorrentArray) > 0 { //if we actually have a previous array //ranging over the previous torrent array to calculate the speed for each torrent
for _, previousElement := range PreviousTorrentArray {
TempHash := singleTorrent.InfoHash()
if previousElement.TorrentHashString == TempHash.String() { //matching previous to new
CalculateTorrentSpeed(singleTorrent, fullClientDB, previousElement)
fullClientDB.TotalUploadedBytes = singleTorrentFromStorage.UploadedBytes + (fullStruct.ConnStats.BytesWrittenData - previousElement.DataBytesWritten)
CalculateTorrentSpeed(singleTorrent, fullClientDB, previousElement, calculatedCompletedSize)
fullClientDB.TotalUploadedBytes = singleTorrentFromStorage.UploadedBytes + (fullStruct.ConnStats.BytesWrittenData.Int64() - previousElement.DataBytesWritten)
}
}
}
CalculateTorrentETA(singleTorrentFromStorage.TorrentSize, singleTorrent.BytesCompleted(), fullClientDB) //needs to be here since we need the speed calculated before we can estimate the eta.
CalculateTorrentETA(singleTorrentFromStorage.TorrentSize, calculatedCompletedSize, fullClientDB) //needs to be here since we need the speed calculated before we can estimate the eta.
fullClientDB.TotalUploadedSize = HumanizeBytes(float32(fullClientDB.TotalUploadedBytes))
fullClientDB.UploadRatio = CalculateUploadRatio(singleTorrent, fullClientDB) //calculate the upload ratio
CalculateTorrentStatus(singleTorrent, fullClientDB, config, singleTorrentFromStorage)
CalculateTorrentStatus(singleTorrent, fullClientDB, config, singleTorrentFromStorage, calculatedCompletedSize, calculatedTotalSize, torrentQueues, db) //add torrents to the queue, remove from queue, etc
tickUpdateStruct.UploadRatio = fullClientDB.UploadRatio
tickUpdateStruct.TorrentSize = calculatedTotalSize
tickUpdateStruct.UploadedBytes = fullClientDB.TotalUploadedBytes
tickUpdateStruct.TorrentStatus = fullClientDB.Status
tickUpdateStruct.Hash = fullClientDB.TorrentHashString //needed for index
Storage.UpdateStorageTick(db, tickUpdateStruct)
RunningTorrentArray = append(RunningTorrentArray, *fullClientDB)
@@ -273,7 +360,7 @@ func CreateRunningTorrentArray(tclient *torrent.Client, TorrentLocalArray []*Sto
}
//CreateFileListArray creates a file list for a single torrent that is selected and sent to the server
func CreateFileListArray(tclient *torrent.Client, selectedHash string, db *storm.DB, config FullClientSettings) TorrentFileList {
func CreateFileListArray(tclient *torrent.Client, selectedHash string, db *storm.DB, config Settings.FullClientSettings) TorrentFileList {
runningTorrents := tclient.Torrents() //don't need running torrent array since we aren't adding or deleting from storage
torrentFileListStorage := Storage.FetchTorrentFromStorage(db, selectedHash)
TorrentFileListSelected := TorrentFileList{}

View File

@@ -7,6 +7,8 @@ import (
"time"
"github.com/anacrolix/torrent"
"github.com/asdine/storm"
Settings "github.com/deranjer/goTorrent/settings"
"github.com/deranjer/goTorrent/storage"
Storage "github.com/deranjer/goTorrent/storage"
"github.com/sirupsen/logrus"
@@ -21,6 +23,20 @@ func secondsToMinutes(inSeconds int64) string {
return str
}
//VerifyData just verifies the data of a torrent by hash
func VerifyData(singleTorrent *torrent.Torrent) {
singleTorrent.VerifyData()
}
//MakeRange creates a range of pieces to set their priority based on a file
func MakeRange(min, max int) []int {
a := make([]int, max-min+1)
for i := range a {
a[i] = min + i
}
return a
}
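A minimal stand-alone sketch of MakeRange's inclusive behavior (the helper is mirrored locally so the snippet runs on its own; names are local to the example):
package main

import "fmt"

// makeRange mirrors MakeRange above: an inclusive [min, max] slice of piece indices.
func makeRange(min, max int) []int {
    a := make([]int, max-min+1)
    for i := range a {
        a[i] = min + i
    }
    return a
}

func main() {
    fmt.Println(makeRange(3, 6)) // [3 4 5 6] -- both endpoints are included
}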
//HumanizeBytes returns a human-readable version of a byte count (KB, MB, or GB)
func HumanizeBytes(bytes float32) string {
if bytes < 1000000 { //if we have less than 1MB in bytes convert to KB
@@ -40,7 +56,7 @@ func HumanizeBytes(bytes float32) string {
}
//CopyFile takes a source file string and a destination file string and copies the file
func CopyFile(srcFile string, destFile string) {
func CopyFile(srcFile string, destFile string) { //TODO move this to our imported copy repo
fileContents, err := os.Open(srcFile)
defer fileContents.Close()
if err != nil {
@@ -58,19 +74,46 @@ func CopyFile(srcFile string, destFile string) {
}
//SetFilePriority sets the priorities for all of the files in all of the torrents
func SetFilePriority(t *torrent.Client, db *storm.DB) {
storedTorrents := Storage.FetchAllStoredTorrents(db)
for _, singleTorrent := range t.Torrents() {
for _, storedTorrent := range storedTorrents {
if storedTorrent.Hash == singleTorrent.InfoHash().String() {
for _, file := range singleTorrent.Files() {
for _, storedFile := range storedTorrent.TorrentFilePriority {
if storedFile.TorrentFilePath == file.DisplayPath() {
switch storedFile.TorrentFilePriority {
case "High":
file.SetPriority(torrent.PiecePriorityHigh)
case "Normal":
file.SetPriority(torrent.PiecePriorityNormal)
case "Cancel":
file.SetPriority(torrent.PiecePriorityNone)
default:
file.SetPriority(torrent.PiecePriorityNormal)
}
}
}
}
}
}
}
}
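A tiny stand-alone sketch of the string-to-priority mapping applied in SetFilePriority (and again in the queue functions further down), using a local stand-in type so it runs without the torrent library; in the real code the cases map to torrent.PiecePriorityHigh, torrent.PiecePriorityNormal and torrent.PiecePriorityNone:
package main

import "fmt"

type piecePriority int // stand-in for the torrent library's piece priority

const (
    priorityNone piecePriority = iota
    priorityNormal
    priorityHigh
)

// priorityFromString mirrors the switch used in SetFilePriority above.
func priorityFromString(p string) piecePriority {
    switch p {
    case "High":
        return priorityHigh
    case "Cancel":
        return priorityNone
    default: // "Normal" and anything unrecognized
        return priorityNormal
    }
}

func main() {
    fmt.Println(priorityFromString("High"), priorityFromString("Cancel"), priorityFromString("Other"))
}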
//CalculateTorrentSpeed calculates the torrent upload and download speed over time; c is the current ClientDB, oc is the previous ClientDB used to compute the delta
func CalculateTorrentSpeed(t *torrent.Torrent, c *ClientDB, oc ClientDB) {
func CalculateTorrentSpeed(t *torrent.Torrent, c *ClientDB, oc ClientDB, completedSize int64) {
now := time.Now()
bytes := t.BytesCompleted()
bytes := completedSize
bytesUpload := t.Stats().BytesWrittenData
dt := float32(now.Sub(oc.UpdatedAt)) // get the delta time length between now and last updated
db := float32(bytes - oc.BytesCompleted) //getting the delta bytes
rate := db * (float32(time.Second) / dt) // converting into seconds
dbU := float32(bytesUpload - oc.DataBytesWritten)
dbU := float32(bytesUpload.Int64() - oc.DataBytesWritten)
rateUpload := dbU * (float32(time.Second) / dt)
if rate >= 0 {
rate = rate / 1024 / 1024 //creating integer to calculate ETA
c.DownloadSpeed = fmt.Sprintf("%.2f", rate)
rateMB := rate / 1024 / 1024 //creating MB to calculate ETA
c.DownloadSpeed = fmt.Sprintf("%.2f", rateMB)
c.DownloadSpeed = c.DownloadSpeed + " MB/s"
c.downloadSpeedInt = int64(rate)
}
@@ -84,20 +127,48 @@ func CalculateTorrentSpeed(t *torrent.Torrent, c *ClientDB, oc ClientDB) {
}
//CalculateDownloadSize calculates the total target download size once file priorities are sorted out (files set to Cancel are excluded)
func CalculateDownloadSize(tFromStorage *Storage.TorrentLocal) {
func CalculateDownloadSize(tFromStorage *Storage.TorrentLocal, activeTorrent *torrent.Torrent) int64 {
var totalLength int64
for _, file := range tFromStorage.TorrentFilePriority {
if file.TorrentFilePriority != "Cancel" {
totalLength = totalLength + file.TorrentFileSize
}
}
return totalLength
}
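A worked stand-alone sketch of the sum above, with stand-in types in place of the stored TorrentFilePriority entries: two kept files of 1 GB and 500 MB plus a canceled 2 GB file give a 1.5 GB target size.
package main

import "fmt"

type filePriority struct { // stand-in for a stored TorrentFilePriority entry
    Priority string
    Size     int64
}

// downloadSize mirrors CalculateDownloadSize: canceled files are excluded from the total.
func downloadSize(files []filePriority) int64 {
    var total int64
    for _, f := range files {
        if f.Priority != "Cancel" {
            total += f.Size
        }
    }
    return total
}

func main() {
    files := []filePriority{
        {Priority: "Normal", Size: 1000000000},
        {Priority: "High", Size: 500000000},
        {Priority: "Cancel", Size: 2000000000},
    }
    fmt.Println(downloadSize(files)) // 1500000000
}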
//CalculateCompletedSize calculates how much of the torrent has actually been completed, excluding canceled files (even if they have been partially downloaded)
func CalculateCompletedSize(tFromStorage *Storage.TorrentLocal, activeTorrent *torrent.Torrent) int64 {
var discardByteLength int64
for _, storageFile := range tFromStorage.TorrentFilePriority {
if storageFile.TorrentFilePriority == "Cancel" { //If the file is canceled don't count it as downloaded
for _, activeFile := range activeTorrent.Files() {
if activeFile.DisplayPath() == storageFile.TorrentFilePath { //match the file from storage to active
for _, piece := range activeFile.State() {
if piece.Partial || piece.Complete {
discardByteLength = discardByteLength + piece.Bytes
}
}
}
}
}
}
downloadedLength := activeTorrent.BytesCompleted() - discardByteLength
if downloadedLength < 0 {
downloadedLength = 0
}
return downloadedLength
}
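A worked sketch of the subtraction above (the values are illustrative): if the client reports 900 MB completed and 100 MB of partial or complete pieces belong to canceled files, the effective completed size is 800 MB, and the result is clamped at zero so it never goes negative.
package main

import "fmt"

// completedSize mirrors the final step of CalculateCompletedSize.
func completedSize(bytesCompleted, canceledPieceBytes int64) int64 {
    downloaded := bytesCompleted - canceledPieceBytes
    if downloaded < 0 {
        downloaded = 0 // canceled files can account for more piece bytes than are counted as completed
    }
    return downloaded
}

func main() {
    fmt.Println(completedSize(900000000, 100000000)) // 800000000
    fmt.Println(completedSize(50000000, 100000000))  // 0 (clamped)
}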
//CalculateTorrentETA estimates the remaining download time of the torrent based on the current download speed
func CalculateTorrentETA(tSize int64, tBytesCompleted int64, c *ClientDB) {
missingBytes := tSize - tBytesCompleted
missingMB := missingBytes / 1024 / 1024
if missingMB == 0 {
if missingBytes == 0 {
c.ETA = "Done"
} else if c.downloadSpeedInt == 0 {
c.ETA = "N/A"
} else {
ETASeconds := missingMB / c.downloadSpeedInt
ETASeconds := missingBytes / c.downloadSpeedInt
str := secondsToMinutes(ETASeconds) //converting seconds to minutes + seconds
c.ETA = str
}
@@ -113,26 +184,247 @@ func CalculateUploadRatio(t *torrent.Torrent, c *ClientDB) string {
return uploadRatio
}
//CalculateTorrentStatus is used to determine what the STATUS column of the frontend will display
func CalculateTorrentStatus(t *torrent.Torrent, c *ClientDB, config FullClientSettings, tFromStorage *storage.TorrentLocal) {
if (tFromStorage.TorrentStatus == "Stopped") || (float64(c.TotalUploadedBytes)/float64(t.BytesCompleted()) >= config.SeedRatioStop && tFromStorage.TorrentUploadLimit == true) { //If storage shows torrent stopped or if it is over the seeding ratio AND is under the global limit
c.Status = "Stopped"
c.MaxConnections = 0
t.SetMaxEstablishedConns(0)
} else { //Only has 2 states in storage, stopped or running, so we know it should be running, and the websocket request handled updating the database with connections and status
c.MaxConnections = 80
t.SetMaxEstablishedConns(80) //TODO this should not be needed but apparently is needed
t.DownloadAll() //ensure that we are setting the torrent to download
if t.Seeding() && t.Stats().ActivePeers > 0 && t.BytesMissing() == 0 {
c.Status = "Seeding"
} else if t.Stats().ActivePeers > 0 && t.BytesMissing() > 0 {
c.Status = "Downloading"
} else if t.Stats().ActivePeers == 0 && t.BytesMissing() == 0 {
c.Status = "Completed"
} else if t.Stats().ActivePeers == 0 && t.BytesMissing() > 0 {
c.Status = "Awaiting Peers"
} else {
c.Status = "Unknown"
//StopTorrent stops the torrent, updates the database and sends a message. Since StopTorrent is called for each torrent individually there is no need to operate on an array
func StopTorrent(singleTorrent *torrent.Torrent, torrentLocalStorage *Storage.TorrentLocal, db *storm.DB) {
if torrentLocalStorage.TorrentStatus == "Stopped" { //if we are already stopped
Logger.WithFields(logrus.Fields{"Torrent Name": torrentLocalStorage.TorrentName}).Info("Torrent Already Stopped, returning...")
return
}
torrentLocalStorage.TorrentStatus = "Stopped"
torrentLocalStorage.MaxConnections = 0
singleTorrent.SetMaxEstablishedConns(0)
DeleteTorrentFromQueues(singleTorrent.InfoHash().String(), db)
Storage.UpdateStorageTick(db, *torrentLocalStorage)
CreateServerPushMessage(ServerPushMessage{MessageType: "serverPushMessage", MessageLevel: "success", Payload: "Torrent Stopped!"}, Conn)
Logger.WithFields(logrus.Fields{"Torrent Name": torrentLocalStorage.TorrentName}).Info("Torrent Stopped Success!")
}
//AddTorrentToForceStart forces a torrent to high priority on start
func AddTorrentToForceStart(torrentLocalStorage *Storage.TorrentLocal, singleTorrent *torrent.Torrent, db *storm.DB) {
torrentQueues := Storage.FetchQueues(db)
for index, torrentHash := range torrentQueues.ActiveTorrents {
if torrentHash == singleTorrent.InfoHash().String() { //If torrent already in active remove from active
torrentQueues.ActiveTorrents = append(torrentQueues.ActiveTorrents[:index], torrentQueues.ActiveTorrents[index+1:]...)
}
}
for index, queuedTorrentHash := range torrentQueues.QueuedTorrents { //Removing from the queued torrents if in queued torrents
if queuedTorrentHash == singleTorrent.InfoHash().String() {
torrentQueues.QueuedTorrents = append(torrentQueues.QueuedTorrents[:index], torrentQueues.QueuedTorrents[index+1:]...)
}
}
singleTorrent.NewReader()
singleTorrent.SetMaxEstablishedConns(80)
torrentQueues.ActiveTorrents = append(torrentQueues.ActiveTorrents, singleTorrent.InfoHash().String())
torrentLocalStorage.TorrentStatus = "ForceStart"
torrentLocalStorage.MaxConnections = 80
for _, file := range singleTorrent.Files() {
for _, sentFile := range torrentLocalStorage.TorrentFilePriority {
if file.DisplayPath() == sentFile.TorrentFilePath {
switch sentFile.TorrentFilePriority {
case "High":
file.SetPriority(torrent.PiecePriorityHigh)
case "Normal":
file.SetPriority(torrent.PiecePriorityNormal)
case "Cancel":
file.SetPriority(torrent.PiecePriorityNone)
default:
file.SetPriority(torrent.PiecePriorityNormal)
}
}
}
}
Logger.WithFields(logrus.Fields{"Torrent Name": torrentLocalStorage.TorrentName}).Info("Adding Torrent to ForceStart Queue")
Storage.UpdateStorageTick(db, *torrentLocalStorage)
Storage.UpdateQueues(db, torrentQueues)
}
//AddTorrentToActive adds a torrent to the active slice
func AddTorrentToActive(torrentLocalStorage *Storage.TorrentLocal, singleTorrent *torrent.Torrent, db *storm.DB) {
torrentQueues := Storage.FetchQueues(db)
if torrentLocalStorage.TorrentStatus == "Stopped" {
Logger.WithFields(logrus.Fields{"Torrent Name": torrentLocalStorage.TorrentName}).Info("Torrent set as stopped, skipping add")
return
}
for _, torrentHash := range torrentQueues.ActiveTorrents {
if torrentHash == singleTorrent.InfoHash().String() { //If torrent already in active skip
return
}
}
for index, queuedTorrentHash := range torrentQueues.QueuedTorrents { //Removing from the queued torrents if in queued torrents
if queuedTorrentHash == singleTorrent.InfoHash().String() {
torrentQueues.QueuedTorrents = append(torrentQueues.QueuedTorrents[:index], torrentQueues.QueuedTorrents[index+1:]...)
}
}
singleTorrent.NewReader()
singleTorrent.SetMaxEstablishedConns(80)
torrentQueues.ActiveTorrents = append(torrentQueues.ActiveTorrents, singleTorrent.InfoHash().String())
torrentLocalStorage.TorrentStatus = "Running"
torrentLocalStorage.MaxConnections = 80
for _, file := range singleTorrent.Files() {
for _, sentFile := range torrentLocalStorage.TorrentFilePriority {
if file.DisplayPath() == sentFile.TorrentFilePath {
switch sentFile.TorrentFilePriority {
case "High":
file.SetPriority(torrent.PiecePriorityHigh)
case "Normal":
file.SetPriority(torrent.PiecePriorityNormal)
case "Cancel":
file.SetPriority(torrent.PiecePriorityNone)
default:
file.SetPriority(torrent.PiecePriorityNormal)
}
}
}
}
Logger.WithFields(logrus.Fields{"Torrent Name": torrentLocalStorage.TorrentName}).Info("Adding Torrent to Active Queue (Manual Call)")
Storage.UpdateStorageTick(db, *torrentLocalStorage)
Storage.UpdateQueues(db, torrentQueues)
}
//RemoveTorrentFromActive removes a torrent from the active list when the active limit has already been reached and the user forces a new torrent to be added
func RemoveTorrentFromActive(torrentLocalStorage *Storage.TorrentLocal, singleTorrent *torrent.Torrent, db *storm.DB) {
torrentQueues := Storage.FetchQueues(db)
for x, torrentHash := range torrentQueues.ActiveTorrents {
if torrentHash == singleTorrent.InfoHash().String() {
torrentQueues.ActiveTorrents = append(torrentQueues.ActiveTorrents[:x], torrentQueues.ActiveTorrents[x+1:]...)
torrentQueues.QueuedTorrents = append(torrentQueues.QueuedTorrents, torrentHash)
torrentLocalStorage.TorrentStatus = "Queued"
torrentLocalStorage.MaxConnections = 0
singleTorrent.SetMaxEstablishedConns(0)
Storage.UpdateQueues(db, torrentQueues)
//AddTorrentToQueue(torrentLocalStorage, singleTorrent, db) //Adding the lasttorrent from active to queued
Storage.UpdateStorageTick(db, *torrentLocalStorage)
}
}
}
//DeleteTorrentFromQueues deletes the torrent from all queues (for a stop or delete action)
func DeleteTorrentFromQueues(torrentHash string, db *storm.DB) {
torrentQueues := Storage.FetchQueues(db)
for x, torrentHashActive := range torrentQueues.ActiveTorrents { //FOR EXTRA CAUTION deleting it from both queues in case a mistake occurred.
if torrentHash == torrentHashActive {
torrentQueues.ActiveTorrents = append(torrentQueues.ActiveTorrents[:x], torrentQueues.ActiveTorrents[x+1:]...)
Logger.Info("Removing Torrent from Active: ", torrentHash)
}
}
for x, torrentHashQueued := range torrentQueues.QueuedTorrents { //FOR EXTRA CAUTION deleting it from both queues in case a mistake occurred.
if torrentHash == torrentHashQueued {
torrentQueues.QueuedTorrents = append(torrentQueues.QueuedTorrents[:x], torrentQueues.QueuedTorrents[x+1:]...)
Logger.Info("Removing Torrent from Queued", torrentHash)
}
}
for x, torrentHashActive := range torrentQueues.ForcedTorrents { //FOR EXTRA CAUTION deleting it from all queues in case a mistake occurred.
if torrentHash == torrentHashActive {
torrentQueues.ForcedTorrents = append(torrentQueues.ForcedTorrents[:x], torrentQueues.ForcedTorrents[x+1:]...)
Logger.Info("Removing Torrent from Forced: ", torrentHash)
}
}
Storage.UpdateQueues(db, torrentQueues)
Logger.WithFields(logrus.Fields{"Torrent Hash": torrentHash, "TorrentQueues": torrentQueues}).Info("Removing Torrent from all Queues")
}
//AddTorrentToQueue adds a torrent to the queue
func AddTorrentToQueue(torrentLocalStorage *Storage.TorrentLocal, singleTorrent *torrent.Torrent, db *storm.DB) {
torrentQueues := Storage.FetchQueues(db)
for _, torrentHash := range torrentQueues.QueuedTorrents {
if singleTorrent.InfoHash().String() == torrentHash { //don't add a duplicate to the queue but do everything else (TODO, maybe find a better way?)
singleTorrent.SetMaxEstablishedConns(0)
torrentLocalStorage.MaxConnections = 0
torrentLocalStorage.TorrentStatus = "Queued"
Logger.WithFields(logrus.Fields{"TorrentName": torrentLocalStorage.TorrentName}).Info("Adding torrent to the queue, not active")
Storage.UpdateStorageTick(db, *torrentLocalStorage)
return
}
}
torrentQueues.QueuedTorrents = append(torrentQueues.QueuedTorrents, singleTorrent.InfoHash().String())
singleTorrent.SetMaxEstablishedConns(0)
torrentLocalStorage.MaxConnections = 0
torrentLocalStorage.TorrentStatus = "Queued"
Logger.WithFields(logrus.Fields{"TorrentName": torrentLocalStorage.TorrentName}).Info("Adding torrent to the queue, not active")
Storage.UpdateQueues(db, torrentQueues)
Storage.UpdateStorageTick(db, *torrentLocalStorage)
}
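A small stand-alone sketch of the duplicate guard above: a hash is appended to the queued list only if it is not already present (in the real function the connection count and status are updated either way).
package main

import "fmt"

// addToQueue mirrors the duplicate check in AddTorrentToQueue.
func addToQueue(queued []string, hash string) []string {
    for _, h := range queued {
        if h == hash {
            return queued // already queued, don't add it twice
        }
    }
    return append(queued, hash)
}

func main() {
    q := addToQueue([]string{"abc123"}, "abc123") // no-op, already queued
    q = addToQueue(q, "def456")
    fmt.Println(q) // [abc123 def456]
}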
//RemoveDuplicatesFromQueues removes any duplicates from torrentQueues.QueuedTorrents (which will happen if it is read in from DB)
func RemoveDuplicatesFromQueues(db *storm.DB) {
torrentQueues := Storage.FetchQueues(db)
for _, torrentHash := range torrentQueues.ActiveTorrents {
for i, queuedHash := range torrentQueues.QueuedTorrents {
if torrentHash == queuedHash {
torrentQueues.QueuedTorrents = append(torrentQueues.QueuedTorrents[:i], torrentQueues.QueuedTorrents[i+1:]...)
}
}
}
Storage.UpdateQueues(db, torrentQueues)
}
//ValidateQueues is a sanity check that runs every tick to make sure the queues are in order... tried to avoid this but seems to be required
func ValidateQueues(db *storm.DB, config Settings.FullClientSettings, tclient *torrent.Client) {
torrentQueues := Storage.FetchQueues(db)
for len(torrentQueues.ActiveTorrents) > config.MaxActiveTorrents {
removeTorrent := torrentQueues.ActiveTorrents[:1]
for _, singleTorrent := range tclient.Torrents() {
if singleTorrent.InfoHash().String() == removeTorrent[0] {
singleTorrentFromStorage := Storage.FetchTorrentFromStorage(db, removeTorrent[0])
RemoveTorrentFromActive(&singleTorrentFromStorage, singleTorrent, db)
}
}
}
torrentQueues = Storage.FetchQueues(db)
for _, singleTorrent := range tclient.Torrents() {
singleTorrentFromStorage := Storage.FetchTorrentFromStorage(db, singleTorrent.InfoHash().String())
if singleTorrentFromStorage.TorrentStatus == "Stopped" {
continue
}
for _, queuedTorrent := range torrentQueues.QueuedTorrents { //If we have a queued torrent that is missing data, and an active torrent that is seeding, then prioritize the missing data one
if singleTorrent.InfoHash().String() == queuedTorrent {
if singleTorrent.BytesMissing() > 0 {
for _, activeTorrent := range torrentQueues.ActiveTorrents {
for _, singleActiveTorrent := range tclient.Torrents() {
if activeTorrent == singleActiveTorrent.InfoHash().String() {
if singleActiveTorrent.Seeding() {
singleActiveTFS := Storage.FetchTorrentFromStorage(db, activeTorrent)
Logger.WithFields(logrus.Fields{"TorrentName": singleActiveTFS.TorrentName}).Info("Seeding, Removing from active to add queued")
RemoveTorrentFromActive(&singleActiveTFS, singleActiveTorrent, db)
singleQueuedTFS := Storage.FetchTorrentFromStorage(db, queuedTorrent)
Logger.WithFields(logrus.Fields{"TorrentName": singleQueuedTFS.TorrentName}).Info("Adding torrent to the queue, not active")
AddTorrentToActive(&singleQueuedTFS, singleTorrent, db)
}
}
}
}
}
}
}
}
}
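A minimal sketch of the first invariant ValidateQueues enforces: while more torrents are active than MaxActiveTorrents allows, the oldest active hash is demoted to the queued list (the real code also updates storage and drops the torrent's connections).
package main

import "fmt"

// enforceLimit demotes active hashes until the active list fits the limit.
func enforceLimit(active, queued []string, max int) ([]string, []string) {
    for len(active) > max {
        demoted := active[0]
        active = active[1:]
        queued = append(queued, demoted)
    }
    return active, queued
}

func main() {
    active, queued := enforceLimit([]string{"a", "b", "c"}, nil, 2)
    fmt.Println(active, queued) // [b c] [a]
}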
//CalculateTorrentStatus is used to determine what the STATUS column of the frontend will display
func CalculateTorrentStatus(t *torrent.Torrent, c *ClientDB, config Settings.FullClientSettings, tFromStorage *storage.TorrentLocal, bytesCompleted int64, totalSize int64, torrentQueues Storage.TorrentQueues, db *storm.DB) {
if tFromStorage.TorrentStatus == "Stopped" {
c.Status = "Stopped"
return
}
//Only has 2 states in storage, stopped or running, so we know it should be running, and the websocket request handled updating the database with connections and status
for _, torrentHash := range torrentQueues.QueuedTorrents {
if tFromStorage.Hash == torrentHash {
c.Status = "Queued"
return
}
}
bytesMissing := totalSize - bytesCompleted
c.MaxConnections = 80
t.SetMaxEstablishedConns(80)
if t.Seeding() && t.Stats().ActivePeers > 0 && bytesMissing == 0 {
c.Status = "Seeding"
} else if t.Stats().ActivePeers > 0 && bytesMissing > 0 {
c.Status = "Downloading"
} else if t.Stats().ActivePeers == 0 && bytesMissing == 0 {
c.Status = "Completed"
} else if t.Stats().ActivePeers == 0 && bytesMissing > 0 {
c.Status = "Awaiting Peers"
} else {
c.Status = "Unknown"
}
}
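A stand-alone sketch of the decision order above (inputs are illustrative): stopped and queued states win first, then active peers and missing bytes pick the displayed status; the real function also consults t.Seeding() for the Seeding case.
package main

import "fmt"

func status(stopped, queued, seeding bool, activePeers int, bytesMissing int64) string {
    switch {
    case stopped:
        return "Stopped"
    case queued:
        return "Queued"
    case seeding && activePeers > 0 && bytesMissing == 0:
        return "Seeding"
    case activePeers > 0 && bytesMissing > 0:
        return "Downloading"
    case activePeers == 0 && bytesMissing == 0:
        return "Completed"
    case activePeers == 0 && bytesMissing > 0:
        return "Awaiting Peers"
    default:
        return "Unknown"
    }
}

func main() {
    fmt.Println(status(false, false, true, 4, 0))    // Seeding
    fmt.Println(status(false, false, false, 0, 512)) // Awaiting Peers
}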


@@ -1,208 +0,0 @@
package engine
import (
"fmt"
"path/filepath"
"golang.org/x/time/rate"
"github.com/anacrolix/dht"
"github.com/anacrolix/torrent"
"github.com/sirupsen/logrus"
"github.com/spf13/viper"
)
//FullClientSettings contains all of the settings for our entire application
type FullClientSettings struct {
LoggingLevel logrus.Level
LoggingOutput string
HTTPAddr string
Version int
TorrentConfig torrent.Config
TFileUploadFolder string
SeedRatioStop float64
PushBulletToken string
DefaultMoveFolder string
TorrentWatchFolder string
}
//defaultConfig is returned if there is a parsing error
func defaultConfig() FullClientSettings {
var Config FullClientSettings
Config.Version = 1.0
Config.LoggingLevel = 3 //Warn level
Config.TorrentConfig.DataDir = "downloads" //the absolute or relative path of the default download directory for torrents
Config.TFileUploadFolder = "uploadedTorrents"
Config.TorrentConfig.Seed = true
Config.HTTPAddr = ":8000"
Config.SeedRatioStop = 1.50
Config.TorrentConfig.DHTConfig = dht.ServerConfig{
StartingNodes: dht.GlobalBootstrapAddrs,
}
return Config
}
func dhtServerSettings(dhtConfig dht.ServerConfig) dht.ServerConfig {
viper.UnmarshalKey("DHTConfig", &dhtConfig)
Logger.WithFields(logrus.Fields{"dhtConfig": dhtConfig}).Info("Displaying DHT Config")
return dhtConfig
}
func calculateRateLimiters(uploadRate, downloadRate string) (*rate.Limiter, *rate.Limiter) { //TODO reorg
var uploadRateLimiterSize int
var downloadRateLimiterSize int
switch uploadRate {
case "Low":
uploadRateLimiterSize = 50000
case "Medium":
uploadRateLimiterSize = 500000
case "High":
uploadRateLimiterSize = 1500000
default:
downloadRateLimiter := rate.NewLimiter(rate.Inf, 0)
uploadRateLimiter := rate.NewLimiter(rate.Inf, 0)
return downloadRateLimiter, uploadRateLimiter
}
switch downloadRate {
case "Low":
downloadRateLimiterSize = 50000
case "Medium":
downloadRateLimiterSize = 500000
case "High":
downloadRateLimiterSize = 1500000
default:
downloadRateLimiter := rate.NewLimiter(rate.Inf, 0)
uploadRateLimiter := rate.NewLimiter(rate.Inf, 0)
return downloadRateLimiter, uploadRateLimiter
}
var limitPerSecondUl = rate.Limit(uploadRateLimiterSize)
uploadRateLimiter := rate.NewLimiter(limitPerSecondUl, uploadRateLimiterSize)
var limitPerSecondDl = rate.Limit(downloadRateLimiterSize)
downloadRateLimiter := rate.NewLimiter(limitPerSecondDl, downloadRateLimiterSize)
return downloadRateLimiter, uploadRateLimiter
}
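A minimal sketch of the tiers above using golang.org/x/time/rate (the limiterFor helper name is illustrative and covers a single direction): "Low", "Medium" and "High" map to roughly 50 KB/s, 500 KB/s and 1.5 MB/s with a burst of the same size, and anything else falls back to an unlimited limiter.
package main

import (
    "fmt"

    "golang.org/x/time/rate"
)

// limiterFor mirrors one arm of calculateRateLimiters.
func limiterFor(tier string) *rate.Limiter {
    sizes := map[string]int{"Low": 50000, "Medium": 500000, "High": 1500000}
    if size, ok := sizes[tier]; ok {
        return rate.NewLimiter(rate.Limit(size), size)
    }
    return rate.NewLimiter(rate.Inf, 0) // unknown tier: unlimited
}

func main() {
    fmt.Println(limiterFor("Medium").Limit())                // 500000
    fmt.Println(limiterFor("whatever").Limit() == rate.Inf)  // true
}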
//FullClientSettingsNew creates a new set of setting from config.toml
func FullClientSettingsNew() FullClientSettings {
viper.SetConfigName("config")
viper.AddConfigPath("./")
err := viper.ReadInConfig()
if err != nil {
fmt.Println("Error reading in config, using defaults", err)
FullClientSettings := defaultConfig()
return FullClientSettings
}
var httpAddr string
httpAddrIP := viper.GetString("serverConfig.ServerAddr")
httpAddrPort := viper.GetString("serverConfig.ServerPort")
seedRatioStop := viper.GetFloat64("serverConfig.SeedRatioStop")
httpAddr = httpAddrIP + httpAddrPort
pushBulletToken := viper.GetString("notifications.PushBulletToken")
defaultMoveFolder := filepath.ToSlash(viper.GetString("serverConfig.DefaultMoveFolder")) //Converting the string literal into a filepath
defaultMoveFolderAbs, err := filepath.Abs(defaultMoveFolder)
if err != nil {
fmt.Println("Failed creating absolute path for defaultMoveFolder", err)
}
torrentWatchFolder := filepath.ToSlash(viper.GetString("serverConfig.TorrentWatchFolder"))
torrentWatchFolderAbs, err := filepath.Abs(torrentWatchFolder)
if err != nil {
fmt.Println("Failed creating absolute path for torrentWatchFolderAbs", err)
}
dataDir := filepath.ToSlash(viper.GetString("torrentClientConfig.DownloadDir")) //Converting the string literal into a filepath
dataDirAbs, err := filepath.Abs(dataDir) //Converting to an absolute file path
if err != nil {
fmt.Println("Failed creating absolute path for dataDir", err)
}
var uploadRateLimiter *rate.Limiter
var downloadRateLimiter *rate.Limiter
uploadRate := viper.GetString("serverConfig.UploadRateLimit")
downloadRate := viper.GetString("serverConfig.DownloadRateLimit")
downloadRateLimiter, uploadRateLimiter = calculateRateLimiters(uploadRate, downloadRate)
listenAddr := viper.GetString("torrentClientConfig.ListenAddr")
disablePex := viper.GetBool("torrentClientConfig.DisablePEX")
noDHT := viper.GetBool("torrentClientConfig.NoDHT")
noUpload := viper.GetBool("torrentClientConfig.NoUpload")
seed := viper.GetBool("torrentClientConfig.Seed")
peerID := viper.GetString("torrentClientConfig.PeerID")
disableUTP := viper.GetBool("torrentClientConfig.DisableUTP")
disableTCP := viper.GetBool("torrentClientConfig.DisableTCP")
disableIPv6 := viper.GetBool("torrentClientConfig.DisableIPv6")
debug := viper.GetBool("torrentClientConfig.Debug")
logLevelString := viper.GetString("serverConfig.LogLevel")
logOutput := viper.GetString("serverConfig.LogOutput")
var logLevel logrus.Level
switch logLevelString { //Options = Debug 5, Info 4, Warn 3, Error 2, Fatal 1, Panic 0
case "Panic":
logLevel = 0
case "Fatal":
logLevel = 1
case "Error":
logLevel = 2
case "Warn":
logLevel = 3
case "Info":
logLevel = 4
case "Debug":
logLevel = 5
default:
logLevel = 3
}
dhtServerConfig := dht.ServerConfig{
StartingNodes: dht.GlobalBootstrapAddrs,
}
if viper.IsSet("DHTConfig") {
fmt.Println("Reading in custom DHT config")
dhtServerConfig = dhtServerSettings(dhtServerConfig)
}
encryptionPolicy := torrent.EncryptionPolicy{
DisableEncryption: viper.GetBool("EncryptionPolicy.DisableEncryption"),
ForceEncryption: viper.GetBool("EncryptionPolicy.ForceEncryption"),
PreferNoEncryption: viper.GetBool("EncryptionPolicy.PreferNoEncryption"),
}
tConfig := torrent.Config{
DataDir: dataDirAbs,
ListenAddr: listenAddr,
DisablePEX: disablePex,
NoDHT: noDHT,
DHTConfig: dhtServerConfig,
NoUpload: noUpload,
Seed: seed,
UploadRateLimiter: uploadRateLimiter,
DownloadRateLimiter: downloadRateLimiter,
PeerID: peerID,
DisableUTP: disableUTP,
DisableTCP: disableTCP,
DisableIPv6: disableIPv6,
Debug: debug,
EncryptionPolicy: encryptionPolicy,
}
Config := FullClientSettings{
LoggingLevel: logLevel,
LoggingOutput: logOutput,
SeedRatioStop: seedRatioStop,
HTTPAddr: httpAddr,
TorrentConfig: tConfig,
TFileUploadFolder: "uploadedTorrents",
PushBulletToken: pushBulletToken,
DefaultMoveFolder: defaultMoveFolderAbs,
TorrentWatchFolder: torrentWatchFolderAbs,
}
return Config
}

30
go.mod Normal file

@@ -0,0 +1,30 @@
module github.com/deranjer/goTorrent
go 1.12
require (
github.com/BurntSushi/toml v0.3.1 // indirect
github.com/DataDog/zstd v1.3.5 // indirect
github.com/PuerkitoBio/goquery v1.5.0 // indirect
github.com/Sereal/Sereal v0.0.0-20190226181601-237c2cca198f // indirect
github.com/anacrolix/dht v1.0.1
github.com/anacrolix/torrent v1.1.1
github.com/asdine/storm v2.1.2+incompatible
github.com/dgrijalva/jwt-go v3.2.0+incompatible
github.com/golang/protobuf v1.3.1 // indirect
github.com/gorilla/handlers v1.4.0
github.com/gorilla/mux v1.7.0
github.com/gorilla/websocket v1.4.0
github.com/mitsuse/pushbullet-go v0.1.0
github.com/mmcdole/gofeed v1.0.0-beta2
github.com/mmcdole/goxpp v0.0.0-20181012175147-0068e33feabf // indirect
github.com/otiai10/copy v1.0.1
github.com/otiai10/curr v0.0.0-20150429015615-9b4961190c95 // indirect
github.com/robfig/cron v0.0.0-20180505203441-b41be1df6967
github.com/sirupsen/logrus v1.4.0
github.com/spf13/viper v1.3.2
github.com/vmihailenco/msgpack v4.0.3+incompatible // indirect
go.etcd.io/bbolt v1.3.2 // indirect
golang.org/x/time v0.0.0-20190308202827-9d24e82272b4
google.golang.org/appengine v1.5.0 // indirect
)

230
go.sum Normal file

@@ -0,0 +1,230 @@
bazil.org/fuse v0.0.0-20180421153158-65cc252bf669/go.mod h1:Xbm+BRKSBEpa4q4hTSxohYNQpsxXPbPry4JJWOB3LB8=
bou.ke/monkey v1.0.1 h1:zEMLInw9xvNakzUUPjfS4Ds6jYPqCFx3m7bRmG5NH2U=
bou.ke/monkey v1.0.1/go.mod h1:FgHuK96Rv2Nlf+0u1OOVDpCMdsWyOFmeeketDHE7LIg=
github.com/BurntSushi/toml v0.3.1 h1:WXkYYl6Yr3qBf1K79EBnL4mak0OimBfB0XUf9Vl28OQ=
github.com/BurntSushi/toml v0.3.1/go.mod h1:xHWCNGjB5oqiDr8zfno3MHue2Ht5sIBksp03qcyfWMU=
github.com/DataDog/zstd v1.3.5 h1:DtpNbljikUepEPD16hD4LvIcmhnhdLTiW/5pHgbmp14=
github.com/DataDog/zstd v1.3.5/go.mod h1:1jcaCB/ufaK+sKp1NBhlGmpz41jOoPQ35bpF36t7BBo=
github.com/PuerkitoBio/goquery v1.5.0 h1:uGvmFXOA73IKluu/F84Xd1tt/z07GYm8X49XKHP7EJk=
github.com/PuerkitoBio/goquery v1.5.0/go.mod h1:qD2PgZ9lccMbQlc7eEOjaeRlFQON7xY8kdmcsrnKqMg=
github.com/RoaringBitmap/roaring v0.4.7/go.mod h1:8khRDP4HmeXns4xIj9oGrKSz7XTQiJx2zgh7AcNke4w=
github.com/RoaringBitmap/roaring v0.4.17 h1:oCYFIFEMSQZrLHpywH7919esI1VSrQZ0pJXkZPGIJ78=
github.com/RoaringBitmap/roaring v0.4.17/go.mod h1:D3qVegWTmfCaX4Bl5CrBE9hfrSrrXIr8KVNvRsDi1NI=
github.com/Sereal/Sereal v0.0.0-20190226181601-237c2cca198f h1:99C4f5FJQChWyzMSpZPU4eUv3kjFmjxyWy8t2rlbUcs=
github.com/Sereal/Sereal v0.0.0-20190226181601-237c2cca198f/go.mod h1:D0JMgToj/WdxCgd30Kc1UcA9E+WdZoJqeVOuYW7iTBM=
github.com/anacrolix/dht v0.0.0-20180412060941-24cbf25b72a4/go.mod h1:hQfX2BrtuQsLQMYQwsypFAab/GvHg8qxwVi4OJdR1WI=
github.com/anacrolix/dht v0.0.0-20181129074040-b09db78595aa/go.mod h1:Ayu4t+5TsHQ07/P8XzRJqVofv7lU4R1ZTT7KW5+SPFA=
github.com/anacrolix/dht v1.0.1 h1:a7zVMiZWfPiToAUbjMZYeI3UvmsDP3j8vH5EDIAjM9c=
github.com/anacrolix/dht v1.0.1/go.mod h1:dtcIktBFD8YD/7ZcE5nQuuGGfLxcwa8+18mHl+GU+KA=
github.com/anacrolix/dht/v2 v2.0.1 h1:gOHJ+OKqJ4Eb48OYStZm4AlWr1/nSA2TWlzb/+t36SA=
github.com/anacrolix/dht/v2 v2.0.1/go.mod h1:GbTT8BaEtfqab/LPd5tY41f3GvYeii3mmDUK300Ycyo=
github.com/anacrolix/envpprof v0.0.0-20180404065416-323002cec2fa h1:xCaATLKmn39QqLs3tUZYr6eKvezJV+FYvVOLTklxK6U=
github.com/anacrolix/envpprof v0.0.0-20180404065416-323002cec2fa/go.mod h1:KgHhUaQMc8cC0+cEflSgCFNFbKwi5h54gqtVn8yhP7c=
github.com/anacrolix/go-libutp v0.0.0-20180522111405-6baeb806518d/go.mod h1:beQSaSxwH2d9Eeu5ijrEnHei5Qhk+J6cDm1QkWFru4E=
github.com/anacrolix/go-libutp v0.0.0-20180808010927-aebbeb60ea05 h1:Zoniih3jyqtr3I0xFoMvw1USWpg+CbI/zOrcLudr0lc=
github.com/anacrolix/go-libutp v0.0.0-20180808010927-aebbeb60ea05/go.mod h1:POY/GPlrFKRxnOKH1sGAB+NBWMoP+sI+hHJxgcgWbWw=
github.com/anacrolix/log v0.0.0-20180412014343-2323884b361d/go.mod h1:sf/7c2aTldL6sRQj/4UKyjgVZBu2+M2z9wf7MmwPiew=
github.com/anacrolix/log v0.1.0/go.mod h1:sf/7c2aTldL6sRQj/4UKyjgVZBu2+M2z9wf7MmwPiew=
github.com/anacrolix/log v0.2.0 h1:LzaW6XTEk2zcmLZkcZPkJ2mDdnZkOdOTeBH7Kt81ouU=
github.com/anacrolix/log v0.2.0/go.mod h1:sf/7c2aTldL6sRQj/4UKyjgVZBu2+M2z9wf7MmwPiew=
github.com/anacrolix/missinggo v0.0.0-20180522035225-b4a5853e62ff/go.mod h1:b0p+7cn+rWMIphK1gDH2hrDuwGOcbB6V4VXeSsEfHVk=
github.com/anacrolix/missinggo v0.0.0-20180725070939-60ef2fbf63df/go.mod h1:kwGiTUTZ0+p4vAz3VbAI5a30t2YbvemcmspjKwrAz5s=
github.com/anacrolix/missinggo v0.0.0-20181129073415-3237bf955fed/go.mod h1:IN+9GUe7OxKMIs/XeXEbT/rMUolmJzmlZiXHS7FwD/Y=
github.com/anacrolix/missinggo v0.2.1-0.20190310234110-9fbdc9f242a8/go.mod h1:MBJu3Sk/k3ZfGYcS7z18gwfu72Ey/xopPFJJbTi5yIo=
github.com/anacrolix/missinggo v1.1.0 h1:0lZbaNa6zTR1bELAIzCNmRGAtkHuLDPJqTiTtXoAIx8=
github.com/anacrolix/missinggo v1.1.0/go.mod h1:MBJu3Sk/k3ZfGYcS7z18gwfu72Ey/xopPFJJbTi5yIo=
github.com/anacrolix/mmsg v0.0.0-20180515031531-a4a3ba1fc8bb/go.mod h1:x2/ErsYUmT77kezS63+wzZp8E3byYB0gzirM/WMBLfw=
github.com/anacrolix/mmsg v0.0.0-20180808012353-5adb2c1127c0 h1:Fa1XqqLW62lQzEDlNA+QcdJbkfJcxQN0YC8983kj5tU=
github.com/anacrolix/mmsg v0.0.0-20180808012353-5adb2c1127c0/go.mod h1:x8kRaJY/dCrY9Al0PEcj1mb/uFHwP6GCJ9fLl4thEPc=
github.com/anacrolix/sync v0.0.0-20171108081538-eee974e4f8c1/go.mod h1:+u91KiUuf0lyILI6x3n/XrW7iFROCZCG+TjgK8nW52w=
github.com/anacrolix/sync v0.0.0-20180611022320-3c4cb11f5a01/go.mod h1:+u91KiUuf0lyILI6x3n/XrW7iFROCZCG+TjgK8nW52w=
github.com/anacrolix/sync v0.0.0-20180808010631-44578de4e778 h1:XpCDEixzXOB8yaTW/4YBzKrJdMcFI0DzpPTYNv75wzk=
github.com/anacrolix/sync v0.0.0-20180808010631-44578de4e778/go.mod h1:s735Etp3joe/voe2sdaXLcqDdJSay1O0OPnM0ystjqk=
github.com/anacrolix/tagflag v0.0.0-20180109131632-2146c8d41bf0/go.mod h1:1m2U/K6ZT+JZG0+bdMK6qauP49QT4wE5pmhJXOKKCHw=
github.com/anacrolix/tagflag v0.0.0-20180605133421-f477c8c2f14c/go.mod h1:1m2U/K6ZT+JZG0+bdMK6qauP49QT4wE5pmhJXOKKCHw=
github.com/anacrolix/tagflag v0.0.0-20180803105420-3a8ff5428f76/go.mod h1:1m2U/K6ZT+JZG0+bdMK6qauP49QT4wE5pmhJXOKKCHw=
github.com/anacrolix/torrent v0.0.0-20180622074351-fefeef4ee9eb/go.mod h1:3vcFVxgOASslNXHdivT8spyMRBanMCenHRpe0u5vpBs=
github.com/anacrolix/torrent v1.0.1/go.mod h1:ZYV1Z2Wx3jXYSh26mDvneAbk8XIUxfvoVil2GW962zY=
github.com/anacrolix/torrent v1.1.1 h1:f54cvN3950x72hOB8UvzRwEbF5AY3VMj4vPyntgt24Q=
github.com/anacrolix/torrent v1.1.1/go.mod h1:XdYEuC3KuxFQZrQ6iUBXnwKr3IyxeyUlVH6RT8FhyaU=
github.com/anacrolix/utp v0.0.0-20180219060659-9e0e1d1d0572 h1:kpt6TQTVi6gognY+svubHfxxpq0DLU9AfTQyZVc3UOc=
github.com/anacrolix/utp v0.0.0-20180219060659-9e0e1d1d0572/go.mod h1:MDwc+vsGEq7RMw6lr2GKOEqjWny5hO5OZXRVNaBJ2Dk=
github.com/andybalholm/cascadia v1.0.0 h1:hOCXnnZ5A+3eVDX8pvgl4kofXv2ELss0bKcqRySc45o=
github.com/andybalholm/cascadia v1.0.0/go.mod h1:GsXiBklL0woXo1j/WYWtSYYC4ouU9PqHO0sqidkEA4Y=
github.com/armon/consul-api v0.0.0-20180202201655-eb2c6b5be1b6/go.mod h1:grANhF5doyWs3UAsr3K4I6qtAmlQcZDesFNEHPZAzj8=
github.com/asdine/storm v2.1.2+incompatible h1:dczuIkyqwY2LrtXPz8ixMrU/OFgZp71kbKTHGrXYt/Q=
github.com/asdine/storm v2.1.2+incompatible/go.mod h1:RarYDc9hq1UPLImuiXK3BIWPJLdIygvV3PsInK0FbVQ=
github.com/boltdb/bolt v1.3.1 h1:JQmyP4ZBrce+ZQu0dY660FMfatumYDLun9hBCUVIkF4=
github.com/boltdb/bolt v1.3.1/go.mod h1:clJnj/oiGkjum5o1McbSZDSLxVThjynRyGBgiAx27Ps=
github.com/bradfitz/iter v0.0.0-20140124041915-454541ec3da2/go.mod h1:PyRFw1Lt2wKX4ZVSQ2mk+PeDa1rxyObEDlApuIsUKuo=
github.com/bradfitz/iter v0.0.0-20190303215204-33e6a9893b0c h1:FUUopH4brHNO2kJoNN3pV+OBEYmgraLT/KHZrMM69r0=
github.com/bradfitz/iter v0.0.0-20190303215204-33e6a9893b0c/go.mod h1:PyRFw1Lt2wKX4ZVSQ2mk+PeDa1rxyObEDlApuIsUKuo=
github.com/coreos/etcd v3.3.10+incompatible/go.mod h1:uF7uidLiAD3TWHmW31ZFd/JWoc32PjwdhPthX9715RE=
github.com/coreos/go-etcd v2.0.0+incompatible/go.mod h1:Jez6KQU2B/sWsbdaef3ED8NzMklzPG4d5KIOhIy30Tk=
github.com/coreos/go-semver v0.2.0/go.mod h1:nnelYz7RCh+5ahJtPPxZlU+153eP4D4r3EedlOD2RNk=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dgrijalva/jwt-go v3.2.0+incompatible h1:7qlOGliEKZXTDg6OTjfoBKDXWrumCAMpl/TFQ4/5kLM=
github.com/dgrijalva/jwt-go v3.2.0+incompatible/go.mod h1:E3ru+11k8xSBh+hMPgOLZmtrrCbhqsmaPHjLKYnJCaQ=
github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815/go.mod h1:WwZ+bS3ebgob9U8Nd0kOddGdZWjyMGR8Wziv+TBNwSE=
github.com/dustin/go-humanize v0.0.0-20180421182945-02af3965c54e/go.mod h1:HtrtbFcZ19U5GC7JDqmcUSB87Iq5E25KnS6fMYU6eOk=
github.com/dustin/go-humanize v1.0.0 h1:VSnTsYCnlFHaM2/igO1h6X3HA71jcobQuxemgkq4zYo=
github.com/dustin/go-humanize v1.0.0/go.mod h1:HtrtbFcZ19U5GC7JDqmcUSB87Iq5E25KnS6fMYU6eOk=
github.com/edsrzf/mmap-go v0.0.0-20170320065105-0bce6a688712/go.mod h1:YO35OhQPt3KJa3ryjFM5Bs14WD66h8eGKpfaBNrHW5M=
github.com/edsrzf/mmap-go v1.0.0 h1:CEBF7HpRnUCSJgGUb5h1Gm7e3VkmVDrR8lvWVLtrOFw=
github.com/edsrzf/mmap-go v1.0.0/go.mod h1:YO35OhQPt3KJa3ryjFM5Bs14WD66h8eGKpfaBNrHW5M=
github.com/elgatito/upnp v0.0.0-20180711183757-2f244d205f9a h1:2Zw3pxDRTs4nX1WCLAEm27UN0hvjZSge7EaUUQexRZw=
github.com/elgatito/upnp v0.0.0-20180711183757-2f244d205f9a/go.mod h1:afkYpY8JAIL4341N7Zj9xJ5yTovsg6BkWfBFlCzIoF4=
github.com/fsnotify/fsnotify v1.4.7 h1:IXs+QLmnXW2CcXuY+8Mzv/fWEsPGWxqefPtCP5CnV9I=
github.com/fsnotify/fsnotify v1.4.7/go.mod h1:jwhsz4b93w/PPRr/qN1Yymfu8t87LnFCMoQvtojpjFo=
github.com/glycerine/go-unsnap-stream v0.0.0-20180323001048-9f0cb55181dd/go.mod h1:/20jfyN9Y5QPEAprSgKAUr+glWDY39ZiUEAYOEv5dsE=
github.com/glycerine/go-unsnap-stream v0.0.0-20181221182339-f9677308dec2 h1:Ujru1hufTHVb++eG6OuNDKMxZnGIvF6o/u8q/8h2+I4=
github.com/glycerine/go-unsnap-stream v0.0.0-20181221182339-f9677308dec2/go.mod h1:/20jfyN9Y5QPEAprSgKAUr+glWDY39ZiUEAYOEv5dsE=
github.com/glycerine/goconvey v0.0.0-20180728074245-46e3a41ad493/go.mod h1:Ogl1Tioa0aV7gstGFO7KhffUsb9M4ydbEbbxpcEDc24=
github.com/glycerine/goconvey v0.0.0-20190315024820-982ee783a72e h1:SiEs4J3BKVIeaWrH3tKaz3QLZhJ68iJ/A4xrzIoE5+Y=
github.com/glycerine/goconvey v0.0.0-20190315024820-982ee783a72e/go.mod h1:Ogl1Tioa0aV7gstGFO7KhffUsb9M4ydbEbbxpcEDc24=
github.com/golang/protobuf v1.2.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.3.1 h1:YF8+flBXS5eO826T4nzqPrxfhQThhXl0YzfuUPu4SBg=
github.com/golang/protobuf v1.3.1/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/snappy v0.0.0-20180518054509-2e65f85255db/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/golang/snappy v0.0.1 h1:Qgr9rKW7uDUkrbSmQeiDsGa8SjGyCOGtuasMWwvp2P4=
github.com/golang/snappy v0.0.1/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
github.com/google/btree v0.0.0-20180124185431-e89373fe6b4a/go.mod h1:lNA+9X1NB3Zf8V7Ke586lFgjr2dZNuvo3lPJSGZ5JPQ=
github.com/google/btree v0.0.0-20180813153112-4030bb1f1f0c h1:964Od4U6p2jUkFxvCydnIczKteheJEzHRToSGK3Bnlw=
github.com/google/btree v0.0.0-20180813153112-4030bb1f1f0c/go.mod h1:lNA+9X1NB3Zf8V7Ke586lFgjr2dZNuvo3lPJSGZ5JPQ=
github.com/gopherjs/gopherjs v0.0.0-20181017120253-0766667cb4d1/go.mod h1:wJfORRmW1u3UXTncJ5qlYoELFm8eSnnEO6hX4iZ3EWY=
github.com/gopherjs/gopherjs v0.0.0-20181103185306-d547d1d9531e/go.mod h1:wJfORRmW1u3UXTncJ5qlYoELFm8eSnnEO6hX4iZ3EWY=
github.com/gopherjs/gopherjs v0.0.0-20190309154008-847fc94819f9 h1:Z0f701LpR4dqO92bP6TnIe3ZURClzJtBhds8R8u1HBE=
github.com/gopherjs/gopherjs v0.0.0-20190309154008-847fc94819f9/go.mod h1:wJfORRmW1u3UXTncJ5qlYoELFm8eSnnEO6hX4iZ3EWY=
github.com/gorilla/handlers v1.4.0 h1:XulKRWSQK5uChr4pEgSE4Tc/OcmnU9GJuSwdog/tZsA=
github.com/gorilla/handlers v1.4.0/go.mod h1:Qkdc/uu4tH4g6mTK6auzZ766c4CA0Ng8+o/OAirnOIQ=
github.com/gorilla/mux v1.7.0 h1:tOSd0UKHQd6urX6ApfOn4XdBMY6Sh1MfxV3kmaazO+U=
github.com/gorilla/mux v1.7.0/go.mod h1:1lud6UwP+6orDFRuTfBEV8e9/aOM/c4fVVCaMa2zaAs=
github.com/gorilla/websocket v1.4.0 h1:WDFjx/TMzVgy9VdMMQi2K2Emtwi2QcUQsztZ/zLaH/Q=
github.com/gorilla/websocket v1.4.0/go.mod h1:E7qHFY5m1UJ88s3WnNqhKjPHQ0heANvMoAMk2YaljkQ=
github.com/gosuri/uilive v0.0.0-20170323041506-ac356e6e42cd/go.mod h1:qkLSc0A5EXSP6B04TrN4oQoxqFI7A8XvoXSlJi8cwk8=
github.com/gosuri/uiprogress v0.0.0-20170224063937-d0567a9d84a1/go.mod h1:C1RTYn4Sc7iEyf6j8ft5dyoZ4212h8G1ol9QQluh5+0=
github.com/hashicorp/hcl v1.0.0 h1:0Anlzjpi4vEasTeNFn2mLJgTSwt0+6sfsiTG8qcWGx4=
github.com/hashicorp/hcl v1.0.0/go.mod h1:E5yfLk+7swimpb2L/Alb/PJmXilQ/rhwaUYs4T20WEQ=
github.com/huandu/xstrings v1.0.0/go.mod h1:4qWG/gcEcfX4z/mBDHJ++3ReCw9ibxbsNJbcucJdbSo=
github.com/huandu/xstrings v1.2.0 h1:yPeWdRnmynF7p+lLYz0H2tthW9lqhMJrQV/U7yy4wX0=
github.com/huandu/xstrings v1.2.0/go.mod h1:DvyZB1rfVYsBIigL8HwpZgxHwXozlTgGqn63UyNX5k4=
github.com/ipfs/go-ipfs v0.4.18/go.mod h1:iXzbK+Wa6eePj3jQg/uY6Uoq5iOwY+GToD/bgaRadto=
github.com/jessevdk/go-flags v1.4.0/go.mod h1:4FA24M0QyGHXBuZZK/XkWh8h0e1EYbRYJSGM75WSRxI=
github.com/jtolds/gls v4.2.1+incompatible/go.mod h1:QJZ7F/aHp+rZTRtaJ1ow/lLfFfVYBRgL+9YlvaHOwJU=
github.com/jtolds/gls v4.20.0+incompatible h1:xdiiI2gbIgH/gLH7ADydsJ1uDOEzR8yvV7C0MuV77Wo=
github.com/jtolds/gls v4.20.0+incompatible/go.mod h1:QJZ7F/aHp+rZTRtaJ1ow/lLfFfVYBRgL+9YlvaHOwJU=
github.com/konsorten/go-windows-terminal-sequences v1.0.1 h1:mweAR1A6xJ3oS2pRaGiHgQ4OO8tzTaLawm8vnODuwDk=
github.com/konsorten/go-windows-terminal-sequences v1.0.1/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=
github.com/magiconair/properties v1.8.0 h1:LLgXmsheXeRoUOBOjtwPQCWIYqM/LU1ayDtDePerRcY=
github.com/magiconair/properties v1.8.0/go.mod h1:PppfXfuXeibc/6YijjN8zIbojt8czPbwD3XqdrwzmxQ=
github.com/mattn/go-isatty v0.0.7/go.mod h1:Iq45c/XA43vh69/j3iqttzPXn0bhXyGjM0Hdxcsrc5s=
github.com/mattn/go-sqlite3 v1.7.0/go.mod h1:FPy6KqzDD04eiIsT53CuJW3U88zkxoIYsOqkbpncsNc=
github.com/mattn/go-sqlite3 v1.10.0 h1:jbhqpg7tQe4SupckyijYiy0mJJ/pRyHvXf7JdWK860o=
github.com/mattn/go-sqlite3 v1.10.0/go.mod h1:FPy6KqzDD04eiIsT53CuJW3U88zkxoIYsOqkbpncsNc=
github.com/mitchellh/mapstructure v1.1.2 h1:fmNYVwqnSfB9mZU6OS2O6GsXM+wcskZDuKQzvN1EDeE=
github.com/mitchellh/mapstructure v1.1.2/go.mod h1:FVVH3fgwuzCH5S8UJGiWEs2h04kUh9fWfEaFds41c1Y=
github.com/mitsuse/pushbullet-go v0.1.0 h1:W9izHOpz8uilRBgbYSnqb+LZK/l8Ad4slRTCBFpItG0=
github.com/mitsuse/pushbullet-go v0.1.0/go.mod h1:sJ6Y3IROSfSQNLY/8gtYjq4Gs49DFnrxaqxQA6DVgnM=
github.com/mmcdole/gofeed v1.0.0-beta2 h1:CjQ0ADhAwNSb08zknAkGOEYqr8zfZKfrzgk9BxpWP2E=
github.com/mmcdole/gofeed v1.0.0-beta2/go.mod h1:/BF9JneEL2/flujm8XHoxUcghdTV6vvb3xx/vKyChFU=
github.com/mmcdole/goxpp v0.0.0-20181012175147-0068e33feabf h1:sWGE2v+hO0Nd4yFU/S/mDBM5plIU8v/Qhfz41hkDIAI=
github.com/mmcdole/goxpp v0.0.0-20181012175147-0068e33feabf/go.mod h1:pasqhqstspkosTneA62Nc+2p9SOBBYAPbnmRRWPQ0V8=
github.com/mschoch/smat v0.0.0-20160514031455-90eadee771ae h1:VeRdUYdCw49yizlSbMEn2SZ+gT+3IUKx8BqxyQdz+BY=
github.com/mschoch/smat v0.0.0-20160514031455-90eadee771ae/go.mod h1:qAyveg+e4CE+eKJXWVjKXM4ck2QobLqTDytGJbLLhJg=
github.com/op/go-logging v0.0.0-20160315200505-970db520ece7 h1:lDH9UUVJtmYCjyT0CI4q8xvlXPxeZ0gYCVvWbmPlp88=
github.com/op/go-logging v0.0.0-20160315200505-970db520ece7/go.mod h1:HzydrMdWErDVzsI23lYNej1Htcns9BCg93Dk0bBINWk=
github.com/otiai10/copy v1.0.1 h1:gtBjD8aq4nychvRZ2CyJvFWAw0aja+VHazDdruZKGZA=
github.com/otiai10/copy v1.0.1/go.mod h1:8bMCJrAqOtN/d9oyh5HR7HhLQMvcGMpGdwRDYsfOCHc=
github.com/otiai10/curr v0.0.0-20150429015615-9b4961190c95 h1:+OLn68pqasWca0z5ryit9KGfp3sUsW4Lqg32iRMJyzs=
github.com/otiai10/curr v0.0.0-20150429015615-9b4961190c95/go.mod h1:9qAhocn7zKJG+0mI8eUu6xqkFDYS2kb2saOteoSB3cE=
github.com/otiai10/mint v1.2.3 h1:PsrRBmrxR68kyNu6YlqYHbNlItc5vOkuS6LBEsNttVA=
github.com/otiai10/mint v1.2.3/go.mod h1:YnfyPNhBvnY8bW4SGQHCs/aAFhkgySlMZbrF5U0bOVw=
github.com/pelletier/go-toml v1.2.0 h1:T5zMGML61Wp+FlcbWjRDT7yAxhJNAiPPLOFECq181zc=
github.com/pelletier/go-toml v1.2.0/go.mod h1:5z9KED0ma1S8pY6P1sdut58dfprrGBbd/94hg7ilaic=
github.com/philhofer/fwd v1.0.0 h1:UbZqGr5Y38ApvM/V/jEljVxwocdweyH+vmYvRPBnbqQ=
github.com/philhofer/fwd v1.0.0/go.mod h1:gk3iGcWd9+svBvR0sR+KPcfE+RNWozjowpeBVG3ZVNU=
github.com/pkg/errors v0.8.0/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkg/errors v0.8.1 h1:iURUrRGxPUNPdy5/HRSm+Yj6okJ6UtLINN0Q9M4+h3I=
github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/robfig/cron v0.0.0-20180505203441-b41be1df6967 h1:x7xEyJDP7Hv3LVgvWhzioQqbC/KtuUhTigKlH/8ehhE=
github.com/robfig/cron v0.0.0-20180505203441-b41be1df6967/go.mod h1:JGuDeoQd7Z6yL4zQhZ3OPEVHB7fL6Ka6skscFHfmt2k=
github.com/ryszard/goskiplist v0.0.0-20150312221310-2dfbae5fcf46 h1:GHRpF1pTW19a8tTFrMLUcfWwyC0pnifVo2ClaLq+hP8=
github.com/ryszard/goskiplist v0.0.0-20150312221310-2dfbae5fcf46/go.mod h1:uAQ5PCi+MFsC7HjREoAz1BU+Mq60+05gifQSsHSDG/8=
github.com/sirupsen/logrus v1.4.0 h1:yKenngtzGh+cUSSh6GWbxW2abRqhYUSR/t/6+2QqNvE=
github.com/sirupsen/logrus v1.4.0/go.mod h1:LxeOpSwHxABJmUn/MG1IvRgCAasNZTLOkJPxbbu5VWo=
github.com/smartystreets/assertions v0.0.0-20180927180507-b2de0cb4f26d/go.mod h1:OnSkiWE9lh6wB0YB77sQom3nweQdgAjqCqsofrRNTgc=
github.com/smartystreets/assertions v0.0.0-20190215210624-980c5ac6f3ac h1:wbW+Bybf9pXxnCFAOWZTqkRjAc7rAIwo2e1ArUhiHxg=
github.com/smartystreets/assertions v0.0.0-20190215210624-980c5ac6f3ac/go.mod h1:OnSkiWE9lh6wB0YB77sQom3nweQdgAjqCqsofrRNTgc=
github.com/smartystreets/goconvey v0.0.0-20181108003508-044398e4856c/go.mod h1:XDJAKZRPZ1CvBcN2aX5YOUTYGHki24fSF0Iv48Ibg0s=
github.com/smartystreets/goconvey v0.0.0-20190306220146-200a235640ff h1:86HlEv0yBCry9syNuylzqznKXDK11p6D0DT596yNMys=
github.com/smartystreets/goconvey v0.0.0-20190306220146-200a235640ff/go.mod h1:KSQcGKpxUMHk3nbYzs/tIBAM2iDooCn0BmttHOJEbLs=
github.com/spaolacci/murmur3 v0.0.0-20180118202830-f09979ecbc72/go.mod h1:JwIasOWyU6f++ZhiEuf87xNszmSA2myDM2Kzu9HwQUA=
github.com/spaolacci/murmur3 v1.1.0 h1:7c1g84S4BPRrfL5Xrdp6fOJ206sU9y293DDHaoy0bLI=
github.com/spaolacci/murmur3 v1.1.0/go.mod h1:JwIasOWyU6f++ZhiEuf87xNszmSA2myDM2Kzu9HwQUA=
github.com/spf13/afero v1.1.2 h1:m8/z1t7/fwjysjQRYbP0RD+bUIF/8tJwPdEZsI83ACI=
github.com/spf13/afero v1.1.2/go.mod h1:j4pytiNVoe2o6bmDsKpLACNPDBIoEAkihy7loJ1B0CQ=
github.com/spf13/cast v1.3.0 h1:oget//CVOEoFewqQxwr0Ej5yjygnqGkvggSE/gB35Q8=
github.com/spf13/cast v1.3.0/go.mod h1:Qx5cxh0v+4UWYiBimWS+eyWzqEqokIECu5etghLkUJE=
github.com/spf13/jwalterweatherman v1.0.0 h1:XHEdyB+EcvlqZamSM4ZOMGlc93t6AcsBEu9Gc1vn7yk=
github.com/spf13/jwalterweatherman v1.0.0/go.mod h1:cQK4TGJAtQXfYWX+Ddv3mKDzgVb68N+wFjFa4jdeBTo=
github.com/spf13/pflag v1.0.3 h1:zPAT6CGy6wXeQ7NtTnaTerfKOsV6V6F8agHXFiazDkg=
github.com/spf13/pflag v1.0.3/go.mod h1:DYY7MBk1bdzusC3SYhjObp+wFpr4gzcvqqNjLnInEg4=
github.com/spf13/viper v1.3.2 h1:VUFqw5KcqRf7i70GOzW7N+Q7+gxVBkSSqiXB12+JQ4M=
github.com/spf13/viper v1.3.2/go.mod h1:ZiWeW+zYFKm7srdB9IoDzzZXaJaI5eL9QjNiN/DMA2s=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/objx v0.1.1/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.2.1/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/testify v1.3.0 h1:TivCn/peBQ7UY8ooIcPgZFpTNSz0Q2U6UrFlUfqbe0Q=
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
github.com/syncthing/syncthing v0.14.48-rc.4/go.mod h1:nw3siZwHPA6M8iSfjDCWQ402eqvEIasMQOE8nFOxy7M=
github.com/tinylib/msgp v1.0.2/go.mod h1:+d+yLhGm8mzTaHzB+wgMYrodPfmZrzkirds8fDWklFE=
github.com/tinylib/msgp v1.1.0 h1:9fQd+ICuRIu/ue4vxJZu6/LzxN0HwMds2nq/0cFvxHU=
github.com/tinylib/msgp v1.1.0/go.mod h1:+d+yLhGm8mzTaHzB+wgMYrodPfmZrzkirds8fDWklFE=
github.com/ugorji/go/codec v0.0.0-20181204163529-d75b2dcb6bc8/go.mod h1:VFNgLljTbGfSG7qAOspJ7OScBnGdDN/yBr0sguwnwf0=
github.com/vmihailenco/msgpack v4.0.3+incompatible h1:g+G529Dqo4BY2Gxn5GKENa/3NVK+mu/6hM7G3jEWszQ=
github.com/vmihailenco/msgpack v4.0.3+incompatible/go.mod h1:fy3FlTQTDXWkZ7Bh6AcGMlsjHatGryHQYUTf1ShIgkk=
github.com/willf/bitset v1.1.3/go.mod h1:RjeCKbqT1RxIR/KWY6phxZiaY1IyutSBfGjNPySAYV4=
github.com/willf/bitset v1.1.9/go.mod h1:RjeCKbqT1RxIR/KWY6phxZiaY1IyutSBfGjNPySAYV4=
github.com/willf/bitset v1.1.10 h1:NotGKqX0KwQ72NUzqrjZq5ipPNDQex9lo3WpaS8L2sc=
github.com/willf/bitset v1.1.10/go.mod h1:RjeCKbqT1RxIR/KWY6phxZiaY1IyutSBfGjNPySAYV4=
github.com/willf/bloom v0.0.0-20170505221640-54e3b963ee16/go.mod h1:MmAltL9pDMNTrvUkxdg0k0q5I0suxmuwp3KbyrZLOZ8=
github.com/willf/bloom v2.0.3+incompatible h1:QDacWdqcAUI1MPOwIQZRy9kOR7yxfyEmxX8Wdm2/JPA=
github.com/willf/bloom v2.0.3+incompatible/go.mod h1:MmAltL9pDMNTrvUkxdg0k0q5I0suxmuwp3KbyrZLOZ8=
github.com/xordataexchange/crypt v0.0.3-0.20170626215501-b2862e3d0a77/go.mod h1:aYKd//L2LvnjZzWKhF00oedf4jCCReLcmhLdhm1A27Q=
go.etcd.io/bbolt v1.3.2 h1:Z/90sZLPOeCy2PwprqkFa25PdkusRzaj9P8zm/KNyvk=
go.etcd.io/bbolt v1.3.2/go.mod h1:IbVyRI1SCnLcuJnV2u8VeU0CEYM7e686BmAb1XKL+uU=
golang.org/x/crypto v0.0.0-20180904163835-0709b304e793/go.mod h1:6SG95UA2DQfeDnfUPMdvaQW0Q7yPrPDi9nlGo2tz2b4=
golang.org/x/crypto v0.0.0-20181203042331-505ab145d0a9/go.mod h1:6SG95UA2DQfeDnfUPMdvaQW0Q7yPrPDi9nlGo2tz2b4=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2 h1:VklqNMn3ovrHsnt90PveolxSbWFaJdECFbxSq0Mqo2M=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/net v0.0.0-20180218175443-cbe0f9307d01/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20180524181706-dfa909b99c79/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20180724234803-3673e40ba225/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20181114220301-adae6a3d119a/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20181220203305-927f97764cc3/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20190318221613-d196dffd7c2b h1:ZWpVMTsK0ey5WJCu+vVdfMldWq7/ezaOcjnKWIHWVkE=
golang.org/x/net v0.0.0-20190318221613-d196dffd7c2b/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
golang.org/x/sys v0.0.0-20180905080454-ebe1bf3edb33/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20181205085412-a5c9d58dba9a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190102155601-82a175fd1598/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190222072716-a9d3bda3a223/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190318195719-6c81ef8f67ca h1:o2TLx1bGN3W+Ei0EMU5fShLupLmTOU95KvJJmfYhAzM=
golang.org/x/sys v0.0.0-20190318195719-6c81ef8f67ca/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/text v0.3.0 h1:g61tztE5qeGQ89tm6NTjjM9VPIm088od1l6aSorWRWg=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/time v0.0.0-20180412165947-fbb02b2291d2/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
golang.org/x/time v0.0.0-20181108054448-85acf8d2951c/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
golang.org/x/time v0.0.0-20190308202827-9d24e82272b4 h1:SvFZT6jyqRaOeXpc5h/JSfZenJ2O330aBsf7JfSUXmQ=
golang.org/x/time v0.0.0-20190308202827-9d24e82272b4/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
google.golang.org/appengine v1.5.0 h1:KxkO13IPW4Lslp2bz+KHP2E3gtFlrIGNThxkZQ3g+4c=
google.golang.org/appengine v1.5.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v2 v2.2.2 h1:ZCJp+EgiOT7lHqUV2J862kp8Qj64Jo6az82+3Td9dZw=
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=

15
goTorrentWebUI/acorn Normal file

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/acorn/bin/acorn" "$@"
ret=$?
else
node "$basedir/node_modules/acorn/bin/acorn" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/acorn.cmd Normal file

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\acorn\bin\acorn" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\acorn\bin\acorn" %*
)

15
goTorrentWebUI/ansi-html Normal file

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/ansi-html/bin/ansi-html" "$@"
ret=$?
else
node "$basedir/node_modules/ansi-html/bin/ansi-html" "$@"
ret=$?
fi
exit $ret


@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\ansi-html\bin\ansi-html" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\ansi-html\bin\ansi-html" %*
)

15
goTorrentWebUI/atob Normal file

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/atob/bin/atob.js" "$@"
ret=$?
else
node "$basedir/node_modules/atob/bin/atob.js" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/atob.cmd Normal file

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\atob\bin\atob.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\atob\bin\atob.js" %*
)


@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/browserslist/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/browserslist/cli.js" "$@"
ret=$?
fi
exit $ret


@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\browserslist\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\browserslist\cli.js" %*
)

15
goTorrentWebUI/cssesc Normal file

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/cssesc/bin/cssesc" "$@"
ret=$?
else
node "$basedir/node_modules/cssesc/bin/cssesc" "$@"
ret=$?
fi
exit $ret


@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\cssesc\bin\cssesc" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\cssesc\bin\cssesc" %*
)

15
goTorrentWebUI/csso Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/csso/bin/csso" "$@"
ret=$?
else
node "$basedir/node_modules/csso/bin/csso" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/csso.cmd Normal file

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\csso\bin\csso" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\csso\bin\csso" %*
)

15
goTorrentWebUI/detect Normal file

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/detect-port-alt/bin/detect-port" "$@"
ret=$?
else
node "$basedir/node_modules/detect-port-alt/bin/detect-port" "$@"
ret=$?
fi
exit $ret


@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/detect-port-alt/bin/detect-port" "$@"
ret=$?
else
node "$basedir/node_modules/detect-port-alt/bin/detect-port" "$@"
ret=$?
fi
exit $ret


@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\detect-port-alt\bin\detect-port" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\detect-port-alt\bin\detect-port" %*
)


@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\detect-port-alt\bin\detect-port" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\detect-port-alt\bin\detect-port" %*
)

15
goTorrentWebUI/errno Normal file

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/errno/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/errno/cli.js" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/errno.cmd Normal file

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\errno\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\errno\cli.js" %*
)

15
goTorrentWebUI/escodegen Normal file

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/escodegen/bin/escodegen.js" "$@"
ret=$?
else
node "$basedir/node_modules/escodegen/bin/escodegen.js" "$@"
ret=$?
fi
exit $ret


@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\escodegen\bin\escodegen.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\escodegen\bin\escodegen.js" %*
)

15
goTorrentWebUI/esgenerate Normal file

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/escodegen/bin/esgenerate.js" "$@"
ret=$?
else
node "$basedir/node_modules/escodegen/bin/esgenerate.js" "$@"
ret=$?
fi
exit $ret


@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\escodegen\bin\esgenerate.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\escodegen\bin\esgenerate.js" %*
)

15
goTorrentWebUI/eslint Normal file

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/eslint/bin/eslint.js" "$@"
ret=$?
else
node "$basedir/node_modules/eslint/bin/eslint.js" "$@"
ret=$?
fi
exit $ret


@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\eslint\bin\eslint.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\eslint\bin\eslint.js" %*
)

15
goTorrentWebUI/esparse Normal file

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/esprima/bin/esparse.js" "$@"
ret=$?
else
node "$basedir/node_modules/esprima/bin/esparse.js" "$@"
ret=$?
fi
exit $ret


@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\esprima\bin\esparse.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\esprima\bin\esparse.js" %*
)

15
goTorrentWebUI/esvalidate Normal file

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/esprima/bin/esvalidate.js" "$@"
ret=$?
else
node "$basedir/node_modules/esprima/bin/esvalidate.js" "$@"
ret=$?
fi
exit $ret


@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\esprima\bin\esvalidate.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\esprima\bin\esvalidate.js" %*
)

15
goTorrentWebUI/handlebars Normal file

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/handlebars/bin/handlebars" "$@"
ret=$?
else
node "$basedir/node_modules/handlebars/bin/handlebars" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/handlebars.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\handlebars\bin\handlebars" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\handlebars\bin\handlebars" %*
)

15
goTorrentWebUI/he Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/he/bin/he" "$@"
ret=$?
else
node "$basedir/node_modules/he/bin/he" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/he.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\he\bin\he" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\he\bin\he" %*
)

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/html-minifier/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/html-minifier/cli.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\html-minifier\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\html-minifier\cli.js" %*
)

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/import-local/fixtures/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/import-local/fixtures/cli.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\import-local\fixtures\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\import-local\fixtures\cli.js" %*
)

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/internal-ip/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/internal-ip/cli.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\internal-ip\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\internal-ip\cli.js" %*
)

15
goTorrentWebUI/is-ci Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/is-ci/bin.js" "$@"
ret=$?
else
node "$basedir/node_modules/is-ci/bin.js" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/is-ci.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\is-ci\bin.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\is-ci\bin.js" %*
)

15
goTorrentWebUI/jest Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/jest/bin/jest.js" "$@"
ret=$?
else
node "$basedir/node_modules/jest/bin/jest.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/jest-runtime/bin/jest-runtime.js" "$@"
ret=$?
else
node "$basedir/node_modules/jest-runtime/bin/jest-runtime.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\jest-runtime\bin\jest-runtime.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\jest-runtime\bin\jest-runtime.js" %*
)

7
goTorrentWebUI/jest.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\jest\bin\jest.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\jest\bin\jest.js" %*
)

15
goTorrentWebUI/js-yaml Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/js-yaml/bin/js-yaml.js" "$@"
ret=$?
else
node "$basedir/node_modules/js-yaml/bin/js-yaml.js" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/js-yaml.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\js-yaml\bin\js-yaml.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\js-yaml\bin\js-yaml.js" %*
)

View File

@@ -128,7 +128,7 @@ var torrentListRequest = {
console.log("Logger data requested");
break;
case "rssListRequest":
case "rssList":
console.log("RSSListRequest recieved", evt.data);
RSSList = [];
for (var i = 0; i < serverMessage.TotalRSSFeeds; i++) {
@@ -191,7 +191,7 @@ var BackendSocket = function (_React$Component) {
case 1:
var peerListHashes = {
MessageType: "torrentPeerListRequest",
Payload: selectionHashes
Payload: {"PeerListHash": selectionHashes}
};
console.log("Peers tab information requested", peerListHashes);
ws.send(JSON.stringify(peerListHashes));
@@ -199,7 +199,7 @@ var BackendSocket = function (_React$Component) {
case 2:
var fileListHashes = {
MessageType: "torrentFileListRequest",
Payload: selectionHashes
Payload: {"FileListHash": selectionHashes[0]}
};
console.log("Files tab information requested", fileListHashes);
ws.send(JSON.stringify(fileListHashes));
@@ -256,7 +256,7 @@ var BackendSocket = function (_React$Component) {
case 1:
var peerListHashes = {
MessageType: "torrentPeerListRequest",
Payload: this.props.selectionHashes
Payload: {"PeerListHash": this.props.selectionHashes}
};
ws.send(JSON.stringify(peerListHashes));
this.props.newPeerList(peerList);
@@ -264,7 +264,7 @@ var BackendSocket = function (_React$Component) {
case 2:
var fileListHashes = {
MessageType: "torrentFileListRequest",
Payload: this.props.selectionHashes
Payload: {"FileListHash": this.props.selectionHashes[0]}
};
ws.send(JSON.stringify(fileListHashes));
this.props.newFileList(fileList);
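
For orientation, a minimal sketch of the updated WebSocket requests shown in the diff above (assuming an open connection in ws and the selected torrent hashes in selectionHashes; illustrative only, not part of the commit). The peer-list request now nests the hashes under a PeerListHash key, and the file-list request sends a single hash under FileListHash:

var peerListRequest = {
    MessageType: "torrentPeerListRequest",
    Payload: { "PeerListHash": selectionHashes }     // was: Payload: selectionHashes
};
ws.send(JSON.stringify(peerListRequest));

var fileListRequest = {
    MessageType: "torrentFileListRequest",
    Payload: { "FileListHash": selectionHashes[0] }  // was: Payload: selectionHashes
};
ws.send(JSON.stringify(fileListRequest));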

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/miller-rabin/bin/miller-rabin" "$@"
ret=$?
else
node "$basedir/node_modules/miller-rabin/bin/miller-rabin" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\miller-rabin\bin\miller-rabin" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\miller-rabin\bin\miller-rabin" %*
)

15
goTorrentWebUI/mime Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/mime/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/mime/cli.js" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/mime.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\mime\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\mime\cli.js" %*
)

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/multicast-dns/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/multicast-dns/cli.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\multicast-dns\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\multicast-dns\cli.js" %*
)

View File

@@ -1,7 +1,7 @@
/**
* Bundle of @devexpress/dx-core
* Generated: 2017-11-10
* Version: 1.0.0-beta.1
* Generated: 2018-03-02
* Version: 1.0.3
* License: https://js.devexpress.com/Licensing
*/
@@ -186,22 +186,22 @@ var PluginHost = function () {
this.plugins.filter(function (plugin) {
return plugin.container;
}).forEach(function (plugin) {
if (knownOptionals.has(plugin.pluginName)) {
throw getDependencyError(knownOptionals.get(plugin.pluginName), plugin.pluginName);
if (knownOptionals.has(plugin.name)) {
throw getDependencyError(knownOptionals.get(plugin.name), plugin.name);
}
plugin.dependencies.forEach(function (dependency) {
if (defined.has(dependency.pluginName)) return;
if (defined.has(dependency.name)) return;
if (dependency.optional) {
if (!knownOptionals.has(dependency.pluginName)) {
knownOptionals.set(dependency.pluginName, plugin.pluginName);
if (!knownOptionals.has(dependency.name)) {
knownOptionals.set(dependency.name, plugin.name);
}
return;
}
throw getDependencyError(plugin.pluginName, dependency.pluginName);
throw getDependencyError(plugin.name, dependency.name);
});
defined.add(plugin.pluginName);
defined.add(plugin.name);
});
}
}, {

File diff suppressed because one or more lines are too long

View File

@@ -1,7 +1,7 @@
/**
* Bundle of @devexpress/dx-core
* Generated: 2017-11-10
* Version: 1.0.0-beta.1
* Generated: 2018-03-02
* Version: 1.0.3
* License: https://js.devexpress.com/Licensing
*/
@@ -192,22 +192,22 @@ var PluginHost = function () {
this.plugins.filter(function (plugin) {
return plugin.container;
}).forEach(function (plugin) {
if (knownOptionals.has(plugin.pluginName)) {
throw getDependencyError(knownOptionals.get(plugin.pluginName), plugin.pluginName);
if (knownOptionals.has(plugin.name)) {
throw getDependencyError(knownOptionals.get(plugin.name), plugin.name);
}
plugin.dependencies.forEach(function (dependency) {
if (defined.has(dependency.pluginName)) return;
if (defined.has(dependency.name)) return;
if (dependency.optional) {
if (!knownOptionals.has(dependency.pluginName)) {
knownOptionals.set(dependency.pluginName, plugin.pluginName);
if (!knownOptionals.has(dependency.name)) {
knownOptionals.set(dependency.name, plugin.name);
}
return;
}
throw getDependencyError(plugin.pluginName, dependency.pluginName);
throw getDependencyError(plugin.name, dependency.name);
});
defined.add(plugin.pluginName);
defined.add(plugin.name);
});
}
}, {

File diff suppressed because one or more lines are too long
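
The change running through both dx-core bundle diffs above is a rename of the plugin metadata fields: the 1.0.3 PluginHost dependency check reads name on plugins and on their dependencies where the 1.0.0-beta.1 bundle read pluginName. A hypothetical descriptor pair, only to illustrate the shape difference (the plugin names here are made up, not taken from the bundle):

// Shape the 1.0.0-beta.1 dependency check expected:
var betaPlugin = {
    pluginName: 'ExamplePlugin',
    dependencies: [{ pluginName: 'ExampleDependency', optional: false }],
    container: true
};
// Shape the 1.0.3 dependency check above expects:
var currentPlugin = {
    name: 'ExamplePlugin',
    dependencies: [{ name: 'ExampleDependency', optional: false }],
    container: true
};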

View File

@@ -1,8 +1,8 @@
{
"_from": "@devexpress/dx-core",
"_id": "@devexpress/dx-core@1.0.0-beta.1",
"_id": "@devexpress/dx-core@1.0.3",
"_inBundle": false,
"_integrity": "sha512-4Kv5RTlmlK7o2DF5BB5r2yWgshvFrUSHWzJzdSyBtFxsQzvI3vJqS0Z0mAplZCyYfRk4xh9SRp6I9DML66v0EQ==",
"_integrity": "sha512-M1Kjju074ddAQmaFuKypM/LdhCZsDISqhGj4LST2ZGQPlGpH89BMBEV8p+8MedFQQCG/svuS25AKip1Gs9KJgA==",
"_location": "/@devexpress/dx-core",
"_phantomChildren": {},
"_requested": {
@@ -19,10 +19,10 @@
"_requiredBy": [
"#USER"
],
"_resolved": "https://registry.npmjs.org/@devexpress/dx-core/-/dx-core-1.0.0-beta.1.tgz",
"_shasum": "63383ec2bd3903d9a163c1316706cde32227d6b4",
"_resolved": "https://registry.npmjs.org/@devexpress/dx-core/-/dx-core-1.0.3.tgz",
"_shasum": "c310b540229f83d6be5797fb2a5da5491757d21b",
"_spec": "@devexpress/dx-core",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI",
"author": {
"name": "Developer Express Inc.",
"url": "https://www.devexpress.com/"
@@ -35,20 +35,20 @@
"description": "Core library for DevExtreme Reactive Components",
"devDependencies": {
"babel-core": "^6.26.0",
"babel-jest": "^21.2.0",
"babel-jest": "^22.1.0",
"babel-plugin-external-helpers": "^6.22.0",
"babel-plugin-transform-object-rest-spread": "^6.26.0",
"babel-plugin-transform-runtime": "^6.23.0",
"babel-preset-es2015": "^6.24.1",
"core-js": "^2.5.1",
"eslint": "^4.10.0",
"core-js": "^2.5.3",
"eslint": "^4.16.0",
"eslint-config-airbnb-base": "^12.1.0",
"eslint-plugin-filenames": "^1.2.0",
"eslint-plugin-import": "^2.8.0",
"eslint-plugin-jest": "^21.3.0",
"jest": "^21.2.1",
"eslint-plugin-jest": "^21.7.0",
"jest": "^22.1.4",
"rollup": "0.50.0",
"rollup-plugin-babel": "^3.0.2",
"rollup-plugin-babel": "^3.0.3",
"rollup-plugin-license": "^0.5.0"
},
"files": [
@@ -81,5 +81,5 @@
"test:coverage": "jest --coverage",
"test:watch": "jest --watch"
},
"version": "1.0.0-beta.1"
"version": "1.0.3"
}

View File

@@ -1,7 +1,7 @@
/**
* Bundle of @devexpress/dx-grid-core
* Generated: 2017-11-10
* Version: 1.0.0-beta.1
* Generated: 2018-03-02
* Version: 1.0.3
* License: https://js.devexpress.com/Licensing
*/
@@ -19,10 +19,12 @@ var rowIdGetter = function rowIdGetter(getRowId, rows) {
return getRowId;
};
var cellValueGetter = function cellValueGetter(getCellValue, columns) {
if (getCellValue) {
return getCellValue;
}
var defaultGetCellValue = function defaultGetCellValue(row, columnName) {
return row[columnName];
};
var cellValueGetter = function cellValueGetter() {
var getCellValue = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : defaultGetCellValue;
var columns = arguments[1];
var useFastAccessor = true;
var map = columns.reduce(function (acc, column) {
@@ -33,28 +35,29 @@ var cellValueGetter = function cellValueGetter(getCellValue, columns) {
return acc;
}, {});
return useFastAccessor ? function (row, columnName) {
return row[columnName];
} : function (row, columnName) {
return map[columnName] ? map[columnName](row, columnName) : row[columnName];
if (useFastAccessor) {
return getCellValue;
}
return function (row, columnName) {
return map[columnName] ? map[columnName](row, columnName) : getCellValue(row, columnName);
};
};
var setColumnSorting = function setColumnSorting(state, _ref) {
var changeColumnSorting = function changeColumnSorting(state, _ref) {
var columnName = _ref.columnName,
direction = _ref.direction,
keepOther = _ref.keepOther,
cancel = _ref.cancel,
sortIndex = _ref.sortIndex;
var sorting = state.sorting;
var nextSorting = [];
if (keepOther === true) {
nextSorting = Array.from(sorting).slice();
nextSorting = sorting.slice();
}
if (Array.isArray(keepOther)) {
nextSorting = Array.from(sorting).filter(function (columnSorting) {
nextSorting = sorting.slice().filter(function (columnSorting) {
return keepOther.indexOf(columnSorting.columnName) > -1;
});
}
@@ -72,7 +75,7 @@ var setColumnSorting = function setColumnSorting(state, _ref) {
nextSorting.splice(columnSortingIndex, 1);
}
if (!cancel) {
if (direction !== null) {
var newIndexFallback = columnSortingIndex > -1 ? columnSortingIndex : nextSorting.length;
var newIndex = sortIndex !== undefined ? sortIndex : newIndexFallback;
nextSorting.splice(newIndex, 0, newColumnSorting);
@@ -446,7 +449,7 @@ var defaultCompare = function defaultCompare(a, b) {
};
var createCompare = function createCompare(sorting, getColumnCompare, getComparableValue) {
return Array.from(sorting).reverse().reduce(function (prevCompare, columnSorting) {
return sorting.slice().reverse().reduce(function (prevCompare, columnSorting) {
var columnName = columnSorting.columnName;
var inverse = columnSorting.direction === 'desc';
@@ -495,7 +498,7 @@ var sortedRows = function sortedRows(rows, sorting, getCellValue, getColumnCompa
if (!getRowLevelKey) {
var _compare = createCompare(sorting, getColumnCompare, getCellValue);
return mergeSort(Array.from(rows), _compare);
return mergeSort(rows.slice(), _compare);
}
var compare = createCompare(sorting, getColumnCompare, function (row, columnName) {
@@ -510,14 +513,14 @@ var sortedRows = function sortedRows(rows, sorting, getCellValue, getColumnCompa
return sortHierarchicalRows(rows, compare, getRowLevelKey);
};
var setColumnFilter = function setColumnFilter(filters, _ref) {
var changeColumnFilter = function changeColumnFilter(filters, _ref) {
var columnName = _ref.columnName,
config = _ref.config;
var filterIndex = filters.findIndex(function (f) {
return f.columnName === columnName;
});
var nextState = Array.from(filters);
var nextState = filters.slice();
if (config) {
var filter = _extends({ columnName: columnName }, config);
@@ -624,56 +627,67 @@ var filteredRows = function filteredRows(rows, filters, getCellValue, getColumnP
var GROUP_KEY_SEPARATOR = '|';
var groupByColumn = function groupByColumn(state, _ref) {
var applyColumnGrouping = function applyColumnGrouping(grouping, _ref) {
var columnName = _ref.columnName,
groupIndex = _ref.groupIndex;
var grouping = Array.from(state.grouping);
var groupingIndex = grouping.findIndex(function (g) {
var nextGrouping = grouping.slice();
var groupingIndex = nextGrouping.findIndex(function (g) {
return g.columnName === columnName;
});
var targetIndex = groupIndex;
if (groupingIndex > -1) {
grouping.splice(groupingIndex, 1);
nextGrouping.splice(groupingIndex, 1);
} else if (groupIndex === undefined) {
targetIndex = grouping.length;
targetIndex = nextGrouping.length;
}
if (targetIndex > -1) {
grouping.splice(targetIndex, 0, {
nextGrouping.splice(targetIndex, 0, {
columnName: columnName
});
}
var ungroupedColumnIndex = state.grouping.findIndex(function (group, index) {
return !grouping[index] || group.columnName !== grouping[index].columnName;
return nextGrouping;
};
var changeColumnGrouping = function changeColumnGrouping(_ref2, _ref3) {
var grouping = _ref2.grouping,
expandedGroups = _ref2.expandedGroups;
var columnName = _ref3.columnName,
groupIndex = _ref3.groupIndex;
var nextGrouping = applyColumnGrouping(grouping, { columnName: columnName, groupIndex: groupIndex });
var ungroupedColumnIndex = grouping.findIndex(function (group, index) {
return !nextGrouping[index] || group.columnName !== nextGrouping[index].columnName;
});
if (ungroupedColumnIndex === -1) {
return {
grouping: grouping
grouping: nextGrouping
};
}
var filteredExpandedGroups = state.expandedGroups.filter(function (group) {
var filteredExpandedGroups = expandedGroups.filter(function (group) {
return group.split(GROUP_KEY_SEPARATOR).length <= ungroupedColumnIndex;
});
if (filteredExpandedGroups.length === state.expandedGroups.length) {
if (filteredExpandedGroups.length === expandedGroups.length) {
return {
grouping: grouping
grouping: nextGrouping
};
}
return {
grouping: grouping,
grouping: nextGrouping,
expandedGroups: filteredExpandedGroups
};
};
var toggleExpandedGroups = function toggleExpandedGroups(state, _ref2) {
var groupKey = _ref2.groupKey;
var toggleExpandedGroups = function toggleExpandedGroups(state, _ref4) {
var groupKey = _ref4.groupKey;
var expandedGroups = Array.from(state.expandedGroups);
var expandedGroups = state.expandedGroups.slice();
var groupKeyIndex = expandedGroups.indexOf(groupKey);
if (groupKeyIndex > -1) {
@@ -687,40 +701,20 @@ var toggleExpandedGroups = function toggleExpandedGroups(state, _ref2) {
};
};
var draftGroupingChange = function draftGroupingChange(state, _ref3) {
var columnName = _ref3.columnName,
groupIndex = _ref3.groupIndex;
return { groupingChange: { columnName: columnName, groupIndex: groupIndex } };
var draftColumnGrouping = function draftColumnGrouping(_ref5, _ref6) {
var grouping = _ref5.grouping,
draftGrouping = _ref5.draftGrouping;
var columnName = _ref6.columnName,
groupIndex = _ref6.groupIndex;
return {
draftGrouping: applyColumnGrouping(draftGrouping || grouping, { columnName: columnName, groupIndex: groupIndex })
};
};
var cancelGroupingChange = function cancelGroupingChange() {
return { groupingChange: null };
};
var draftGrouping = function draftGrouping(grouping, groupingChange) {
if (!groupingChange) return grouping;
var columnName = groupingChange.columnName,
groupIndex = groupingChange.groupIndex;
var result = Array.from(grouping);
if (groupIndex !== -1) {
result = result.filter(function (g) {
return g.columnName !== columnName;
});
result.splice(groupIndex, 0, {
columnName: columnName,
draft: true,
mode: grouping.length > result.length ? 'reorder' : 'add'
});
} else {
result = result.map(function (g) {
return g.columnName === columnName ? { columnName: columnName, draft: true, mode: 'remove' } : g;
});
}
return result;
var cancelColumnGroupingDraft = function cancelColumnGroupingDraft() {
return {
draftGrouping: null
};
};
var GRID_GROUP_TYPE = 'group';
@@ -735,26 +729,26 @@ var groupRowLevelKeyGetter = function groupRowLevelKeyGetter(row) {
return row[GRID_GROUP_LEVEL_KEY];
};
var defaultColumnIdentity = function defaultColumnIdentity(value) {
var defaultColumnCriteria = function defaultColumnCriteria(value) {
return {
key: String(value),
value: value
};
};
var groupedRows = function groupedRows(rows, grouping, getCellValue, getColumnIdentity) {
var groupedRows = function groupedRows(rows, grouping, getCellValue, getColumnCriteria) {
var keyPrefix = arguments.length > 4 && arguments[4] !== undefined ? arguments[4] : '';
if (!grouping.length) return rows;
var columnName = grouping[0].columnName;
var groupIdentity = getColumnIdentity && getColumnIdentity(columnName) || defaultColumnIdentity;
var groupCriteria = getColumnCriteria && getColumnCriteria(columnName) || defaultColumnCriteria;
var groups = rows.reduce(function (acc, row) {
var _groupIdentity = groupIdentity(getCellValue(row, columnName), row),
key = _groupIdentity.key,
_groupIdentity$value = _groupIdentity.value,
value = _groupIdentity$value === undefined ? key : _groupIdentity$value;
var _groupCriteria = groupCriteria(getCellValue(row, columnName), row),
key = _groupCriteria.key,
_groupCriteria$value = _groupCriteria.value,
value = _groupCriteria$value === undefined ? key : _groupCriteria$value;
var sameKeyItems = acc.get(key);
@@ -778,7 +772,7 @@ var groupedRows = function groupedRows(rows, grouping, getCellValue, getColumnId
var compoundKey = '' + keyPrefix + key;
acc.push((_acc$push = {}, defineProperty(_acc$push, GRID_GROUP_CHECK, true), defineProperty(_acc$push, GRID_GROUP_LEVEL_KEY, GRID_GROUP_TYPE + '_' + groupedBy), defineProperty(_acc$push, 'groupedBy', groupedBy), defineProperty(_acc$push, 'compoundKey', compoundKey), defineProperty(_acc$push, 'key', key), defineProperty(_acc$push, 'value', value), _acc$push));
acc.push.apply(acc, toConsumableArray(groupedRows(items, nestedGrouping, getCellValue, getColumnIdentity, '' + compoundKey + GROUP_KEY_SEPARATOR)));
acc.push.apply(acc, toConsumableArray(groupedRows(items, nestedGrouping, getCellValue, getColumnCriteria, '' + compoundKey + GROUP_KEY_SEPARATOR)));
return acc;
}, []);
};
@@ -789,6 +783,7 @@ var expandedGroupRows = function expandedGroupRows(rows, grouping, expandedGroup
var groupingColumnNames = grouping.map(function (columnGrouping) {
return columnGrouping.columnName;
});
var expandedGroupsSet = new Set(expandedGroups);
var currentGroupExpanded = true;
var currentGroupLevel = 0;
@@ -807,7 +802,7 @@ var expandedGroupRows = function expandedGroupRows(rows, grouping, expandedGroup
return acc;
}
currentGroupExpanded = expandedGroups.has(row.compoundKey);
currentGroupExpanded = expandedGroupsSet.has(row.compoundKey);
currentGroupLevel = groupLevel;
if (currentGroupExpanded) {
@@ -864,19 +859,34 @@ var customGroupingRowIdGetter = function customGroupingRowIdGetter(getRowId, row
};
};
var groupingPanelItems = function groupingPanelItems(columns, grouping) {
return grouping.map(function (_ref) {
var columnName = _ref.columnName,
draft = _ref.draft;
var column = columns.find(function (c) {
return c.name === columnName;
});
var groupingPanelItems = function groupingPanelItems(columns, grouping, draftGrouping) {
var items = draftGrouping.map(function (_ref) {
var columnName = _ref.columnName;
return {
column: column,
draft: draft
column: columns.find(function (c) {
return c.name === columnName;
}),
draft: !grouping.some(function (columnGrouping) {
return columnGrouping.columnName === columnName;
})
};
});
grouping.forEach(function (_ref2, index) {
var columnName = _ref2.columnName;
if (draftGrouping.some(function (columnGrouping) {
return columnGrouping.columnName === columnName;
})) return;
items.splice(index, 0, {
column: columns.find(function (c) {
return c.name === columnName;
}),
draft: true
});
});
return items;
};
var setCurrentPage = function setCurrentPage(prevPage, page) {
@@ -955,47 +965,21 @@ var lastRowOnPage = function lastRowOnPage(currentPage, pageSize, totalRowCount)
return result;
};
var setRowSelection = function setRowSelection(selection, _ref) {
var rowId = _ref.rowId,
selected = _ref.selected;
var selectedRows = Array.from(selection);
var selectedIndex = selectedRows.indexOf(rowId);
var isRowSelected = selected;
if (isRowSelected === undefined) {
isRowSelected = selectedIndex === -1;
}
if (selectedIndex > -1 && !isRowSelected) {
selectedRows.splice(selectedIndex, 1);
} else if (selectedIndex === -1 && isRowSelected) {
selectedRows.push(rowId);
}
return selectedRows;
};
var setRowsSelection = function setRowsSelection(selection, _ref2) {
var rowIds = _ref2.rowIds,
selected = _ref2.selected;
if (rowIds.length === 1) {
return setRowSelection(selection, { rowId: rowIds[0], selected: selected });
}
var toggleSelection = function toggleSelection(selection, _ref) {
var rowIds = _ref.rowIds,
state = _ref.state;
var rowIdsSet = new Set(rowIds);
var isRowsSelected = selected;
if (isRowsSelected === undefined) {
var rowsState = state;
if (rowsState === undefined) {
var availableSelection = selection.filter(function (rowId) {
return rowIdsSet.has(rowId);
});
isRowsSelected = availableSelection.length !== rowIdsSet.size;
rowsState = availableSelection.length !== rowIdsSet.size;
}
if (isRowsSelected) {
if (rowsState) {
var selectionSet = new Set(selection);
return [].concat(toConsumableArray(selection), toConsumableArray(rowIds.filter(function (rowId) {
return !selectionSet.has(rowId);
@@ -1007,41 +991,63 @@ var setRowsSelection = function setRowsSelection(selection, _ref2) {
});
};
var getAvailableToSelect = function getAvailableToSelect(rows, getRowId, isGroupRow) {
var rowsWithAvailableToSelect = function rowsWithAvailableToSelect(rows, getRowId, isGroupRow) {
var dataRows = rows;
if (isGroupRow) {
dataRows = dataRows.filter(function (row) {
return !isGroupRow(row);
});
}
return dataRows.map(function (row) {
return getRowId(row);
return { rows: rows, availableToSelect: dataRows.map(function (row) {
return getRowId(row);
}) };
};
var someSelected = function someSelected(_ref, selection) {
var availableToSelect = _ref.availableToSelect;
var selectionSet = new Set(selection);
return availableToSelect.length !== 0 && selectionSet.size !== 0 && availableToSelect.some(function (elem) {
return selectionSet.has(elem);
}) && availableToSelect.some(function (elem) {
return !selectionSet.has(elem);
});
};
var getAvailableSelection = function getAvailableSelection(selection, availableToSelect) {
var availableToSelectSet = new Set(availableToSelect);
return selection.filter(function (selected) {
return availableToSelectSet.has(selected);
var allSelected = function allSelected(_ref2, selection) {
var availableToSelect = _ref2.availableToSelect;
var selectionSet = new Set(selection);
return selectionSet.size !== 0 && availableToSelect.length !== 0 && !availableToSelect.some(function (elem) {
return !selectionSet.has(elem);
});
};
var startEditRows = function startEditRows(prevEditingRows, _ref) {
var unwrapSelectedRows = function unwrapSelectedRows(_ref3) {
var rows = _ref3.rows;
return rows;
};
var startEditRows = function startEditRows(prevEditingRowIds, _ref) {
var rowIds = _ref.rowIds;
return [].concat(toConsumableArray(prevEditingRows), toConsumableArray(rowIds));
return [].concat(toConsumableArray(prevEditingRowIds), toConsumableArray(rowIds));
};
var stopEditRows = function stopEditRows(prevEditingRows, _ref2) {
var stopEditRows = function stopEditRows(prevEditingRowIds, _ref2) {
var rowIds = _ref2.rowIds;
var rowIdSet = new Set(rowIds);
return prevEditingRows.filter(function (id) {
return prevEditingRowIds.filter(function (id) {
return !rowIdSet.has(id);
});
};
var addRow = function addRow(addedRows, _ref3) {
var row = _ref3.row;
var addRow = function addRow(addedRows) {
var _ref3 = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : { row: {} },
row = _ref3.row;
return [].concat(toConsumableArray(addedRows), [row]);
};
@@ -1049,7 +1055,7 @@ var changeAddedRow = function changeAddedRow(addedRows, _ref4) {
var rowId = _ref4.rowId,
change = _ref4.change;
var result = Array.from(addedRows);
var result = addedRows.slice();
result[rowId] = _extends({}, result[rowId], change);
return result;
};
@@ -1067,34 +1073,34 @@ var cancelAddedRows = function cancelAddedRows(addedRows, _ref5) {
return result;
};
var changeRow = function changeRow(prevChangedRows, _ref6) {
var changeRow = function changeRow(prevRowChanges, _ref6) {
var rowId = _ref6.rowId,
change = _ref6.change;
var prevChange = prevChangedRows[rowId] || {};
return _extends({}, prevChangedRows, defineProperty({}, rowId, _extends({}, prevChange, change)));
var prevChange = prevRowChanges[rowId] || {};
return _extends({}, prevRowChanges, defineProperty({}, rowId, _extends({}, prevChange, change)));
};
var cancelChanges = function cancelChanges(prevChangedRows, _ref7) {
var cancelChanges = function cancelChanges(prevRowChanges, _ref7) {
var rowIds = _ref7.rowIds;
var result = _extends({}, prevChangedRows);
var result = _extends({}, prevRowChanges);
rowIds.forEach(function (rowId) {
delete result[rowId];
});
return result;
};
var deleteRows = function deleteRows(deletedRows, _ref8) {
var deleteRows = function deleteRows(deletedRowIds, _ref8) {
var rowIds = _ref8.rowIds;
return [].concat(toConsumableArray(deletedRows), toConsumableArray(rowIds));
return [].concat(toConsumableArray(deletedRowIds), toConsumableArray(rowIds));
};
var cancelDeletedRows = function cancelDeletedRows(deletedRows, _ref9) {
var cancelDeletedRows = function cancelDeletedRows(deletedRowIds, _ref9) {
var rowIds = _ref9.rowIds;
var rowIdSet = new Set(rowIds);
return deletedRows.filter(function (rowId) {
return deletedRowIds.filter(function (rowId) {
return !rowIdSet.has(rowId);
});
};
@@ -1118,21 +1124,30 @@ var addedRowsByIds = function addedRowsByIds(addedRows, rowIds) {
return result;
};
var computedCreateRowChange = function computedCreateRowChange(columns) {
var map = columns.reduce(function (acc, column) {
if (column.createRowChange) {
acc[column.name] = column.createRowChange;
var defaultCreateRowChange = function defaultCreateRowChange(row, value, columnName) {
return defineProperty({}, columnName, value);
};
var createRowChangeGetter = function createRowChangeGetter() {
var createRowChange = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : defaultCreateRowChange;
var columnExtensions = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : [];
var map = columnExtensions.reduce(function (acc, columnExtension) {
if (columnExtension.createRowChange) {
acc[columnExtension.columnName] = columnExtension.createRowChange;
}
return acc;
}, {});
return function (row, columnName, value) {
return map[columnName] ? map[columnName](row, value, columnName) : defineProperty({}, columnName, value);
return function (row, value, columnName) {
if (map[columnName]) {
return map[columnName](row, value, columnName);
}
return createRowChange(row, value, columnName);
};
};
var getRowChange = function getRowChange(changedRows, rowId) {
return changedRows[rowId] || {};
var getRowChange = function getRowChange(rowChanges, rowId) {
return rowChanges[rowId] || {};
};
var TABLE_REORDERING_TYPE = 'reordering';
@@ -1143,7 +1158,7 @@ var changeColumnOrder = function changeColumnOrder(order, _ref) {
var sourceColumnIndex = order.indexOf(sourceColumnName);
var targetColumnIndex = order.indexOf(targetColumnName);
var newOrder = Array.from(order);
var newOrder = order.slice();
newOrder.splice(sourceColumnIndex, 1);
newOrder.splice(targetColumnIndex, 0, sourceColumnName);
@@ -1154,7 +1169,7 @@ var TABLE_DATA_TYPE = 'data';
var TABLE_NODATA_TYPE = 'nodata';
var orderedColumns = function orderedColumns(tableColumns, order) {
var result = Array.from(tableColumns);
var result = tableColumns.slice();
result.sort(function (a, b) {
if (a.type !== TABLE_DATA_TYPE || b.type !== TABLE_DATA_TYPE) return 0;
@@ -1194,7 +1209,11 @@ var tableColumnsWithWidths = function tableColumnsWithWidths(tableColumns, colum
return tableColumns.reduce(function (acc, tableColumn) {
if (tableColumn.type === 'data') {
var columnName = tableColumn.column.name;
var width = draftColumnWidths[columnName] || columnWidths[columnName];
var isCurrentColumn = function isCurrentColumn(elem) {
return elem.columnName === columnName;
};
var column = draftColumnWidths.find(isCurrentColumn) || columnWidths.find(isCurrentColumn);
var width = column && column.width;
if (width === undefined) {
throw new Error(UNSET_COLUMN_WIDTH_ERROR.replace('$1', columnName));
}
@@ -1208,36 +1227,43 @@ var tableColumnsWithWidths = function tableColumnsWithWidths(tableColumns, colum
var MIN_SIZE = 40;
var changeTableColumnWidths = function changeTableColumnWidths(state, _ref) {
var shifts = _ref.shifts;
var changeTableColumnWidth = function changeTableColumnWidth(state, _ref) {
var columnName = _ref.columnName,
shift = _ref.shift;
var columnWidths = state.columnWidths;
var updatedColumnWidths = Object.keys(shifts).reduce(function (acc, columnName) {
var size = Math.max(MIN_SIZE, columnWidths[columnName] + shifts[columnName]);
return Object.assign(acc, defineProperty({}, columnName, size));
}, {});
return _extends({}, state, {
columnWidths: _extends({}, columnWidths, updatedColumnWidths),
draftColumnWidths: {}
var nextColumnWidth = columnWidths.slice();
var index = nextColumnWidth.findIndex(function (elem) {
return elem.columnName === columnName;
});
var updatedColumn = nextColumnWidth[index];
var size = Math.max(MIN_SIZE, updatedColumn.width + shift);
nextColumnWidth.splice(index, 1, { columnName: columnName, width: size });
return {
columnWidths: nextColumnWidth
};
};
var changeDraftTableColumnWidths = function changeDraftTableColumnWidths(state, _ref2) {
var shifts = _ref2.shifts;
var columnWidths = state.columnWidths,
draftColumnWidths = state.draftColumnWidths;
var draftTableColumnWidth = function draftTableColumnWidth(state, _ref2) {
var columnName = _ref2.columnName,
shift = _ref2.shift;
var columnWidths = state.columnWidths;
var updatedDraftColumnWidths = Object.keys(shifts).reduce(function (acc, columnName) {
if (shifts[columnName] === null) {
delete acc[columnName];
return acc;
}
var size = Math.max(MIN_SIZE, columnWidths[columnName] + shifts[columnName]);
return Object.assign(acc, defineProperty({}, columnName, size));
}, Object.assign({}, draftColumnWidths));
return _extends({}, state, {
draftColumnWidths: updatedDraftColumnWidths
var updatedColumn = columnWidths.find(function (elem) {
return elem.columnName === columnName;
});
var size = Math.max(MIN_SIZE, updatedColumn.width + shift);
return {
draftColumnWidths: [{ columnName: updatedColumn.columnName, width: size }]
};
};
var cancelTableColumnWidthDraft = function cancelTableColumnWidthDraft() {
return {
draftColumnWidths: []
};
};
var TABLE_EDIT_COMMAND_TYPE = 'editCommand';
@@ -1268,8 +1294,8 @@ var isEditTableRow = function isEditTableRow(tableRow) {
return tableRow.type === TABLE_EDIT_TYPE;
};
var tableRowsWithEditing = function tableRowsWithEditing(tableRows, editingRows, addedRows, rowHeight) {
var rowIds = new Set(editingRows);
var tableRowsWithEditing = function tableRowsWithEditing(tableRows, editingRowIds, addedRows, rowHeight) {
var rowIds = new Set(editingRowIds);
var editedTableRows = tableRows.map(function (tableRow) {
return tableRow.type === TABLE_DATA_TYPE && rowIds.has(tableRow.rowId) ? _extends({}, tableRow, {
type: TABLE_EDIT_TYPE,
@@ -1315,37 +1341,44 @@ var isGroupTableRow = function isGroupTableRow(tableRow) {
return tableRow.type === TABLE_GROUP_TYPE;
};
var tableColumnsWithDraftGrouping = function tableColumnsWithDraftGrouping(tableColumns, draftGrouping, showColumnWhenGrouped) {
var tableColumnsWithDraftGrouping = function tableColumnsWithDraftGrouping(tableColumns, grouping, draftGrouping, showColumnWhenGrouped) {
return tableColumns.reduce(function (acc, tableColumn) {
var isDataColumn = tableColumn.type === TABLE_DATA_TYPE;
var tableColumnName = isDataColumn ? tableColumn.column.name : '';
var columnDraftGrouping = draftGrouping.find(function (grouping) {
return grouping.columnName === tableColumnName;
if (tableColumn.type !== TABLE_DATA_TYPE) {
acc.push(tableColumn);
return acc;
}
var columnName = tableColumn.column.name;
var columnGroupingExists = grouping.some(function (columnGrouping) {
return columnGrouping.columnName === columnName;
});
var columnDraftGroupingExists = draftGrouping.some(function (columnGrouping) {
return columnGrouping.columnName === columnName;
});
if (!columnDraftGrouping || showColumnWhenGrouped(tableColumnName)) {
return [].concat(toConsumableArray(acc), [tableColumn]);
} else if (columnDraftGrouping.mode === 'remove' || columnDraftGrouping.mode === 'add') {
return [].concat(toConsumableArray(acc), [_extends({}, tableColumn, {
if (!columnGroupingExists && !columnDraftGroupingExists || showColumnWhenGrouped(columnName)) {
acc.push(tableColumn);
} else if (!columnGroupingExists && columnDraftGroupingExists || columnGroupingExists && !columnDraftGroupingExists) {
acc.push(_extends({}, tableColumn, {
draft: true
})]);
}));
}
return acc;
}, []);
};
var tableColumnsWithGrouping = function tableColumnsWithGrouping(tableColumns, grouping, draftGrouping, groupIndentColumnWidth, showColumnWhenGrouped) {
var tableColumnsWithGrouping = function tableColumnsWithGrouping(columns, tableColumns, grouping, draftGrouping, indentColumnWidth, showColumnWhenGrouped) {
return [].concat(toConsumableArray(grouping.map(function (columnGrouping) {
var groupedColumn = tableColumns.find(function (tableColumn) {
return tableColumn.type === TABLE_DATA_TYPE && tableColumn.column.name === columnGrouping.columnName;
}).column;
var groupedColumn = columns.find(function (column) {
return column.name === columnGrouping.columnName;
});
return {
key: TABLE_GROUP_TYPE + '_' + groupedColumn.name,
type: TABLE_GROUP_TYPE,
column: groupedColumn,
width: groupIndentColumnWidth
width: indentColumnWidth
};
})), toConsumableArray(tableColumnsWithDraftGrouping(tableColumns, draftGrouping, showColumnWhenGrouped)));
})), toConsumableArray(tableColumnsWithDraftGrouping(tableColumns, grouping, draftGrouping, showColumnWhenGrouped)));
};
var tableRowsWithGrouping = function tableRowsWithGrouping(tableRows, isGroupRow) {
@@ -1375,8 +1408,8 @@ var tableRowsWithHeading = function tableRowsWithHeading(headerRows) {
var TABLE_DETAIL_TYPE = 'detail';
var isDetailRowExpanded = function isDetailRowExpanded(expandedRows, rowId) {
return expandedRows.indexOf(rowId) > -1;
var isDetailRowExpanded = function isDetailRowExpanded(expandedDetailRowIds, rowId) {
return expandedDetailRowIds.indexOf(rowId) > -1;
};
var isDetailToggleTableCell = function isDetailToggleTableCell(tableRow, tableColumn) {
return tableColumn.type === TABLE_DETAIL_TYPE && tableRow.type === TABLE_DATA_TYPE;
@@ -1385,26 +1418,26 @@ var isDetailTableRow = function isDetailTableRow(tableRow) {
return tableRow.type === TABLE_DETAIL_TYPE;
};
var setDetailRowExpanded = function setDetailRowExpanded(prevExpanded, _ref) {
var toggleDetailRowExpanded = function toggleDetailRowExpanded(prevExpanded, _ref) {
var rowId = _ref.rowId,
isExpanded = _ref.isExpanded;
state = _ref.state;
var expandedRows = Array.from(prevExpanded);
var expandedIndex = expandedRows.indexOf(rowId);
var isRowExpanded = isExpanded !== undefined ? isExpanded : expandedIndex === -1;
var expandedDetailRowIds = prevExpanded.slice();
var expandedIndex = expandedDetailRowIds.indexOf(rowId);
var rowState = state !== undefined ? state : expandedIndex === -1;
if (expandedIndex > -1 && !isRowExpanded) {
expandedRows.splice(expandedIndex, 1);
} else if (expandedIndex === -1 && isRowExpanded) {
expandedRows.push(rowId);
if (expandedIndex > -1 && !rowState) {
expandedDetailRowIds.splice(expandedIndex, 1);
} else if (expandedIndex === -1 && rowState) {
expandedDetailRowIds.push(rowId);
}
return expandedRows;
return expandedDetailRowIds;
};
var tableRowsWithExpandedDetail = function tableRowsWithExpandedDetail(tableRows, expandedRows, rowHeight) {
var tableRowsWithExpandedDetail = function tableRowsWithExpandedDetail(tableRows, expandedDetailRowIds, rowHeight) {
var result = tableRows;
expandedRows.forEach(function (expandedRowId) {
expandedDetailRowIds.forEach(function (expandedRowId) {
var rowIndex = result.findIndex(function (tableRow) {
return tableRow.type === TABLE_DATA_TYPE && tableRow.rowId === expandedRowId;
});
@@ -1426,8 +1459,8 @@ var tableRowsWithExpandedDetail = function tableRowsWithExpandedDetail(tableRows
return result;
};
var tableColumnsWithDetail = function tableColumnsWithDetail(tableColumns, detailToggleCellWidth) {
return [{ key: TABLE_DETAIL_TYPE, type: TABLE_DETAIL_TYPE, width: detailToggleCellWidth }].concat(toConsumableArray(tableColumns));
var tableColumnsWithDetail = function tableColumnsWithDetail(tableColumns, toggleColumnWidth) {
return [{ key: TABLE_DETAIL_TYPE, type: TABLE_DETAIL_TYPE, width: toggleColumnWidth }].concat(toConsumableArray(tableColumns));
};
var TABLE_SELECT_TYPE = 'select';
@@ -1456,12 +1489,29 @@ var isDataTableRow = function isDataTableRow(tableRow) {
return tableRow.type === TABLE_DATA_TYPE;
};
var tableColumnsWithDataRows = function tableColumnsWithDataRows(columns) {
var getColumnExtension = function getColumnExtension(columnExtensions, columnName) {
if (!columnExtensions) {
return {};
}
var columnExtension = columnExtensions.find(function (extension) {
return extension.columnName === columnName;
});
if (!columnExtension) {
return {};
}
return columnExtension;
};
var tableColumnsWithDataRows = function tableColumnsWithDataRows(columns, columnExtensions) {
return columns.map(function (column) {
var name = column.name;
var columnExtension = getColumnExtension(columnExtensions, name);
return {
key: TABLE_DATA_TYPE + '_' + column.name,
key: TABLE_DATA_TYPE + '_' + name,
type: TABLE_DATA_TYPE,
width: column.width,
width: columnExtension.width,
align: columnExtension.align,
column: column
};
});
@@ -1479,20 +1529,26 @@ var tableRowsWithDataRows = function tableRowsWithDataRows(rows, getRowId) {
});
};
var visibleTableColumns = function visibleTableColumns(tableColumns, hiddenColumns) {
var visibleTableColumns = function visibleTableColumns(tableColumns, hiddenColumnNames) {
return tableColumns.filter(function (tableColumn) {
return hiddenColumns.indexOf(tableColumn.column.name) === -1;
return tableColumn.type !== TABLE_DATA_TYPE || hiddenColumnNames.indexOf(tableColumn.column.name) === -1;
});
};
var columnChooserItems = function columnChooserItems(columns, hiddenColumns) {
var tableDataColumnsExist = function tableDataColumnsExist(tableColumns) {
return tableColumns.some(function (column) {
return column.type === TABLE_DATA_TYPE;
});
};
var columnChooserItems = function columnChooserItems(columns, hiddenColumnNames) {
return columns.map(function (column) {
return { column: column, hidden: hiddenColumns.indexOf(column.name) !== -1 };
return { column: column, hidden: hiddenColumnNames.indexOf(column.name) !== -1 };
});
};
var toggleColumn = function toggleColumn(hiddenColumns, columnName) {
return hiddenColumns.indexOf(columnName) === -1 ? [].concat(toConsumableArray(hiddenColumns), [columnName]) : hiddenColumns.filter(function (hiddenColumn) {
var toggleColumn = function toggleColumn(hiddenColumnNames, columnName) {
return hiddenColumnNames.indexOf(columnName) === -1 ? [].concat(toConsumableArray(hiddenColumnNames), [columnName]) : hiddenColumnNames.filter(function (hiddenColumn) {
return hiddenColumn !== columnName;
});
};
@@ -1652,15 +1708,36 @@ var isOnTheSameLine = function isOnTheSameLine(geometry, y) {
return y >= geometry.top && y <= geometry.bottom;
};
var getGroupCellTargetIndex = function getGroupCellTargetIndex(geometries, sourceIndex, _ref) {
var x = _ref.x,
y = _ref.y;
var rectToObject = function rectToObject(_ref) {
var top = _ref.top,
right = _ref.right,
bottom = _ref.bottom,
left = _ref.left;
return {
top: top, right: right, bottom: bottom, left: left
};
};
var collapseGapsBetweenItems = function collapseGapsBetweenItems(geometries) {
return geometries.map(function (geometry, index) {
if (index !== geometries.length - 1 && geometry.top === geometries[index + 1].top) {
return _extends({}, geometry, {
right: geometries[index + 1].left
});
}
return geometry;
});
};
var getGroupCellTargetIndex = function getGroupCellTargetIndex(geometries, sourceIndex, _ref2) {
var x = _ref2.x,
y = _ref2.y;
if (geometries.length === 0) return 0;
var targetGeometries = sourceIndex !== -1 ? getTargetColumnGeometries(geometries, sourceIndex) : geometries;
var targetGeometries = sourceIndex !== -1 ? getTargetColumnGeometries(geometries, sourceIndex) : geometries.map(rectToObject);
var targetIndex = targetGeometries.findIndex(function (geometry, index) {
var targetIndex = collapseGapsBetweenItems(targetGeometries).findIndex(function (geometry, index) {
var inVerticalBounds = isOnTheSameLine(geometry, y);
var inHorizontalBounds = x >= geometry.left && x <= geometry.right;
var shouldGoFirst = index === 0 && x < geometry.left;
@@ -1692,5 +1769,5 @@ var getMessagesFormatter = function getMessagesFormatter(messages) {
};
};
export { getTableRowColumnsWithColSpan, getTableColumnGeometries, getTableTargetColumnIndex, getAnimations, filterActiveAnimations, evalAnimations, getGroupCellTargetIndex, getMessagesFormatter, rowIdGetter, cellValueGetter, setColumnSorting, getColumnSortingDirection, sortedRows, setColumnFilter, getColumnFilterConfig, filteredRows, groupByColumn, toggleExpandedGroups, draftGroupingChange, cancelGroupingChange, draftGrouping, groupRowChecker, groupRowLevelKeyGetter, groupedRows, expandedGroupRows, customGroupedRows, customGroupingRowIdGetter, groupingPanelItems, setCurrentPage, setPageSize, paginatedRows, rowsWithPageHeaders, pageCount, rowCount, firstRowOnPage, lastRowOnPage, setRowsSelection, getAvailableToSelect, getAvailableSelection, startEditRows, stopEditRows, addRow, changeAddedRow, cancelAddedRows, changeRow, cancelChanges, deleteRows, cancelDeletedRows, changedRowsByIds, addedRowsByIds, computedCreateRowChange, getRowChange, TABLE_REORDERING_TYPE, changeColumnOrder, orderedColumns, tableHeaderRowsWithReordering, draftOrder, tableColumnsWithWidths, changeTableColumnWidths, changeDraftTableColumnWidths, TABLE_EDIT_COMMAND_TYPE, isHeadingEditCommandsTableCell, isEditCommandsTableCell, tableColumnsWithEditing, TABLE_ADDED_TYPE, TABLE_EDIT_TYPE, isEditTableCell, isAddedTableRow, isEditTableRow, tableRowsWithEditing, TABLE_FILTER_TYPE, isFilterTableCell, isFilterTableRow, tableHeaderRowsWithFilter, TABLE_GROUP_TYPE, isGroupTableCell, isGroupIndentTableCell, isGroupTableRow, tableColumnsWithGrouping, tableRowsWithGrouping, TABLE_HEADING_TYPE, isHeadingTableCell, isHeadingTableRow, tableRowsWithHeading, TABLE_DETAIL_TYPE, isDetailRowExpanded, isDetailToggleTableCell, isDetailTableRow, setDetailRowExpanded, tableRowsWithExpandedDetail, tableColumnsWithDetail, TABLE_SELECT_TYPE, isSelectTableCell, isSelectAllTableCell, tableColumnsWithSelection, TABLE_DATA_TYPE, TABLE_NODATA_TYPE, isNoDataTableRow, isDataTableCell, isHeaderStubTableCell, isDataTableRow, tableColumnsWithDataRows, tableRowsWithDataRows, visibleTableColumns, columnChooserItems, toggleColumn };
export { getColumnExtension, getTableRowColumnsWithColSpan, getTableColumnGeometries, getTableTargetColumnIndex, getAnimations, filterActiveAnimations, evalAnimations, getGroupCellTargetIndex, getMessagesFormatter, rowIdGetter, cellValueGetter, changeColumnSorting, getColumnSortingDirection, sortedRows, changeColumnFilter, getColumnFilterConfig, filteredRows, GROUP_KEY_SEPARATOR, changeColumnGrouping, toggleExpandedGroups, draftColumnGrouping, cancelColumnGroupingDraft, groupRowChecker, groupRowLevelKeyGetter, groupedRows, expandedGroupRows, customGroupedRows, customGroupingRowIdGetter, groupingPanelItems, setCurrentPage, setPageSize, paginatedRows, rowsWithPageHeaders, pageCount, rowCount, firstRowOnPage, lastRowOnPage, toggleSelection, rowsWithAvailableToSelect, someSelected, allSelected, unwrapSelectedRows, startEditRows, stopEditRows, addRow, changeAddedRow, cancelAddedRows, changeRow, cancelChanges, deleteRows, cancelDeletedRows, changedRowsByIds, addedRowsByIds, createRowChangeGetter, getRowChange, TABLE_REORDERING_TYPE, changeColumnOrder, orderedColumns, tableHeaderRowsWithReordering, draftOrder, tableColumnsWithWidths, changeTableColumnWidth, draftTableColumnWidth, cancelTableColumnWidthDraft, TABLE_EDIT_COMMAND_TYPE, isHeadingEditCommandsTableCell, isEditCommandsTableCell, tableColumnsWithEditing, TABLE_ADDED_TYPE, TABLE_EDIT_TYPE, isEditTableCell, isAddedTableRow, isEditTableRow, tableRowsWithEditing, TABLE_FILTER_TYPE, isFilterTableCell, isFilterTableRow, tableHeaderRowsWithFilter, TABLE_GROUP_TYPE, isGroupTableCell, isGroupIndentTableCell, isGroupTableRow, tableColumnsWithGrouping, tableRowsWithGrouping, TABLE_HEADING_TYPE, isHeadingTableCell, isHeadingTableRow, tableRowsWithHeading, TABLE_DETAIL_TYPE, isDetailRowExpanded, isDetailToggleTableCell, isDetailTableRow, toggleDetailRowExpanded, tableRowsWithExpandedDetail, tableColumnsWithDetail, TABLE_SELECT_TYPE, isSelectTableCell, isSelectAllTableCell, tableColumnsWithSelection, TABLE_DATA_TYPE, TABLE_NODATA_TYPE, isNoDataTableRow, isDataTableCell, isHeaderStubTableCell, isDataTableRow, tableColumnsWithDataRows, tableRowsWithDataRows, visibleTableColumns, tableDataColumnsExist, columnChooserItems, toggleColumn };
//# sourceMappingURL=dx-grid-core.es.js.map

File diff suppressed because one or more lines are too long
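
Most of the dx-grid-core diff above is API renames between 1.0.0-beta.1 and 1.0.3: setColumnSorting becomes changeColumnSorting (the cancel flag is replaced by passing direction: null), setColumnFilter becomes changeColumnFilter, groupByColumn becomes changeColumnGrouping, setRowsSelection becomes toggleSelection (its selected flag becomes state), and setDetailRowExpanded becomes toggleDetailRowExpanded. The one data-shape change is that column widths are now stored as an array of { columnName, width } objects instead of a plain map. A small sketch of that change, with a made-up column name and sizes:

import { changeTableColumnWidth } from '@devexpress/dx-grid-core';

// 1.0.0-beta.1 kept widths keyed by column name:
var oldState = { columnWidths: { name: 120 }, draftColumnWidths: {} };
// 1.0.3 keeps them as an array, which changeTableColumnWidth above expects:
var newState = { columnWidths: [{ columnName: 'name', width: 120 }], draftColumnWidths: [] };
changeTableColumnWidth(newState, { columnName: 'name', shift: 40 });
// returns { columnWidths: [{ columnName: 'name', width: 160 }] }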

View File

@@ -1,7 +1,7 @@
/**
* Bundle of @devexpress/dx-grid-core
* Generated: 2017-11-10
* Version: 1.0.0-beta.1
* Generated: 2018-03-02
* Version: 1.0.3
* License: https://js.devexpress.com/Licensing
*/
@@ -23,10 +23,12 @@ var rowIdGetter = function rowIdGetter(getRowId, rows) {
return getRowId;
};
var cellValueGetter = function cellValueGetter(getCellValue, columns) {
if (getCellValue) {
return getCellValue;
}
var defaultGetCellValue = function defaultGetCellValue(row, columnName) {
return row[columnName];
};
var cellValueGetter = function cellValueGetter() {
var getCellValue = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : defaultGetCellValue;
var columns = arguments[1];
var useFastAccessor = true;
var map = columns.reduce(function (acc, column) {
@@ -37,28 +39,29 @@ var cellValueGetter = function cellValueGetter(getCellValue, columns) {
return acc;
}, {});
return useFastAccessor ? function (row, columnName) {
return row[columnName];
} : function (row, columnName) {
return map[columnName] ? map[columnName](row, columnName) : row[columnName];
if (useFastAccessor) {
return getCellValue;
}
return function (row, columnName) {
return map[columnName] ? map[columnName](row, columnName) : getCellValue(row, columnName);
};
};
var setColumnSorting = function setColumnSorting(state, _ref) {
var changeColumnSorting = function changeColumnSorting(state, _ref) {
var columnName = _ref.columnName,
direction = _ref.direction,
keepOther = _ref.keepOther,
cancel = _ref.cancel,
sortIndex = _ref.sortIndex;
var sorting = state.sorting;
var nextSorting = [];
if (keepOther === true) {
nextSorting = Array.from(sorting).slice();
nextSorting = sorting.slice();
}
if (Array.isArray(keepOther)) {
nextSorting = Array.from(sorting).filter(function (columnSorting) {
nextSorting = sorting.slice().filter(function (columnSorting) {
return keepOther.indexOf(columnSorting.columnName) > -1;
});
}
@@ -76,7 +79,7 @@ var setColumnSorting = function setColumnSorting(state, _ref) {
nextSorting.splice(columnSortingIndex, 1);
}
if (!cancel) {
if (direction !== null) {
var newIndexFallback = columnSortingIndex > -1 ? columnSortingIndex : nextSorting.length;
var newIndex = sortIndex !== undefined ? sortIndex : newIndexFallback;
nextSorting.splice(newIndex, 0, newColumnSorting);
@@ -450,7 +453,7 @@ var defaultCompare = function defaultCompare(a, b) {
};
var createCompare = function createCompare(sorting, getColumnCompare, getComparableValue) {
return Array.from(sorting).reverse().reduce(function (prevCompare, columnSorting) {
return sorting.slice().reverse().reduce(function (prevCompare, columnSorting) {
var columnName = columnSorting.columnName;
var inverse = columnSorting.direction === 'desc';
@@ -499,7 +502,7 @@ var sortedRows = function sortedRows(rows, sorting, getCellValue, getColumnCompa
if (!getRowLevelKey) {
var _compare = createCompare(sorting, getColumnCompare, getCellValue);
return mergeSort(Array.from(rows), _compare);
return mergeSort(rows.slice(), _compare);
}
var compare = createCompare(sorting, getColumnCompare, function (row, columnName) {
@@ -514,14 +517,14 @@ var sortedRows = function sortedRows(rows, sorting, getCellValue, getColumnCompa
return sortHierarchicalRows(rows, compare, getRowLevelKey);
};
var setColumnFilter = function setColumnFilter(filters, _ref) {
var changeColumnFilter = function changeColumnFilter(filters, _ref) {
var columnName = _ref.columnName,
config = _ref.config;
var filterIndex = filters.findIndex(function (f) {
return f.columnName === columnName;
});
var nextState = Array.from(filters);
var nextState = filters.slice();
if (config) {
var filter = _extends({ columnName: columnName }, config);
@@ -628,56 +631,67 @@ var filteredRows = function filteredRows(rows, filters, getCellValue, getColumnP
var GROUP_KEY_SEPARATOR = '|';
var groupByColumn = function groupByColumn(state, _ref) {
var applyColumnGrouping = function applyColumnGrouping(grouping, _ref) {
var columnName = _ref.columnName,
groupIndex = _ref.groupIndex;
var grouping = Array.from(state.grouping);
var groupingIndex = grouping.findIndex(function (g) {
var nextGrouping = grouping.slice();
var groupingIndex = nextGrouping.findIndex(function (g) {
return g.columnName === columnName;
});
var targetIndex = groupIndex;
if (groupingIndex > -1) {
grouping.splice(groupingIndex, 1);
nextGrouping.splice(groupingIndex, 1);
} else if (groupIndex === undefined) {
targetIndex = grouping.length;
targetIndex = nextGrouping.length;
}
if (targetIndex > -1) {
grouping.splice(targetIndex, 0, {
nextGrouping.splice(targetIndex, 0, {
columnName: columnName
});
}
var ungroupedColumnIndex = state.grouping.findIndex(function (group, index) {
return !grouping[index] || group.columnName !== grouping[index].columnName;
return nextGrouping;
};
var changeColumnGrouping = function changeColumnGrouping(_ref2, _ref3) {
var grouping = _ref2.grouping,
expandedGroups = _ref2.expandedGroups;
var columnName = _ref3.columnName,
groupIndex = _ref3.groupIndex;
var nextGrouping = applyColumnGrouping(grouping, { columnName: columnName, groupIndex: groupIndex });
var ungroupedColumnIndex = grouping.findIndex(function (group, index) {
return !nextGrouping[index] || group.columnName !== nextGrouping[index].columnName;
});
if (ungroupedColumnIndex === -1) {
return {
grouping: grouping
grouping: nextGrouping
};
}
var filteredExpandedGroups = state.expandedGroups.filter(function (group) {
var filteredExpandedGroups = expandedGroups.filter(function (group) {
return group.split(GROUP_KEY_SEPARATOR).length <= ungroupedColumnIndex;
});
if (filteredExpandedGroups.length === state.expandedGroups.length) {
if (filteredExpandedGroups.length === expandedGroups.length) {
return {
grouping: grouping
grouping: nextGrouping
};
}
return {
grouping: grouping,
grouping: nextGrouping,
expandedGroups: filteredExpandedGroups
};
};
var toggleExpandedGroups = function toggleExpandedGroups(state, _ref2) {
var groupKey = _ref2.groupKey;
var toggleExpandedGroups = function toggleExpandedGroups(state, _ref4) {
var groupKey = _ref4.groupKey;
var expandedGroups = Array.from(state.expandedGroups);
var expandedGroups = state.expandedGroups.slice();
var groupKeyIndex = expandedGroups.indexOf(groupKey);
if (groupKeyIndex > -1) {
@@ -691,40 +705,20 @@ var toggleExpandedGroups = function toggleExpandedGroups(state, _ref2) {
};
};
var draftGroupingChange = function draftGroupingChange(state, _ref3) {
var columnName = _ref3.columnName,
groupIndex = _ref3.groupIndex;
return { groupingChange: { columnName: columnName, groupIndex: groupIndex } };
var draftColumnGrouping = function draftColumnGrouping(_ref5, _ref6) {
var grouping = _ref5.grouping,
draftGrouping = _ref5.draftGrouping;
var columnName = _ref6.columnName,
groupIndex = _ref6.groupIndex;
return {
draftGrouping: applyColumnGrouping(draftGrouping || grouping, { columnName: columnName, groupIndex: groupIndex })
};
};
var cancelGroupingChange = function cancelGroupingChange() {
return { groupingChange: null };
};
var draftGrouping = function draftGrouping(grouping, groupingChange) {
if (!groupingChange) return grouping;
var columnName = groupingChange.columnName,
groupIndex = groupingChange.groupIndex;
var result = Array.from(grouping);
if (groupIndex !== -1) {
result = result.filter(function (g) {
return g.columnName !== columnName;
});
result.splice(groupIndex, 0, {
columnName: columnName,
draft: true,
mode: grouping.length > result.length ? 'reorder' : 'add'
});
} else {
result = result.map(function (g) {
return g.columnName === columnName ? { columnName: columnName, draft: true, mode: 'remove' } : g;
});
}
return result;
var cancelColumnGroupingDraft = function cancelColumnGroupingDraft() {
return {
draftGrouping: null
};
};
var GRID_GROUP_TYPE = 'group';
@@ -739,26 +733,26 @@ var groupRowLevelKeyGetter = function groupRowLevelKeyGetter(row) {
return row[GRID_GROUP_LEVEL_KEY];
};
var defaultColumnIdentity = function defaultColumnIdentity(value) {
var defaultColumnCriteria = function defaultColumnCriteria(value) {
return {
key: String(value),
value: value
};
};
var groupedRows = function groupedRows(rows, grouping, getCellValue, getColumnIdentity) {
var groupedRows = function groupedRows(rows, grouping, getCellValue, getColumnCriteria) {
var keyPrefix = arguments.length > 4 && arguments[4] !== undefined ? arguments[4] : '';
if (!grouping.length) return rows;
var columnName = grouping[0].columnName;
var groupIdentity = getColumnIdentity && getColumnIdentity(columnName) || defaultColumnIdentity;
var groupCriteria = getColumnCriteria && getColumnCriteria(columnName) || defaultColumnCriteria;
var groups = rows.reduce(function (acc, row) {
var _groupIdentity = groupIdentity(getCellValue(row, columnName), row),
key = _groupIdentity.key,
_groupIdentity$value = _groupIdentity.value,
value = _groupIdentity$value === undefined ? key : _groupIdentity$value;
var _groupCriteria = groupCriteria(getCellValue(row, columnName), row),
key = _groupCriteria.key,
_groupCriteria$value = _groupCriteria.value,
value = _groupCriteria$value === undefined ? key : _groupCriteria$value;
var sameKeyItems = acc.get(key);
@@ -782,7 +776,7 @@ var groupedRows = function groupedRows(rows, grouping, getCellValue, getColumnId
var compoundKey = '' + keyPrefix + key;
acc.push((_acc$push = {}, defineProperty(_acc$push, GRID_GROUP_CHECK, true), defineProperty(_acc$push, GRID_GROUP_LEVEL_KEY, GRID_GROUP_TYPE + '_' + groupedBy), defineProperty(_acc$push, 'groupedBy', groupedBy), defineProperty(_acc$push, 'compoundKey', compoundKey), defineProperty(_acc$push, 'key', key), defineProperty(_acc$push, 'value', value), _acc$push));
acc.push.apply(acc, toConsumableArray(groupedRows(items, nestedGrouping, getCellValue, getColumnIdentity, '' + compoundKey + GROUP_KEY_SEPARATOR)));
acc.push.apply(acc, toConsumableArray(groupedRows(items, nestedGrouping, getCellValue, getColumnCriteria, '' + compoundKey + GROUP_KEY_SEPARATOR)));
return acc;
}, []);
};
@@ -793,6 +787,7 @@ var expandedGroupRows = function expandedGroupRows(rows, grouping, expandedGroup
var groupingColumnNames = grouping.map(function (columnGrouping) {
return columnGrouping.columnName;
});
var expandedGroupsSet = new Set(expandedGroups);
var currentGroupExpanded = true;
var currentGroupLevel = 0;
@@ -811,7 +806,7 @@ var expandedGroupRows = function expandedGroupRows(rows, grouping, expandedGroup
return acc;
}
currentGroupExpanded = expandedGroups.has(row.compoundKey);
currentGroupExpanded = expandedGroupsSet.has(row.compoundKey);
currentGroupLevel = groupLevel;
if (currentGroupExpanded) {
@@ -868,19 +863,34 @@ var customGroupingRowIdGetter = function customGroupingRowIdGetter(getRowId, row
};
};
var groupingPanelItems = function groupingPanelItems(columns, grouping) {
return grouping.map(function (_ref) {
var columnName = _ref.columnName,
draft = _ref.draft;
var column = columns.find(function (c) {
return c.name === columnName;
});
var groupingPanelItems = function groupingPanelItems(columns, grouping, draftGrouping) {
var items = draftGrouping.map(function (_ref) {
var columnName = _ref.columnName;
return {
column: column,
draft: draft
column: columns.find(function (c) {
return c.name === columnName;
}),
draft: !grouping.some(function (columnGrouping) {
return columnGrouping.columnName === columnName;
})
};
});
grouping.forEach(function (_ref2, index) {
var columnName = _ref2.columnName;
if (draftGrouping.some(function (columnGrouping) {
return columnGrouping.columnName === columnName;
})) return;
items.splice(index, 0, {
column: columns.find(function (c) {
return c.name === columnName;
}),
draft: true
});
});
return items;
};
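groupingPanelItems now receives the applied grouping and the draft grouping separately and computes the draft flag itself, instead of being handed per-item draft/mode fields; a column present in grouping but missing from draftGrouping is re-inserted with draft: true (the pending-removal case). A short sketch based only on the code above, with illustrative column names:
groupingPanelItems(
  [{ name: 'city' }, { name: 'car' }],               // columns
  [{ columnName: 'city' }],                           // applied grouping
  [{ columnName: 'city' }, { columnName: 'car' }]     // draft grouping
);
// -> [ { column: { name: 'city' }, draft: false },
//      { column: { name: 'car' },  draft: true  } ]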
var setCurrentPage = function setCurrentPage(prevPage, page) {
@@ -959,47 +969,21 @@ var lastRowOnPage = function lastRowOnPage(currentPage, pageSize, totalRowCount)
return result;
};
var setRowSelection = function setRowSelection(selection, _ref) {
var rowId = _ref.rowId,
selected = _ref.selected;
var selectedRows = Array.from(selection);
var selectedIndex = selectedRows.indexOf(rowId);
var isRowSelected = selected;
if (isRowSelected === undefined) {
isRowSelected = selectedIndex === -1;
}
if (selectedIndex > -1 && !isRowSelected) {
selectedRows.splice(selectedIndex, 1);
} else if (selectedIndex === -1 && isRowSelected) {
selectedRows.push(rowId);
}
return selectedRows;
};
var setRowsSelection = function setRowsSelection(selection, _ref2) {
var rowIds = _ref2.rowIds,
selected = _ref2.selected;
if (rowIds.length === 1) {
return setRowSelection(selection, { rowId: rowIds[0], selected: selected });
}
var toggleSelection = function toggleSelection(selection, _ref) {
var rowIds = _ref.rowIds,
state = _ref.state;
var rowIdsSet = new Set(rowIds);
var isRowsSelected = selected;
if (isRowsSelected === undefined) {
var rowsState = state;
if (rowsState === undefined) {
var availableSelection = selection.filter(function (rowId) {
return rowIdsSet.has(rowId);
});
isRowsSelected = availableSelection.length !== rowIdsSet.size;
rowsState = availableSelection.length !== rowIdsSet.size;
}
if (isRowsSelected) {
if (rowsState) {
var selectionSet = new Set(selection);
return [].concat(toConsumableArray(selection), toConsumableArray(rowIds.filter(function (rowId) {
return !selectionSet.has(rowId);
@@ -1011,41 +995,63 @@ var setRowsSelection = function setRowsSelection(selection, _ref2) {
});
};
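toggleSelection folds the old setRowSelection/setRowsSelection pair into one reducer: when state is omitted it selects the given rowIds if any of them are still unselected, and otherwise falls through to the deselection branch (which sits in the suppressed part of this hunk). A sketch of the visible path, with made-up numeric row ids:
toggleSelection([1, 2], { rowIds: [2, 3] });
// -> [1, 2, 3]   (row 3 was not selected yet, so the whole set becomes selected)
toggleSelection([1, 2], { rowIds: [3], state: true });
// -> [1, 2, 3]   (an explicit state skips the "already all selected?" check)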
var getAvailableToSelect = function getAvailableToSelect(rows, getRowId, isGroupRow) {
var rowsWithAvailableToSelect = function rowsWithAvailableToSelect(rows, getRowId, isGroupRow) {
var dataRows = rows;
if (isGroupRow) {
dataRows = dataRows.filter(function (row) {
return !isGroupRow(row);
});
}
return dataRows.map(function (row) {
return getRowId(row);
return { rows: rows, availableToSelect: dataRows.map(function (row) {
return getRowId(row);
}) };
};
var someSelected = function someSelected(_ref, selection) {
var availableToSelect = _ref.availableToSelect;
var selectionSet = new Set(selection);
return availableToSelect.length !== 0 && selectionSet.size !== 0 && availableToSelect.some(function (elem) {
return selectionSet.has(elem);
}) && availableToSelect.some(function (elem) {
return !selectionSet.has(elem);
});
};
var getAvailableSelection = function getAvailableSelection(selection, availableToSelect) {
var availableToSelectSet = new Set(availableToSelect);
return selection.filter(function (selected) {
return availableToSelectSet.has(selected);
var allSelected = function allSelected(_ref2, selection) {
var availableToSelect = _ref2.availableToSelect;
var selectionSet = new Set(selection);
return selectionSet.size !== 0 && availableToSelect.length !== 0 && !availableToSelect.some(function (elem) {
return !selectionSet.has(elem);
});
};
var startEditRows = function startEditRows(prevEditingRows, _ref) {
var unwrapSelectedRows = function unwrapSelectedRows(_ref3) {
var rows = _ref3.rows;
return rows;
};
var startEditRows = function startEditRows(prevEditingRowIds, _ref) {
var rowIds = _ref.rowIds;
return [].concat(toConsumableArray(prevEditingRows), toConsumableArray(rowIds));
return [].concat(toConsumableArray(prevEditingRowIds), toConsumableArray(rowIds));
};
var stopEditRows = function stopEditRows(prevEditingRows, _ref2) {
var stopEditRows = function stopEditRows(prevEditingRowIds, _ref2) {
var rowIds = _ref2.rowIds;
var rowIdSet = new Set(rowIds);
return prevEditingRows.filter(function (id) {
return prevEditingRowIds.filter(function (id) {
return !rowIdSet.has(id);
});
};
var addRow = function addRow(addedRows, _ref3) {
var row = _ref3.row;
var addRow = function addRow(addedRows) {
var _ref3 = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : { row: {} },
row = _ref3.row;
return [].concat(toConsumableArray(addedRows), [row]);
};
@@ -1053,7 +1059,7 @@ var changeAddedRow = function changeAddedRow(addedRows, _ref4) {
var rowId = _ref4.rowId,
change = _ref4.change;
var result = Array.from(addedRows);
var result = addedRows.slice();
result[rowId] = _extends({}, result[rowId], change);
return result;
};
@@ -1071,34 +1077,34 @@ var cancelAddedRows = function cancelAddedRows(addedRows, _ref5) {
return result;
};
var changeRow = function changeRow(prevChangedRows, _ref6) {
var changeRow = function changeRow(prevRowChanges, _ref6) {
var rowId = _ref6.rowId,
change = _ref6.change;
var prevChange = prevChangedRows[rowId] || {};
return _extends({}, prevChangedRows, defineProperty({}, rowId, _extends({}, prevChange, change)));
var prevChange = prevRowChanges[rowId] || {};
return _extends({}, prevRowChanges, defineProperty({}, rowId, _extends({}, prevChange, change)));
};
var cancelChanges = function cancelChanges(prevChangedRows, _ref7) {
var cancelChanges = function cancelChanges(prevRowChanges, _ref7) {
var rowIds = _ref7.rowIds;
var result = _extends({}, prevChangedRows);
var result = _extends({}, prevRowChanges);
rowIds.forEach(function (rowId) {
delete result[rowId];
});
return result;
};
var deleteRows = function deleteRows(deletedRows, _ref8) {
var deleteRows = function deleteRows(deletedRowIds, _ref8) {
var rowIds = _ref8.rowIds;
return [].concat(toConsumableArray(deletedRows), toConsumableArray(rowIds));
return [].concat(toConsumableArray(deletedRowIds), toConsumableArray(rowIds));
};
var cancelDeletedRows = function cancelDeletedRows(deletedRows, _ref9) {
var cancelDeletedRows = function cancelDeletedRows(deletedRowIds, _ref9) {
var rowIds = _ref9.rowIds;
var rowIdSet = new Set(rowIds);
return deletedRows.filter(function (rowId) {
return deletedRowIds.filter(function (rowId) {
return !rowIdSet.has(rowId);
});
};
@@ -1122,21 +1128,30 @@ var addedRowsByIds = function addedRowsByIds(addedRows, rowIds) {
return result;
};
var computedCreateRowChange = function computedCreateRowChange(columns) {
var map = columns.reduce(function (acc, column) {
if (column.createRowChange) {
acc[column.name] = column.createRowChange;
var defaultCreateRowChange = function defaultCreateRowChange(row, value, columnName) {
return defineProperty({}, columnName, value);
};
var createRowChangeGetter = function createRowChangeGetter() {
var createRowChange = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : defaultCreateRowChange;
var columnExtensions = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : [];
var map = columnExtensions.reduce(function (acc, columnExtension) {
if (columnExtension.createRowChange) {
acc[columnExtension.columnName] = columnExtension.createRowChange;
}
return acc;
}, {});
return function (row, columnName, value) {
return map[columnName] ? map[columnName](row, value, columnName) : defineProperty({}, columnName, value);
return function (row, value, columnName) {
if (map[columnName]) {
return map[columnName](row, value, columnName);
}
return createRowChange(row, value, columnName);
};
};
var getRowChange = function getRowChange(changedRows, rowId) {
return changedRows[rowId] || {};
var getRowChange = function getRowChange(rowChanges, rowId) {
return rowChanges[rowId] || {};
};
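createRowChangeGetter replaces computedCreateRowChange: per-column createRowChange handlers now come from an editing column-extensions array ({ columnName, createRowChange }) rather than from the column definitions themselves, with a default fallback that simply assigns the value to the column. A minimal sketch using invented column names ('total', 'name'); the local variable computeRowChange is not part of the library:
var computeRowChange = createRowChangeGetter(undefined, [
  { columnName: 'total', createRowChange: function (row, value) { return { total: Number(value) }; } },
]);
computeRowChange({}, '42', 'total');   // -> { total: 42 }   via the column extension
computeRowChange({}, 'abc', 'name');   // -> { name: 'abc' } via defaultCreateRowChange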
var TABLE_REORDERING_TYPE = 'reordering';
@@ -1147,7 +1162,7 @@ var changeColumnOrder = function changeColumnOrder(order, _ref) {
var sourceColumnIndex = order.indexOf(sourceColumnName);
var targetColumnIndex = order.indexOf(targetColumnName);
var newOrder = Array.from(order);
var newOrder = order.slice();
newOrder.splice(sourceColumnIndex, 1);
newOrder.splice(targetColumnIndex, 0, sourceColumnName);
@@ -1158,7 +1173,7 @@ var TABLE_DATA_TYPE = 'data';
var TABLE_NODATA_TYPE = 'nodata';
var orderedColumns = function orderedColumns(tableColumns, order) {
var result = Array.from(tableColumns);
var result = tableColumns.slice();
result.sort(function (a, b) {
if (a.type !== TABLE_DATA_TYPE || b.type !== TABLE_DATA_TYPE) return 0;
@@ -1198,7 +1213,11 @@ var tableColumnsWithWidths = function tableColumnsWithWidths(tableColumns, colum
return tableColumns.reduce(function (acc, tableColumn) {
if (tableColumn.type === 'data') {
var columnName = tableColumn.column.name;
var width = draftColumnWidths[columnName] || columnWidths[columnName];
var isCurrentColumn = function isCurrentColumn(elem) {
return elem.columnName === columnName;
};
var column = draftColumnWidths.find(isCurrentColumn) || columnWidths.find(isCurrentColumn);
var width = column && column.width;
if (width === undefined) {
throw new Error(UNSET_COLUMN_WIDTH_ERROR.replace('$1', columnName));
}
@@ -1212,36 +1231,43 @@ var tableColumnsWithWidths = function tableColumnsWithWidths(tableColumns, colum
var MIN_SIZE = 40;
var changeTableColumnWidths = function changeTableColumnWidths(state, _ref) {
var shifts = _ref.shifts;
var changeTableColumnWidth = function changeTableColumnWidth(state, _ref) {
var columnName = _ref.columnName,
shift = _ref.shift;
var columnWidths = state.columnWidths;
var updatedColumnWidths = Object.keys(shifts).reduce(function (acc, columnName) {
var size = Math.max(MIN_SIZE, columnWidths[columnName] + shifts[columnName]);
return Object.assign(acc, defineProperty({}, columnName, size));
}, {});
return _extends({}, state, {
columnWidths: _extends({}, columnWidths, updatedColumnWidths),
draftColumnWidths: {}
var nextColumnWidth = columnWidths.slice();
var index = nextColumnWidth.findIndex(function (elem) {
return elem.columnName === columnName;
});
var updatedColumn = nextColumnWidth[index];
var size = Math.max(MIN_SIZE, updatedColumn.width + shift);
nextColumnWidth.splice(index, 1, { columnName: columnName, width: size });
return {
columnWidths: nextColumnWidth
};
};
var changeDraftTableColumnWidths = function changeDraftTableColumnWidths(state, _ref2) {
var shifts = _ref2.shifts;
var columnWidths = state.columnWidths,
draftColumnWidths = state.draftColumnWidths;
var draftTableColumnWidth = function draftTableColumnWidth(state, _ref2) {
var columnName = _ref2.columnName,
shift = _ref2.shift;
var columnWidths = state.columnWidths;
var updatedDraftColumnWidths = Object.keys(shifts).reduce(function (acc, columnName) {
if (shifts[columnName] === null) {
delete acc[columnName];
return acc;
}
var size = Math.max(MIN_SIZE, columnWidths[columnName] + shifts[columnName]);
return Object.assign(acc, defineProperty({}, columnName, size));
}, Object.assign({}, draftColumnWidths));
return _extends({}, state, {
draftColumnWidths: updatedDraftColumnWidths
var updatedColumn = columnWidths.find(function (elem) {
return elem.columnName === columnName;
});
var size = Math.max(MIN_SIZE, updatedColumn.width + shift);
return {
draftColumnWidths: [{ columnName: updatedColumn.columnName, width: size }]
};
};
var cancelTableColumnWidthDraft = function cancelTableColumnWidthDraft() {
return {
draftColumnWidths: []
};
};
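The resizing reducers above also switch columnWidths from an object keyed by column name to an array of { columnName, width } entries, and they now handle one column per call (columnName/shift) instead of a shifts map. A sketch of the resulting behaviour, assuming an invented 'name' column:
changeTableColumnWidth(
  { columnWidths: [{ columnName: 'name', width: 120 }] },
  { columnName: 'name', shift: -100 }
);
// -> { columnWidths: [{ columnName: 'name', width: 40 }] }          (clamped to MIN_SIZE = 40)
draftTableColumnWidth(
  { columnWidths: [{ columnName: 'name', width: 120 }] },
  { columnName: 'name', shift: 30 }
);
// -> { draftColumnWidths: [{ columnName: 'name', width: 150 }] }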
var TABLE_EDIT_COMMAND_TYPE = 'editCommand';
@@ -1272,8 +1298,8 @@ var isEditTableRow = function isEditTableRow(tableRow) {
return tableRow.type === TABLE_EDIT_TYPE;
};
var tableRowsWithEditing = function tableRowsWithEditing(tableRows, editingRows, addedRows, rowHeight) {
var rowIds = new Set(editingRows);
var tableRowsWithEditing = function tableRowsWithEditing(tableRows, editingRowIds, addedRows, rowHeight) {
var rowIds = new Set(editingRowIds);
var editedTableRows = tableRows.map(function (tableRow) {
return tableRow.type === TABLE_DATA_TYPE && rowIds.has(tableRow.rowId) ? _extends({}, tableRow, {
type: TABLE_EDIT_TYPE,
@@ -1319,37 +1345,44 @@ var isGroupTableRow = function isGroupTableRow(tableRow) {
return tableRow.type === TABLE_GROUP_TYPE;
};
var tableColumnsWithDraftGrouping = function tableColumnsWithDraftGrouping(tableColumns, draftGrouping, showColumnWhenGrouped) {
var tableColumnsWithDraftGrouping = function tableColumnsWithDraftGrouping(tableColumns, grouping, draftGrouping, showColumnWhenGrouped) {
return tableColumns.reduce(function (acc, tableColumn) {
var isDataColumn = tableColumn.type === TABLE_DATA_TYPE;
var tableColumnName = isDataColumn ? tableColumn.column.name : '';
var columnDraftGrouping = draftGrouping.find(function (grouping) {
return grouping.columnName === tableColumnName;
if (tableColumn.type !== TABLE_DATA_TYPE) {
acc.push(tableColumn);
return acc;
}
var columnName = tableColumn.column.name;
var columnGroupingExists = grouping.some(function (columnGrouping) {
return columnGrouping.columnName === columnName;
});
var columnDraftGroupingExists = draftGrouping.some(function (columnGrouping) {
return columnGrouping.columnName === columnName;
});
if (!columnDraftGrouping || showColumnWhenGrouped(tableColumnName)) {
return [].concat(toConsumableArray(acc), [tableColumn]);
} else if (columnDraftGrouping.mode === 'remove' || columnDraftGrouping.mode === 'add') {
return [].concat(toConsumableArray(acc), [_extends({}, tableColumn, {
if (!columnGroupingExists && !columnDraftGroupingExists || showColumnWhenGrouped(columnName)) {
acc.push(tableColumn);
} else if (!columnGroupingExists && columnDraftGroupingExists || columnGroupingExists && !columnDraftGroupingExists) {
acc.push(_extends({}, tableColumn, {
draft: true
})]);
}));
}
return acc;
}, []);
};
var tableColumnsWithGrouping = function tableColumnsWithGrouping(tableColumns, grouping, draftGrouping, groupIndentColumnWidth, showColumnWhenGrouped) {
var tableColumnsWithGrouping = function tableColumnsWithGrouping(columns, tableColumns, grouping, draftGrouping, indentColumnWidth, showColumnWhenGrouped) {
return [].concat(toConsumableArray(grouping.map(function (columnGrouping) {
var groupedColumn = tableColumns.find(function (tableColumn) {
return tableColumn.type === TABLE_DATA_TYPE && tableColumn.column.name === columnGrouping.columnName;
}).column;
var groupedColumn = columns.find(function (column) {
return column.name === columnGrouping.columnName;
});
return {
key: TABLE_GROUP_TYPE + '_' + groupedColumn.name,
type: TABLE_GROUP_TYPE,
column: groupedColumn,
width: groupIndentColumnWidth
width: indentColumnWidth
};
})), toConsumableArray(tableColumnsWithDraftGrouping(tableColumns, draftGrouping, showColumnWhenGrouped)));
})), toConsumableArray(tableColumnsWithDraftGrouping(tableColumns, grouping, draftGrouping, showColumnWhenGrouped)));
};
var tableRowsWithGrouping = function tableRowsWithGrouping(tableRows, isGroupRow) {
@@ -1379,8 +1412,8 @@ var tableRowsWithHeading = function tableRowsWithHeading(headerRows) {
var TABLE_DETAIL_TYPE = 'detail';
var isDetailRowExpanded = function isDetailRowExpanded(expandedRows, rowId) {
return expandedRows.indexOf(rowId) > -1;
var isDetailRowExpanded = function isDetailRowExpanded(expandedDetailRowIds, rowId) {
return expandedDetailRowIds.indexOf(rowId) > -1;
};
var isDetailToggleTableCell = function isDetailToggleTableCell(tableRow, tableColumn) {
return tableColumn.type === TABLE_DETAIL_TYPE && tableRow.type === TABLE_DATA_TYPE;
@@ -1389,26 +1422,26 @@ var isDetailTableRow = function isDetailTableRow(tableRow) {
return tableRow.type === TABLE_DETAIL_TYPE;
};
var setDetailRowExpanded = function setDetailRowExpanded(prevExpanded, _ref) {
var toggleDetailRowExpanded = function toggleDetailRowExpanded(prevExpanded, _ref) {
var rowId = _ref.rowId,
isExpanded = _ref.isExpanded;
state = _ref.state;
var expandedRows = Array.from(prevExpanded);
var expandedIndex = expandedRows.indexOf(rowId);
var isRowExpanded = isExpanded !== undefined ? isExpanded : expandedIndex === -1;
var expandedDetailRowIds = prevExpanded.slice();
var expandedIndex = expandedDetailRowIds.indexOf(rowId);
var rowState = state !== undefined ? state : expandedIndex === -1;
if (expandedIndex > -1 && !isRowExpanded) {
expandedRows.splice(expandedIndex, 1);
} else if (expandedIndex === -1 && isRowExpanded) {
expandedRows.push(rowId);
if (expandedIndex > -1 && !rowState) {
expandedDetailRowIds.splice(expandedIndex, 1);
} else if (expandedIndex === -1 && rowState) {
expandedDetailRowIds.push(rowId);
}
return expandedRows;
return expandedDetailRowIds;
};
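toggleDetailRowExpanded (formerly setDetailRowExpanded) keeps the same toggle semantics but renames the isExpanded argument to state. A sketch with made-up row ids, following the code above:
toggleDetailRowExpanded([1, 3], { rowId: 2 });              // -> [1, 3, 2]   (collapsed row becomes expanded)
toggleDetailRowExpanded([1, 3], { rowId: 3 });              // -> [1]         (expanded row collapses)
toggleDetailRowExpanded([1], { rowId: 1, state: true });    // -> [1]         (explicit state, already expanded: no change)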
var tableRowsWithExpandedDetail = function tableRowsWithExpandedDetail(tableRows, expandedRows, rowHeight) {
var tableRowsWithExpandedDetail = function tableRowsWithExpandedDetail(tableRows, expandedDetailRowIds, rowHeight) {
var result = tableRows;
expandedRows.forEach(function (expandedRowId) {
expandedDetailRowIds.forEach(function (expandedRowId) {
var rowIndex = result.findIndex(function (tableRow) {
return tableRow.type === TABLE_DATA_TYPE && tableRow.rowId === expandedRowId;
});
@@ -1430,8 +1463,8 @@ var tableRowsWithExpandedDetail = function tableRowsWithExpandedDetail(tableRows
return result;
};
var tableColumnsWithDetail = function tableColumnsWithDetail(tableColumns, detailToggleCellWidth) {
return [{ key: TABLE_DETAIL_TYPE, type: TABLE_DETAIL_TYPE, width: detailToggleCellWidth }].concat(toConsumableArray(tableColumns));
var tableColumnsWithDetail = function tableColumnsWithDetail(tableColumns, toggleColumnWidth) {
return [{ key: TABLE_DETAIL_TYPE, type: TABLE_DETAIL_TYPE, width: toggleColumnWidth }].concat(toConsumableArray(tableColumns));
};
var TABLE_SELECT_TYPE = 'select';
@@ -1460,12 +1493,29 @@ var isDataTableRow = function isDataTableRow(tableRow) {
return tableRow.type === TABLE_DATA_TYPE;
};
var tableColumnsWithDataRows = function tableColumnsWithDataRows(columns) {
var getColumnExtension = function getColumnExtension(columnExtensions, columnName) {
if (!columnExtensions) {
return {};
}
var columnExtension = columnExtensions.find(function (extension) {
return extension.columnName === columnName;
});
if (!columnExtension) {
return {};
}
return columnExtension;
};
var tableColumnsWithDataRows = function tableColumnsWithDataRows(columns, columnExtensions) {
return columns.map(function (column) {
var name = column.name;
var columnExtension = getColumnExtension(columnExtensions, name);
return {
key: TABLE_DATA_TYPE + '_' + column.name,
key: TABLE_DATA_TYPE + '_' + name,
type: TABLE_DATA_TYPE,
width: column.width,
width: columnExtension.width,
align: columnExtension.align,
column: column
};
});
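tableColumnsWithDataRows now takes width and align from a separate column-extensions array, resolved per column by getColumnExtension, instead of reading them off the column objects. A sketch with invented columns:
tableColumnsWithDataRows(
  [{ name: 'name' }, { name: 'total' }],
  [{ columnName: 'total', width: 80, align: 'right' }]
);
// -> [ { key: 'data_name',  type: 'data', width: undefined, align: undefined, column: { name: 'name' } },
//      { key: 'data_total', type: 'data', width: 80, align: 'right', column: { name: 'total' } } ]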
@@ -1483,20 +1533,26 @@ var tableRowsWithDataRows = function tableRowsWithDataRows(rows, getRowId) {
});
};
var visibleTableColumns = function visibleTableColumns(tableColumns, hiddenColumns) {
var visibleTableColumns = function visibleTableColumns(tableColumns, hiddenColumnNames) {
return tableColumns.filter(function (tableColumn) {
return hiddenColumns.indexOf(tableColumn.column.name) === -1;
return tableColumn.type !== TABLE_DATA_TYPE || hiddenColumnNames.indexOf(tableColumn.column.name) === -1;
});
};
var columnChooserItems = function columnChooserItems(columns, hiddenColumns) {
var tableDataColumnsExist = function tableDataColumnsExist(tableColumns) {
return tableColumns.some(function (column) {
return column.type === TABLE_DATA_TYPE;
});
};
var columnChooserItems = function columnChooserItems(columns, hiddenColumnNames) {
return columns.map(function (column) {
return { column: column, hidden: hiddenColumns.indexOf(column.name) !== -1 };
return { column: column, hidden: hiddenColumnNames.indexOf(column.name) !== -1 };
});
};
var toggleColumn = function toggleColumn(hiddenColumns, columnName) {
return hiddenColumns.indexOf(columnName) === -1 ? [].concat(toConsumableArray(hiddenColumns), [columnName]) : hiddenColumns.filter(function (hiddenColumn) {
var toggleColumn = function toggleColumn(hiddenColumnNames, columnName) {
return hiddenColumnNames.indexOf(columnName) === -1 ? [].concat(toConsumableArray(hiddenColumnNames), [columnName]) : hiddenColumnNames.filter(function (hiddenColumn) {
return hiddenColumn !== columnName;
});
};
@@ -1656,15 +1712,36 @@ var isOnTheSameLine = function isOnTheSameLine(geometry, y) {
return y >= geometry.top && y <= geometry.bottom;
};
var getGroupCellTargetIndex = function getGroupCellTargetIndex(geometries, sourceIndex, _ref) {
var x = _ref.x,
y = _ref.y;
var rectToObject = function rectToObject(_ref) {
var top = _ref.top,
right = _ref.right,
bottom = _ref.bottom,
left = _ref.left;
return {
top: top, right: right, bottom: bottom, left: left
};
};
var collapseGapsBetweenItems = function collapseGapsBetweenItems(geometries) {
return geometries.map(function (geometry, index) {
if (index !== geometries.length - 1 && geometry.top === geometries[index + 1].top) {
return _extends({}, geometry, {
right: geometries[index + 1].left
});
}
return geometry;
});
};
var getGroupCellTargetIndex = function getGroupCellTargetIndex(geometries, sourceIndex, _ref2) {
var x = _ref2.x,
y = _ref2.y;
if (geometries.length === 0) return 0;
var targetGeometries = sourceIndex !== -1 ? getTargetColumnGeometries(geometries, sourceIndex) : geometries;
var targetGeometries = sourceIndex !== -1 ? getTargetColumnGeometries(geometries, sourceIndex) : geometries.map(rectToObject);
var targetIndex = targetGeometries.findIndex(function (geometry, index) {
var targetIndex = collapseGapsBetweenItems(targetGeometries).findIndex(function (geometry, index) {
var inVerticalBounds = isOnTheSameLine(geometry, y);
var inHorizontalBounds = x >= geometry.left && x <= geometry.right;
var shouldGoFirst = index === 0 && x < geometry.left;
@@ -1696,6 +1773,7 @@ var getMessagesFormatter = function getMessagesFormatter(messages) {
};
};
exports.getColumnExtension = getColumnExtension;
exports.getTableRowColumnsWithColSpan = getTableRowColumnsWithColSpan;
exports.getTableColumnGeometries = getTableColumnGeometries;
exports.getTableTargetColumnIndex = getTableTargetColumnIndex;
@@ -1706,17 +1784,17 @@ exports.getGroupCellTargetIndex = getGroupCellTargetIndex;
exports.getMessagesFormatter = getMessagesFormatter;
exports.rowIdGetter = rowIdGetter;
exports.cellValueGetter = cellValueGetter;
exports.setColumnSorting = setColumnSorting;
exports.changeColumnSorting = changeColumnSorting;
exports.getColumnSortingDirection = getColumnSortingDirection;
exports.sortedRows = sortedRows;
exports.setColumnFilter = setColumnFilter;
exports.changeColumnFilter = changeColumnFilter;
exports.getColumnFilterConfig = getColumnFilterConfig;
exports.filteredRows = filteredRows;
exports.groupByColumn = groupByColumn;
exports.GROUP_KEY_SEPARATOR = GROUP_KEY_SEPARATOR;
exports.changeColumnGrouping = changeColumnGrouping;
exports.toggleExpandedGroups = toggleExpandedGroups;
exports.draftGroupingChange = draftGroupingChange;
exports.cancelGroupingChange = cancelGroupingChange;
exports.draftGrouping = draftGrouping;
exports.draftColumnGrouping = draftColumnGrouping;
exports.cancelColumnGroupingDraft = cancelColumnGroupingDraft;
exports.groupRowChecker = groupRowChecker;
exports.groupRowLevelKeyGetter = groupRowLevelKeyGetter;
exports.groupedRows = groupedRows;
@@ -1732,9 +1810,11 @@ exports.pageCount = pageCount;
exports.rowCount = rowCount;
exports.firstRowOnPage = firstRowOnPage;
exports.lastRowOnPage = lastRowOnPage;
exports.setRowsSelection = setRowsSelection;
exports.getAvailableToSelect = getAvailableToSelect;
exports.getAvailableSelection = getAvailableSelection;
exports.toggleSelection = toggleSelection;
exports.rowsWithAvailableToSelect = rowsWithAvailableToSelect;
exports.someSelected = someSelected;
exports.allSelected = allSelected;
exports.unwrapSelectedRows = unwrapSelectedRows;
exports.startEditRows = startEditRows;
exports.stopEditRows = stopEditRows;
exports.addRow = addRow;
@@ -1746,7 +1826,7 @@ exports.deleteRows = deleteRows;
exports.cancelDeletedRows = cancelDeletedRows;
exports.changedRowsByIds = changedRowsByIds;
exports.addedRowsByIds = addedRowsByIds;
exports.computedCreateRowChange = computedCreateRowChange;
exports.createRowChangeGetter = createRowChangeGetter;
exports.getRowChange = getRowChange;
exports.TABLE_REORDERING_TYPE = TABLE_REORDERING_TYPE;
exports.changeColumnOrder = changeColumnOrder;
@@ -1754,8 +1834,9 @@ exports.orderedColumns = orderedColumns;
exports.tableHeaderRowsWithReordering = tableHeaderRowsWithReordering;
exports.draftOrder = draftOrder;
exports.tableColumnsWithWidths = tableColumnsWithWidths;
exports.changeTableColumnWidths = changeTableColumnWidths;
exports.changeDraftTableColumnWidths = changeDraftTableColumnWidths;
exports.changeTableColumnWidth = changeTableColumnWidth;
exports.draftTableColumnWidth = draftTableColumnWidth;
exports.cancelTableColumnWidthDraft = cancelTableColumnWidthDraft;
exports.TABLE_EDIT_COMMAND_TYPE = TABLE_EDIT_COMMAND_TYPE;
exports.isHeadingEditCommandsTableCell = isHeadingEditCommandsTableCell;
exports.isEditCommandsTableCell = isEditCommandsTableCell;
@@ -1784,7 +1865,7 @@ exports.TABLE_DETAIL_TYPE = TABLE_DETAIL_TYPE;
exports.isDetailRowExpanded = isDetailRowExpanded;
exports.isDetailToggleTableCell = isDetailToggleTableCell;
exports.isDetailTableRow = isDetailTableRow;
exports.setDetailRowExpanded = setDetailRowExpanded;
exports.toggleDetailRowExpanded = toggleDetailRowExpanded;
exports.tableRowsWithExpandedDetail = tableRowsWithExpandedDetail;
exports.tableColumnsWithDetail = tableColumnsWithDetail;
exports.TABLE_SELECT_TYPE = TABLE_SELECT_TYPE;
@@ -1800,6 +1881,7 @@ exports.isDataTableRow = isDataTableRow;
exports.tableColumnsWithDataRows = tableColumnsWithDataRows;
exports.tableRowsWithDataRows = tableRowsWithDataRows;
exports.visibleTableColumns = visibleTableColumns;
exports.tableDataColumnsExist = tableDataColumnsExist;
exports.columnChooserItems = columnChooserItems;
exports.toggleColumn = toggleColumn;

File diff suppressed because one or more lines are too long

View File

@@ -1,8 +1,8 @@
{
"_from": "@devexpress/dx-grid-core",
"_id": "@devexpress/dx-grid-core@1.0.0-beta.1",
"_id": "@devexpress/dx-grid-core@1.0.3",
"_inBundle": false,
"_integrity": "sha512-3hKM7JUKKHJGJ8C/B20SDfCbkxr7R6ADhKb/IfkWrepJQ78uPDce9wWxwkjl8EqSd8r1jKMWbg5dgXMU6zQwWw==",
"_integrity": "sha512-k+mzGd1Gjqbq92BwZdr+UMQcTFfezk2usEaSRqBO30b6+THYYAIx5kFzCbkcv1H37CtFNju29t52ZTNDJZixVQ==",
"_location": "/@devexpress/dx-grid-core",
"_phantomChildren": {},
"_requested": {
@@ -17,12 +17,13 @@
"fetchSpec": "latest"
},
"_requiredBy": [
"#USER"
"#USER",
"/@devexpress/dx-react-grid"
],
"_resolved": "https://registry.npmjs.org/@devexpress/dx-grid-core/-/dx-grid-core-1.0.0-beta.1.tgz",
"_shasum": "48f76255c7192e7727f2c9b97efb2bf70774471d",
"_resolved": "https://registry.npmjs.org/@devexpress/dx-grid-core/-/dx-grid-core-1.0.3.tgz",
"_shasum": "e6b2708593c10c6dfab2cbc4c2c3f82b5ab910c2",
"_spec": "@devexpress/dx-grid-core",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI",
"author": {
"name": "Developer Express Inc.",
"url": "https://www.devexpress.com/"
@@ -34,24 +35,23 @@
"deprecated": false,
"description": "Core library for the DevExtreme Reactive Grid component",
"devDependencies": {
"@devexpress/dx-core": "1.0.0-beta.1",
"@devexpress/dx-core": "1.0.3",
"babel-core": "^6.26.0",
"babel-jest": "^21.2.0",
"babel-jest": "^22.1.0",
"babel-plugin-external-helpers": "^6.22.0",
"babel-plugin-transform-object-rest-spread": "^6.26.0",
"babel-plugin-transform-runtime": "^6.23.0",
"babel-preset-es2015": "^6.24.1",
"core-js": "^2.5.1",
"eslint": "^4.10.0",
"core-js": "^2.5.3",
"eslint": "^4.16.0",
"eslint-config-airbnb-base": "^12.1.0",
"eslint-plugin-filenames": "^1.2.0",
"eslint-plugin-import": "^2.8.0",
"eslint-plugin-jest": "^21.3.0",
"jest": "^21.2.1",
"eslint-plugin-jest": "^21.7.0",
"jest": "^22.1.4",
"rollup": "0.50.0",
"rollup-plugin-babel": "^3.0.2",
"rollup-plugin-license": "^0.5.0",
"seamless-immutable": "^7.1.2"
"rollup-plugin-babel": "^3.0.3",
"rollup-plugin-license": "^0.5.0"
},
"files": [
"dist"
@@ -68,7 +68,7 @@
"module": "dist/dx-grid-core.es.js",
"name": "@devexpress/dx-grid-core",
"peerDependencies": {
"@devexpress/dx-core": "1.0.0-beta.1"
"@devexpress/dx-core": "1.0.3"
},
"publishConfig": {
"access": "public"
@@ -86,5 +86,5 @@
"test:coverage": "jest --coverage",
"test:watch": "jest --watch"
},
"version": "1.0.0-beta.1"
"version": "1.0.3"
}

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large

File diff suppressed because one or more lines are too long

View File

@@ -1,7 +1,7 @@
/**
* Bundle of @devexpress/dx-core
* Generated: 2017-11-10
* Version: 1.0.0-beta.1
* Generated: 2018-03-02
* Version: 1.0.3
* License: https://js.devexpress.com/Licensing
*/
@@ -186,22 +186,22 @@ var PluginHost = function () {
this.plugins.filter(function (plugin) {
return plugin.container;
}).forEach(function (plugin) {
if (knownOptionals.has(plugin.pluginName)) {
throw getDependencyError(knownOptionals.get(plugin.pluginName), plugin.pluginName);
if (knownOptionals.has(plugin.name)) {
throw getDependencyError(knownOptionals.get(plugin.name), plugin.name);
}
plugin.dependencies.forEach(function (dependency) {
if (defined.has(dependency.pluginName)) return;
if (defined.has(dependency.name)) return;
if (dependency.optional) {
if (!knownOptionals.has(dependency.pluginName)) {
knownOptionals.set(dependency.pluginName, plugin.pluginName);
if (!knownOptionals.has(dependency.name)) {
knownOptionals.set(dependency.name, plugin.name);
}
return;
}
throw getDependencyError(plugin.pluginName, dependency.pluginName);
throw getDependencyError(plugin.name, dependency.name);
});
defined.add(plugin.pluginName);
defined.add(plugin.name);
});
}
}, {

File diff suppressed because one or more lines are too long

View File

@@ -1,7 +1,7 @@
/**
* Bundle of @devexpress/dx-core
* Generated: 2017-11-10
* Version: 1.0.0-beta.1
* Generated: 2018-03-02
* Version: 1.0.3
* License: https://js.devexpress.com/Licensing
*/
@@ -192,22 +192,22 @@ var PluginHost = function () {
this.plugins.filter(function (plugin) {
return plugin.container;
}).forEach(function (plugin) {
if (knownOptionals.has(plugin.pluginName)) {
throw getDependencyError(knownOptionals.get(plugin.pluginName), plugin.pluginName);
if (knownOptionals.has(plugin.name)) {
throw getDependencyError(knownOptionals.get(plugin.name), plugin.name);
}
plugin.dependencies.forEach(function (dependency) {
if (defined.has(dependency.pluginName)) return;
if (defined.has(dependency.name)) return;
if (dependency.optional) {
if (!knownOptionals.has(dependency.pluginName)) {
knownOptionals.set(dependency.pluginName, plugin.pluginName);
if (!knownOptionals.has(dependency.name)) {
knownOptionals.set(dependency.name, plugin.name);
}
return;
}
throw getDependencyError(plugin.pluginName, dependency.pluginName);
throw getDependencyError(plugin.name, dependency.name);
});
defined.add(plugin.pluginName);
defined.add(plugin.name);
});
}
}, {

File diff suppressed because one or more lines are too long

View File

@@ -1,28 +1,28 @@
{
"_from": "@devexpress/dx-core@1.0.0-beta.1",
"_id": "@devexpress/dx-core@1.0.0-beta.1",
"_from": "@devexpress/dx-core@1.0.3",
"_id": "@devexpress/dx-core@1.0.3",
"_inBundle": false,
"_integrity": "sha512-4Kv5RTlmlK7o2DF5BB5r2yWgshvFrUSHWzJzdSyBtFxsQzvI3vJqS0Z0mAplZCyYfRk4xh9SRp6I9DML66v0EQ==",
"_integrity": "sha512-M1Kjju074ddAQmaFuKypM/LdhCZsDISqhGj4LST2ZGQPlGpH89BMBEV8p+8MedFQQCG/svuS25AKip1Gs9KJgA==",
"_location": "/@devexpress/dx-react-core/@devexpress/dx-core",
"_phantomChildren": {},
"_requested": {
"type": "version",
"registry": true,
"raw": "@devexpress/dx-core@1.0.0-beta.1",
"raw": "@devexpress/dx-core@1.0.3",
"name": "@devexpress/dx-core",
"escapedName": "@devexpress%2fdx-core",
"scope": "@devexpress",
"rawSpec": "1.0.0-beta.1",
"rawSpec": "1.0.3",
"saveSpec": null,
"fetchSpec": "1.0.0-beta.1"
"fetchSpec": "1.0.3"
},
"_requiredBy": [
"/@devexpress/dx-react-core"
],
"_resolved": "https://registry.npmjs.org/@devexpress/dx-core/-/dx-core-1.0.0-beta.1.tgz",
"_shasum": "63383ec2bd3903d9a163c1316706cde32227d6b4",
"_spec": "@devexpress/dx-core@1.0.0-beta.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core",
"_resolved": "https://registry.npmjs.org/@devexpress/dx-core/-/dx-core-1.0.3.tgz",
"_shasum": "c310b540229f83d6be5797fb2a5da5491757d21b",
"_spec": "@devexpress/dx-core@1.0.3",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core",
"author": {
"name": "Developer Express Inc.",
"url": "https://www.devexpress.com/"
@@ -35,20 +35,20 @@
"description": "Core library for DevExtreme Reactive Components",
"devDependencies": {
"babel-core": "^6.26.0",
"babel-jest": "^21.2.0",
"babel-jest": "^22.1.0",
"babel-plugin-external-helpers": "^6.22.0",
"babel-plugin-transform-object-rest-spread": "^6.26.0",
"babel-plugin-transform-runtime": "^6.23.0",
"babel-preset-es2015": "^6.24.1",
"core-js": "^2.5.1",
"eslint": "^4.10.0",
"core-js": "^2.5.3",
"eslint": "^4.16.0",
"eslint-config-airbnb-base": "^12.1.0",
"eslint-plugin-filenames": "^1.2.0",
"eslint-plugin-import": "^2.8.0",
"eslint-plugin-jest": "^21.3.0",
"jest": "^21.2.1",
"eslint-plugin-jest": "^21.7.0",
"jest": "^22.1.4",
"rollup": "0.50.0",
"rollup-plugin-babel": "^3.0.2",
"rollup-plugin-babel": "^3.0.3",
"rollup-plugin-license": "^0.5.0"
},
"files": [
@@ -81,5 +81,5 @@
"test:coverage": "jest --coverage",
"test:watch": "jest --watch"
},
"version": "1.0.0-beta.1"
"version": "1.0.3"
}

View File

@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/asap/-/asap-2.0.6.tgz",
"_shasum": "e50347611d7e690943208bbdafebcbc2fb866d46",
"_spec": "asap@~2.0.3",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\promise",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\promise",
"browser": {
"./asap": "./browser-asap.js",
"./asap.js": "./browser-asap.js",

View File

@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/core-js/-/core-js-1.2.7.tgz",
"_shasum": "652294c14651db28fa93bd2d5ff2983a4f08c636",
"_spec": "core-js@^1.0.0",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\fbjs",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\fbjs",
"bugs": {
"url": "https://github.com/zloirock/core-js/issues"
},

View File

@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/encoding/-/encoding-0.1.12.tgz",
"_shasum": "538b66f3ee62cd1ab51ec323829d1f9480c74beb",
"_spec": "encoding@^0.1.11",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\node-fetch",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\node-fetch",
"author": {
"name": "Andris Reinman"
},

View File

@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/fbjs/-/fbjs-0.8.16.tgz",
"_shasum": "5e67432f550dc41b572bf55847b8aca64e5337db",
"_spec": "fbjs@^0.8.16",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\prop-types",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\prop-types",
"browserify": {
"transform": [
"loose-envify"

View File

@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.4.19.tgz",
"_shasum": "f7468f60135f5e5dad3399c0a81be9a1603a082b",
"_spec": "iconv-lite@~0.4.13",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\encoding",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\encoding",
"author": {
"name": "Alexander Shtuchkin",
"email": "ashtuchkin@gmail.com"

View File

@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/is-stream/-/is-stream-1.1.0.tgz",
"_shasum": "12d4a3dd4e68e0b79ceb8dbc84173ae80d91ca44",
"_spec": "is-stream@^1.0.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\node-fetch",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\node-fetch",
"author": {
"name": "Sindre Sorhus",
"email": "sindresorhus@gmail.com",

View File

@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/isomorphic-fetch/-/isomorphic-fetch-2.2.1.tgz",
"_shasum": "611ae1acf14f5e81f729507472819fe9733558a9",
"_spec": "isomorphic-fetch@^2.1.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\fbjs",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\fbjs",
"author": {
"name": "Matt Andrews",
"email": "matt@mattandre.ws"

View File

@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-3.0.2.tgz",
"_shasum": "9866df395102130e38f7f996bceb65443209c25b",
"_spec": "js-tokens@^3.0.0",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\loose-envify",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\loose-envify",
"author": {
"name": "Simon Lydell"
},

View File

@@ -22,7 +22,7 @@
"_resolved": "https://registry.npmjs.org/loose-envify/-/loose-envify-1.3.1.tgz",
"_shasum": "d1a8ad33fa9ce0e713d65fdd0ac8b748d478c848",
"_spec": "loose-envify@^1.3.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\prop-types",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\prop-types",
"author": {
"name": "Andres Suarez",
"email": "zertosh@gmail.com"

View File

@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-1.7.3.tgz",
"_shasum": "980f6f72d85211a5347c6b2bc18c5b84c3eb47ef",
"_spec": "node-fetch@^1.0.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\isomorphic-fetch",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\isomorphic-fetch",
"author": {
"name": "David Frank"
},

View File

@@ -22,7 +22,7 @@
"_resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz",
"_shasum": "2109adc7965887cfc05cbbd442cac8bfbb360863",
"_spec": "object-assign@^4.1.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\prop-types",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\prop-types",
"author": {
"name": "Sindre Sorhus",
"email": "sindresorhus@gmail.com",

View File

@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/promise/-/promise-7.3.1.tgz",
"_shasum": "064b72602b18f90f29192b8b1bc418ffd1ebd3bf",
"_spec": "promise@^7.1.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\fbjs",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\fbjs",
"author": {
"name": "ForbesLindesay"
},

Some files were not shown because too many files have changed in this diff