33 Commits

Author SHA1 Message Date
aee3516682 fix folder permissions in donetorrentactions 2018-04-02 21:21:41 -04:00
a7881a14c7 fixing path issue with starting torrent 2018-03-30 20:05:18 -04:00
128ec774bd changing how the start API command works to start torrents 2018-03-27 15:39:02 -04:00
bc612bf5e4 removing symlink option only copy for now 2018-03-26 21:06:14 -04:00
3f1f9e7104 separate thread for torrent list 2018-03-25 23:07:22 -04:00
eeb6e102f1 fixing log not writing to file 2018-03-25 21:23:15 -04:00
0a0f0cd577 fixing notification issue, parallelizing startTorrent, verifying torrent after move 2018-03-25 09:34:32 -04:00
10399cc6e5 updating readme with new documentation link 2018-03-23 15:17:35 -04:00
9363649df0 Merge branch 'master' of https://github.com/deranjer/goTorrent 2018-03-22 22:48:36 -04:00
a804b401a7 Update README.md (Updating roadmap..) 2018-03-22 22:46:47 -04:00
3b2c392bdf Changed to force manual IP address entry 2018-03-20 21:54:38 -04:00
fa46ba6025 rewriting how file prio works, adding token generation to backend, minor fixes 2018-03-19 21:22:57 -04:00
a56a507ca2 some modal changes, adding memory leak to fix stop/drop issue 2018-03-05 22:48:16 -05:00
ca1ed925d3 a few js changes for react upgrades 2018-03-04 21:23:57 -05:00
34e5f5139a Completely updated React, fixed #11, (hopefully) 2018-03-04 19:11:49 -05:00
6e0afd6e2a Fixing some API issues, adding a few API responses 2018-03-01 15:31:11 -05:00
fb71ca9b4e Fixing some API calls to accept optional payload 2018-02-24 12:25:09 -05:00
4015a48454 Getting ready to release 0.3.0, changing to new documentation system 2018-02-20 22:11:11 -05:00
840a965877 Added Settings Webui (view only), rewrite of API, Fixes #14, Fixes #2, now Testing 2018-02-20 21:51:49 -05:00
d4966f597b Fixes #15, started seperating Settings into their own package 2018-02-17 11:52:38 -05:00
ba0f076c66 cleaning up an issue with client config generation 2018-02-16 20:41:09 -05:00
3978be8a40 Adding ReverseProxy settings File 2018-02-15 22:55:47 -05:00
c5b86597cb File prio code added, API rewrite completed, some core features rewritten for clarity 2018-02-15 22:49:11 -05:00
b843cfc11b adding frontend authentication, starting file priority code 2018-02-10 09:53:02 -05:00
42f4ecc81b Reverse Proxy with SSL support, Generated client Configs, JWT client to server auth, closes #13 2018-02-07 21:42:35 -05:00
d6288f4aaa Reverse Proxy with SSL support, Generated client Configs, JWT client to server auth, closes #13 2018-02-07 21:41:00 -05:00
0abe1620c6 closes #9, closes #8, closes #3, closes #4, added new notification features, search torrents, change directory, force seed torrent, updated Readme 2018-02-03 14:22:21 -05:00
3ab66456a1 cleaning up old files 2018-01-31 22:29:51 -05:00
8db9a43b0f testing rate limiting, making API changes 2018-01-31 22:28:45 -05:00
6af49b317d Adding logic to change torrent storage path 2018-01-25 23:08:10 -05:00
f58ca5bb09 Finished Frontend notifications, added file prio (needs test), started Settings Button work 2018-01-23 23:22:25 -05:00
52e245d11f Finished Frontend notifications, added file prio (needs test), started Settings Button work 2018-01-23 23:21:25 -05:00
5856052f82 Started adding frontend notifications, fixing firefox file upload bug 2018-01-22 19:03:06 -05:00
13381 changed files with 317421 additions and 455090 deletions

.gitignore vendored

@@ -1,8 +1,10 @@
downloads/
downloading/
downloaded/
uploadedTorrents/
storage.db.lock
storage.db
storage.db.old
.torrent.bolt.db.lock
.torrent.bolt.db
.idea/torrent-project.iml
@@ -16,5 +18,9 @@ boltbrowser.win64.exe
logs/server.log
.goreleaser.yml
config.toml.backup
config.1.toml
config.toml.old
/public/static/js/kickwebsocket.js.backup
/public/static/js/kickwebsocket-generated.js
clientAuth.txt
dist

.vscode/settings.json vendored

@@ -1,3 +1,12 @@
{
"git.ignoreLimitWarning": true
"git.ignoreLimitWarning": true,
"cSpell.words": [
"anacrolix",
"asdine",
"btih",
"gofeed",
"logrus",
"mmcdole",
"otiai"
]
}

.vscode/tasks.json vendored

@@ -12,9 +12,9 @@
]
},
{
"taskName": "Build GopherJS",
"taskName": "goReleaser Snapshot",
"type": "shell",
"command": "C:/Users/deranjer/go/bin/gopherjs.exe build C:/Users/deranjer/GoglandProjects/torrent-project/public/static/js/frontend-websocket.go",
"command": "C:/Users/deranjer/go/bin/goreleaser.exe -rm-dist -snapshot",
"problemMatcher": [
"$go"
]

README.md

@@ -26,136 +26,46 @@ Image of the frontend UI
- Automatic stop after seeding ratio reached
- Pushbullet notification on torrent complete
- Automatic move of completed torrent to new directory (leave symlink behind for seeding)
- Doesn't work on Windows yet, have to copy file for now
- Symlinks don't work on Windows yet, have to copy file for now
## Roadmap
- Early-Mid 2018
- [X] Ability to modify storage path of torrent after it has been added
- [X] Backend to frontend notification messages
- [X] Global Rate Limiting for Upload/Download Speed
- [X] Add torrents from watch folder (cron job every 5 minutes)
- [X] Authentication from client to server (done via JWT, will add functionality for 3rd party clients later)
- [X] Reverse Proxy Support with SSL upgrade added (with provided config for nginx)
- [X] Mostly generated client config from toml.config on first run
- [X] Ability to view TOML settings from WebUI (and perhaps change a few as well)
- [X] Ability to set priority for individual files (needs more testing!)
- [ ] Unit testing completed for a large portion of the package
- [ ] Stability/bug fixing/Optimization rewrite of some of the core structures of the WebUI and base server
- [ ] Put the "Move torrent after download" into own goroutine with checks so the WebUI doesn't freeze when moving torrent
- [ ] Ability to set priority for individual files (just added to anacrolix/torrent so coming soon, already added to my UI)
- [ ] Ability to view TOML settings from WebUI (and perhaps change a few as well)
- [ ] Ability to modify storage path of torrent after it has been added
- Late 2018
- [ ] Define the websocket API for users to write their own clients/extensions
- [X] Define the websocket API for users to write their own clients/extensions
- [ ] React-native Android app (I don't own any Mac products so there will be no iPhone version)
# Installation:
# Documentation
## Linux (tested on Debian)
You can watch a YouTube video of me setting it up:
<a href="http://www.youtube.com/watch?feature=player_embedded&v=G0gO_cm_Oks
" target="_blank"><img src="http://img.youtube.com/vi/G0gO_cm_Oks/0.jpg"
alt="goTorrent Alpha Setup Video" width="240" height="180" border="10" /></a>
### Configuring the backend
Download the latest release from the releases tab; it will be in tar.gz format.
Create a directory where goTorrent will run from
sudo mkdir /opt/goTorrent
Put the tar.gz release into the folder, and extract it.
tar -zxvf goTorrent_release_64-git.tar.gz
You can then remove the tar.gz if you wish. You should have something similar to the following files:
drwxr-xr-x 5 root root 9 Jan 21 14:56 .
drwxr-xr-x 5 root root 5 Jan 21 14:54 ..
-rw-rw-rw- 1 root root 1086 Dec 1 01:42 LICENSE
-rw-rw-rw- 1 root root 69 Dec 1 01:01 README.md
-rw-rw-rw- 1 root root 4466 Jan 21 03:48 config.toml
drwxr-xr-x 3 root root 3 Jan 21 14:55 dist-specific-files
-rw-rw-rw- 1 root root 12503552 Jan 21 03:53 goTorrent
drwxr-xr-x 3 root root 3 Jan 21 14:55 public
drwxr-xr-x 2 root root 3 Jan 21 14:55 templates
The `config.toml` file contains all of the settings for the server part of the application. Most of the important settings are at the top of the file, so open it with your preferred text editor.
[serverConfig]
ServerPort = ":8000" #leave format as is it expects a string with colon
ServerAddr = "" #blank will bind to default IP address, usually fine to leave be
LogLevel = "Warn" # Options = Debug, Info, Warn, Error, Fatal, Panic
LogOutput = "file" #Options = file, stdout #file will print it to logs/server.log
SeedRatioStop = 1.50 #automatically stops the torrent after it reaches this seeding ratio
#Relative or absolute path accepted, the server will convert any relative path to an absolute path.
DefaultMoveFolder = 'downloaded' #default path that a finished torrent is symlinked to after completion. Torrents added via RSS will default here
[notifications]
PushBulletToken = "" #add your pushbullet api token here to notify of torrent completion to pushbullet
Usually you don't need to change anything in this file; goTorrent will use your default IP address and bind to it. You can change the port if you wish.
Next, we need to make sure that the executable runs, so run the following:
chmod +x goTorrent
This will make the program executable.
### Connecting the Frontend to the Backend
We need to connect our React frontend to our Golang backend; for this we only need to edit one JS file.
nano public/static/js/kickwebsocket.js
var ws = new WebSocket("ws://192.168.1.141:8000/websocket"); //creating websocket
Just change the IP address after ws:// to your server IP address, and change the port if you changed the port in the `config.toml` file.
Then save that file and return to `/opt/goTorrent`.
Now we can test the server. For testing I recommend going into the `config.toml` file and changing the `LogOutput` to `stdout`, and the `LogLevel` to `Info`.
Then start the server:
./goTorrent
If you have `LogLevel` set to `Info`, you should see the confirmation that the client config has been generated.
You can then open your browser and connect to IP:Port (http) and you should see the main page. You will see an error for retrieving RSS feeds in stdout, but this is expected for first load.
You can press `F12` if using Chrome to open the console and click around the UI to see the logging available for the frontend.
### Running goTorrent as a Service
If you are on a Linux system that uses systemd, the `dist-specific-files/Linux-systemd/` folder contains a `goTorrent.service` file that can be used to set up systemd for goTorrent. A quick overview of what is needed:
1. Edit the systemd file to specify your specific implementation
2. Copy the file to your systemd folder, i.e. `/etc/systemd/system`
3. Enable the service `systemctl enable goTorrent.service`
4. If using a new user, create that user and assign permissions:
a. `useradd goTorrent`
b. `sudo chown -R goTorrent:goTorrent /opt/goTorrent`
c. If you want to test server: `su goTorrent` then run the executable
5. Set your `config.toml` file to the values you want.
6. Start your server: `systemctl start goTorrent`
7. Check for errors: `systemctl status goTorrent`. You can also check `logs/server.log`.
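As an illustration, a minimal unit file for the steps above might look like the following (the `User`, paths, and restart policy here are assumptions for this sketch; the `goTorrent.service` file shipped in `dist-specific-files/Linux-systemd/` is the authoritative version):

```
[Unit]
Description=goTorrent torrent server
After=network.target

[Service]
Type=simple
User=goTorrent
WorkingDirectory=/opt/goTorrent
ExecStart=/opt/goTorrent/goTorrent
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

`WorkingDirectory` matters here because the server resolves relative paths from `config.toml` (such as `downloading` and `torrentUpload`) against the directory it starts in.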
### Windows
Please see the Linux instructions, as they are similar. For running goTorrent as a service I haven't tried any of the programs that claim to do that, but perhaps try [NSSM](http://nssm.cc/download)
All the documentation is available [here](https://deranjer.github.io/goTorrentDocs/)
# Special Thanks


@@ -1,36 +1,55 @@
[serverConfig]
ServerPort = ":8000" #leave format as is it expects a string with colon
ServerAddr = "" #blank will bind to default IP address, usually fine to leave be
LogLevel = "Warn" # Options = Debug, Info, Warn, Error, Fatal, Panic
ServerAddr = "192.168.1.6" #Put in the IP address you want to bind to
LogLevel = "Debug" # Options = Debug, Info, Warn, Error, Fatal, Panic
LogOutput = "file" #Options = file, stdout #file will print it to logs/server.log
SeedRatioStop = 1.50 #automatically stops the torrent after it reaches this seeding ratio
#Relative or absolute path accepted, the server will convert any relative path to an absolute path.
DefaultMoveFolder = 'downloaded' #default path that a finished torrent is symlinked to after completion. Torrents added via RSS will default here
DefaultMoveFolder = 'Z:\downloads' #default path that a finished torrent is symlinked to after completion. Torrents added via RSS will default here
TorrentWatchFolder = 'torrentUpload' #folder path that is watched for .torrent files and adds them automatically every 5 minutes
#Limits your upload and download speed globally, all are averages and not burst protected (usually burst on start).
#Low = ~.05MB/s, Medium = ~.5MB/s, High = ~1.5MB/s
UploadRateLimit = "Unlimited" #Options are "Low", "Medium", "High", "Unlimited" #Unlimited is default
DownloadRateLimit = "Unlimited"
[goTorrentWebUI]
#Basic goTorrentWebUI authentication (not terribly secure, implemented in JS, password is hashed to SHA256, not salted, basically don't depend on this if you require very good security)
WebUIAuth = false # bool, if false no authentication is required for the webUI
WebUIUser = "admin"
WebUIPassword = "Password1"
[notifications]
PushBulletToken = "" #add your pushbullet api token here to notify of torrent completion to pushbullet
PushBulletToken = "o.8sUHemPkTCaty3u7KnyvEBN19EkeT63g" #add your pushbullet api token here to notify of torrent completion to pushbullet
[reverseProxy]
#This is for setting up goTorrent behind a reverse Proxy (with SSL, reverse proxy with no SSL will require editing the WSS connection to a WS connection manually)
ProxyEnabled = true #bool, either false or true
#URL is CASE SENSITIVE
BaseURL = "derajnet.duckdns.org/gopher/" # MUST be in the format (if you have a subdomain, and must have trailing slash) "yoursubdomain.domain.org/subroute/"
[EncryptionPolicy]
DisableEncryption = false
ForceEncryption = false
PreferNoEncryption = true
PreferNoEncryption = false
[torrentClientConfig]
DownloadDir = 'downloading' #the full OR relative path where the torrent server stores in-progress torrents
Seed = true #boolean #seed after download
Seed = false #boolean #seed after download
# Never send chunks to peers.
NoUpload = false #boolean
#User-provided Client peer ID. If not present, one is generated automatically.
PeerID = "" #string
#The address to listen for new uTP and TCP bittorrent protocol connections. DHT shares a UDP socket with uTP unless configured otherwise.
ListenAddr = "" #Leave Blank for default, syntax "HOST:PORT"
@@ -42,16 +61,6 @@
# Don't create a DHT.
NoDHT = false #boolean
# Events are data bytes sent in pieces. The burst must be large enough to fit a whole chunk.
UploadRateLimiter = "" #*rate.Limiter
#The events are bytes read from connections. The burst must be bigger than the largest Read performed on a Conn minus one. This is likely to
#be the larger of the main read loop buffer (~4096), and the requested chunk size (~16KiB).
DownloadRateLimiter = "" #*rate.Limiter
#User-provided Client peer ID. If not present, one is generated automatically.
PeerID = "" #string
#For the bittorrent protocol.
DisableUTP = false #bool

config.toml.bk Normal file

@@ -0,0 +1,122 @@
[serverConfig]
ServerPort = ":8000" #leave format as is it expects a string with colon
ServerAddr = "192.168.1.100" #Put in the IP address you want to bind to
LogLevel = "Info" # Options = Debug, Info, Warn, Error, Fatal, Panic
LogOutput = "stdout" #Options = file, stdout #file will print it to logs/server.log
SeedRatioStop = 1.50 #automatically stops the torrent after it reaches this seeding ratio
#Relative or absolute path accepted, the server will convert any relative path to an absolute path.
DefaultMoveFolder = 'downloads' #default path that a finished torrent is symlinked to after completion. Torrents added via RSS will default here
TorrentWatchFolder = 'torrentUpload' #folder path that is watched for .torrent files and adds them automatically every 5 minutes
#Limits your upload and download speed globally, all are averages and not burst protected (usually burst on start).
#Low = ~.05MB/s, Medium = ~.5MB/s, High = ~1.5MB/s
UploadRateLimit = "Unlimited" #Options are "Low", "Medium", "High", "Unlimited" #Unlimited is default
DownloadRateLimit = "Unlimited"
[goTorrentWebUI]
#Basic goTorrentWebUI authentication (not terribly secure, implemented in JS, password is hashed to SHA256, not salted, basically don't depend on this if you require very good security)
WebUIAuth = false # bool, if false no authentication is required for the webUI
WebUIUser = "admin"
WebUIPassword = "Password1"
[notifications]
PushBulletToken = "" #add your pushbullet api token here to notify of torrent completion to pushbullet
[reverseProxy]
#This is for setting up goTorrent behind a reverse Proxy (with SSL, reverse proxy with no SSL will require editing the WSS connection to a WS connection manually)
ProxyEnabled = false #bool, either false or true
#URL is CASE SENSITIVE
BaseURL = "domain.com/subroute/" # MUST be in the format (if you have a subdomain, and must have trailing slash) "yoursubdomain.domain.org/subroute/"
[EncryptionPolicy]
DisableEncryption = false
ForceEncryption = false
PreferNoEncryption = true
[torrentClientConfig]
DownloadDir = 'downloading' #the full OR relative path where the torrent server stores in-progress torrents
Seed = true #boolean #seed after download
# Never send chunks to peers.
NoUpload = false #boolean
#User-provided Client peer ID. If not present, one is generated automatically.
PeerID = "" #string
#The address to listen for new uTP and TCP bittorrent protocol connections. DHT shares a UDP socket with uTP unless configured otherwise.
ListenAddr = "" #Leave Blank for default, syntax "HOST:PORT"
#Don't announce to trackers. This only leaves DHT to discover peers.
DisableTrackers = false #boolean
DisablePEX = false # boolean
# Don't create a DHT.
NoDHT = false #boolean
#For the bittorrent protocol.
DisableUTP = false #bool
#For the bittorrent protocol.
DisableTCP = false #bool
#Called to instantiate storage for each added torrent. Builtin backends
# are in the storage package. If not set, the "file" implementation is used.
DefaultStorage = "storage.ClientImpl"
#encryption policy
IPBlocklist = "" #of type iplist.Ranger
DisableIPv6 = false #boolean
Debug = false #boolean
#HTTP *http.Client
HTTPUserAgent = "" # HTTPUserAgent changes default UserAgent for HTTP requests
ExtendedHandshakeClientVersion = ""
Bep20 = ""
# Overrides the default DHT configuration, see dhtServerConfig #advanced.. so be careful
DHTConfig = "" # default is "dht.ServerConfig"
[dhtServerConfig]
# Set NodeId Manually. Caller must ensure that if NodeId does not conform to DHT Security Extensions, that NoSecurity is also set.
NodeId = "" #[20]byte
Conn = "" # https:#godoc.org/net#PacketConn #not implemented
# Don't respond to queries from other nodes.
Passive = false # boolean
# the default addresses are "router.utorrent.com:6881","router.bittorrent.com:6881","dht.transmissionbt.com:6881","dht.aelitis.com:6881",
#https:#github.com/anacrolix/dht/blob/master/dht.go
StartingNodes = "dht.GlobalBootstrapAddrs"
#Disable the DHT security extension: http:#www.libtorrent.org/dht_sec.html.
NoSecurity = false
#Initial IP blocklist to use. Applied before serving and bootstrapping begins.
IPBlocklist = "" #of type iplist.Ranger
#Used to secure the server's ID. Defaults to the Conn's LocalAddr(). Set to the IP that remote nodes will see,
#as that IP is what they'll use to validate our ID.
PublicIP = "" #net.IP
#Hook received queries. Return true if you don't want to propagate to the default handlers.
OnQuery = "func(query *krpc.Msg, source net.Addr) (propagate bool)"
#Called when a peer successfully announces to us.
OnAnnouncePeer = "func(infoHash metainfo.Hash, peer Peer)"
#How long to wait before re-sending queries that haven't received a response. Defaults to a random value between 4.5 and 5.5s.
QueryResendDelay = "func() time.Duration"


@@ -0,0 +1,12 @@
location ^~ /gotorrent/ {
proxy_pass http://192.168.1.100:8000/;
proxy_redirect http:// https://;
proxy_pass_header Server;
proxy_set_header Host $http_host;
proxy_set_header X-Real-IP $http_address;
proxy_set_header X-Scheme $scheme;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "Upgrade";
}

Binary file not shown (image changed: 67 KiB before, 63 KiB after).


@@ -5,20 +5,33 @@ import (
"github.com/anacrolix/torrent"
"github.com/anacrolix/torrent/metainfo"
Settings "github.com/deranjer/goTorrent/settings"
Storage "github.com/deranjer/goTorrent/storage"
)
//All the message types are first, first the server handling messages from the client
//Message contains the JSON messages from the client, we first unmarshal to get the messagetype, then each module unmarshalls the actual message once we know the type
//Message contains the JSON messages from the client, we first unmarshal to get the messagetype, then pass it on to each module
type Message struct {
MessageType string
MessageDetail string `json:",omitempty"`
MessageDetailTwo string `json:",omitempty"`
Payload []string
Payload interface{}
}
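The two-step decode described in the comment above (unmarshal the envelope to learn the MessageType, then hand the still-raw payload to the right module) can be sketched like this. The `dispatch` function and the `magnetLinkSubmit` payload shape are hypothetical illustrations, not the project's actual handler code:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Envelope mirrors the Message struct: only MessageType is decoded up
// front; Payload stays raw so the handling module can decode it itself.
type Envelope struct {
	MessageType string
	Payload     json.RawMessage
}

// AddTorrentPayload is a hypothetical payload for one message type.
type AddTorrentPayload struct {
	MagnetLink string
}

// dispatch performs the two-step unmarshal and routes by MessageType.
func dispatch(raw []byte) (string, error) {
	var env Envelope
	if err := json.Unmarshal(raw, &env); err != nil {
		return "", err
	}
	switch env.MessageType {
	case "magnetLinkSubmit":
		var p AddTorrentPayload
		if err := json.Unmarshal(env.Payload, &p); err != nil {
			return "", err
		}
		return "adding " + p.MagnetLink, nil
	default:
		return "", fmt.Errorf("unknown message type %q", env.MessageType)
	}
}

func main() {
	msg := []byte(`{"MessageType":"magnetLinkSubmit","Payload":{"MagnetLink":"magnet:?xt=urn:btih:abc"}}`)
	out, err := dispatch(msg)
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```

Keeping `Payload` as `json.RawMessage` (or, as in the diff above, `interface{}`) is what lets one websocket endpoint carry many message shapes without the envelope knowing about any of them.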
//Next are the messages the server sends to the client
//AuthResponse is sent when the client fails to perform authentication correctly
type AuthResponse struct {
MessageType string
Payload string
}
//ServerPushMessage is information (usually logs and status messages) that the server pushes to the client
type ServerPushMessage struct {
MessageType string
MessageLevel string //can be "success", "error", "warn", "info"
Payload string //the actual message
}
//RSSJSONList is a slice of gofeed.Feeds sent to the client
type RSSJSONList struct {
MessageType string
@@ -32,8 +45,17 @@ type RSSFeedsNames struct {
RSSFeedURL string
}
//SingleRSSFeedMessage contains the torrents/name/etc of a single torrent feed
type SingleRSSFeedMessage struct { //TODO had issues with getting this to work with Storage or Engine
MessageType string
URL string //the URL of the individual RSS feed
Name string
TotalTorrents int
Torrents []Storage.SingleRSSTorrent //name of the torrents
}
//TorrentList struct contains the torrent list that is sent to the client
type TorrentList struct { //helps create the JSON structure that react expects to recieve
type TorrentList struct { //helps create the JSON structure that react expects to receive
MessageType string `json:"MessageType"`
Totaltorrents int `json:"total"`
ClientDBstruct []ClientDB `json:"data"`
@@ -48,48 +70,53 @@ type TorrentFileList struct {
//PeerFileList returns a slice of peers
type PeerFileList struct {
MessageType string `json:"MessageType"`
TotalPeers int `json:"TotalPeers"`
PeerList []torrent.Peer `json:"PeerList"`
MessageType string
TotalPeers int
PeerList []torrent.Peer
}
//TorrentFile describes a single file that a torrent client is downloading for a single torrent
type TorrentFile struct {
TorrentHashString string //Used to tie the file to a torrent //TODO not sure if neededs
FileName string
FilePath string
FileSize string
FilePercent string
FilePriority string
TorrentHashString string //Used to tie the file to a torrent //TODO not sure if needed
FileName string //The name of the file
FilePath string //The relative filepath to the file
FileSize string //Humanized file size display
FilePercent string //String value of percent of individual file percent done
FilePriority string //Currently "High", "Normal", or "Cancel"
}
type SettingsFile struct {
MessageType string
Config Settings.FullClientSettings
}
//ClientDB struct contains the struct that is used to compose the torrentlist
type ClientDB struct { //TODO maybe seperate out the internal bits into another client struct
TorrentHashString string `json:"TorrentHashString"` //Passed to client for displaying hash and is used to uniquly identify all torrents
TorrentName string `json:"TorrentName"`
DownloadedSize string `json:"DownloadedSize"` //how much the client has downloaded total
Size string `json:"Size"` //total size of the torrent
DownloadSpeed string `json:"DownloadSpeed"` //the dl speed of the torrent
Status string `json:"Status"` //Passed to client for display
PercentDone string `json:"PercentDone"` //Passed to client to show percent done
ActivePeers string `json:"ActivePeers"` //passed to client
UploadSpeed string `json:"UploadSpeed"` //passed to client to show Uploadspeed
StoragePath string `json:"StoragePath"` //Passed to client (and stored in stormdb)
type ClientDB struct { //TODO maybe separate out the internal bits into another client struct
TorrentHashString string //Passed to client for displaying hash and is used to uniquely identify all torrents
TorrentName string //String of the name of the torrent
DownloadedSize string //how much the client has downloaded total
Size string //total size of the torrent
DownloadSpeed string //the dl speed of the torrent
Status string //Passed to client for display
PercentDone string //Passed to client to show percent done
ActivePeers string //passed to client
UploadSpeed string //passed to client to show Uploadspeed
StoragePath string //Passed to client (and stored in stormdb)
DateAdded string //Passed to client (and stored in stormdb)
ETA string `json:"ETA"` //Passed to client
Label string //Passed to client and stored in stormdb
SourceType string `json:"SourceType"` //Stores whether the torrent came from a torrent file or a magnet link
ETA string //Passed to client
TorrentLabel string //Passed to client and stored in stormdb
SourceType string //Stores whether the torrent came from a torrent file or a magnet link
KnownSwarm []torrent.Peer //Passed to client for Peer Tab
UploadRatio string //Passed to client, stores the string for uploadratio stored in stormdb
TotalUploadedSize string //Humanized version of TotalUploadedBytes to pass to the client
TotalUploadedBytes int64 //includes bytes that happened before reboot (from stormdb)
TotalUploadedBytes int64 `json:"-"` //includes bytes that happened before reboot (from stormdb)
downloadSpeedInt int64 //Internal used for calculating dl speed
BytesCompleted int64 //Internal used for calculating the dl speed
DataBytesWritten int64 //Internal used for calculating dl speed
DataBytesRead int64 //Internal used for calculating dl speed
UpdatedAt time.Time //Internal used for calculating speeds of upload and download
TorrentHash metainfo.Hash //Used to create string for TorrentHashString... not sure why I have it... make that a TODO I guess
NumberofFiles int
NumberofPieces int
BytesCompleted int64 `json:"-"` //Internal used for calculating the dl speed
DataBytesWritten int64 `json:"-"` //Internal used for calculating dl speed
DataBytesRead int64 `json:"-"` //Internal used for calculating dl speed
UpdatedAt time.Time `json:"-"` //Internal used for calculating speeds of upload and download
TorrentHash metainfo.Hash `json:"-"` //Used to create string for TorrentHashString... not sure why I have it... make that a TODO I guess
NumberofFiles int //Number of files in the torrent
NumberofPieces int //Total number of pieces in the torrent (Not currently used)
MaxConnections int //Used to stop the torrent by limiting the max allowed connections
}


@@ -1,17 +1,19 @@
package engine
import (
"io/ioutil"
"os"
"path/filepath"
"github.com/anacrolix/torrent"
"github.com/asdine/storm"
Settings "github.com/deranjer/goTorrent/settings"
Storage "github.com/deranjer/goTorrent/storage"
"github.com/mmcdole/gofeed"
"github.com/robfig/cron"
"github.com/sirupsen/logrus"
)
//Logger is the global variable pulled in from main.go
var Logger *logrus.Logger
//InitializeCronEngine initializes and starts the cron engine so we can add tasks as needed, returns pointer to the engine
func InitializeCronEngine() *cron.Cron {
c := cron.New()
@@ -19,8 +21,44 @@ func InitializeCronEngine() *cron.Cron {
return c
}
//CheckTorrentWatchFolder adds torrents from a watch folder //TODO see if you can use filepath.Abs instead of changing directory
func CheckTorrentWatchFolder(c *cron.Cron, db *storm.DB, tclient *torrent.Client, torrentLocalStorage Storage.TorrentLocal, config Settings.FullClientSettings) {
c.AddFunc("@every 5m", func() {
Logger.WithFields(logrus.Fields{"Watch Folder": config.TorrentWatchFolder}).Info("Running the watch folder cron job")
torrentFiles, err := ioutil.ReadDir(config.TorrentWatchFolder)
if err != nil {
Logger.WithFields(logrus.Fields{"Folder": config.TorrentWatchFolder, "Error": err}).Error("Unable to read from the torrent upload folder")
return
}
for _, file := range torrentFiles {
if filepath.Ext(file.Name()) != ".torrent" {
Logger.WithFields(logrus.Fields{"File": file.Name(), "error": err}).Error("Not a torrent file..")
continue
} else {
fullFilePath := filepath.Join(config.TorrentWatchFolder, file.Name())
fullFilePathAbs, err := filepath.Abs(fullFilePath)
fullNewFilePath := filepath.Join(config.TFileUploadFolder, file.Name())
fullNewFilePathAbs, err := filepath.Abs(fullNewFilePath)
Logger.WithFields(logrus.Fields{"Name": file.Name(), "FullFilePath": fullFilePathAbs, "newFullFilePath": fullNewFilePathAbs}).Info("Attempting to add the following file... and copy to")
CopyFile(fullFilePathAbs, fullNewFilePathAbs)
clientTorrent, err := tclient.AddTorrentFromFile(fullNewFilePathAbs)
if err != nil {
Logger.WithFields(logrus.Fields{"err": err, "Torrent": file.Name()}).Warn("Unable to add torrent to torrent client!")
continue
}
os.Remove(fullFilePathAbs) //delete the torrent after adding it and copying it over
Logger.WithFields(logrus.Fields{"Source Folder": fullFilePathAbs, "Destination Folder": fullNewFilePathAbs, "Torrent": file.Name()}).Info("Added torrent from watch folder, and moved torrent file")
StartTorrent(clientTorrent, torrentLocalStorage, db, "file", fullNewFilePathAbs, config.DefaultMoveFolder, "default", config)
}
}
})
}
//RefreshRSSCron refreshes all of the RSS feeds on an hourly basis
func RefreshRSSCron(c *cron.Cron, db *storm.DB, tclient *torrent.Client, torrentLocalStorage Storage.TorrentLocal, dataDir string) {
func RefreshRSSCron(c *cron.Cron, db *storm.DB, tclient *torrent.Client, torrentLocalStorage Storage.TorrentLocal, config Settings.FullClientSettings) {
c.AddFunc("@hourly", func() {
torrentHashHistory := Storage.FetchHashHistory(db)
RSSFeedStore := Storage.FetchRSSFeeds(db)
@@ -48,7 +86,7 @@ func RefreshRSSCron(c *cron.Cron, db *storm.DB, tclient *torrent.Client, torrent
Logger.WithFields(logrus.Fields{"err": err, "Torrent": RSSTorrent.Title}).Warn("Unable to add torrent to torrent client!")
break //break out of the loop entirely for this message since we hit an error
}
StartTorrent(clientTorrent, torrentLocalStorage, db, dataDir, "magnet", "", dataDir) //TODO let user specify torrent default storage location and let change on fly
StartTorrent(clientTorrent, torrentLocalStorage, db, "magnet", "", config.DefaultMoveFolder, "RSS", config) //TODO let user specify torrent default storage location and let change on fly
singleFeed.Torrents = append(singleFeed.Torrents, singleRSSTorrent)
}


@@ -1,13 +1,11 @@
package engine
import (
"io"
"os"
"path/filepath"
"runtime"
"github.com/anacrolix/torrent"
"github.com/asdine/storm"
Settings "github.com/deranjer/goTorrent/settings"
Storage "github.com/deranjer/goTorrent/storage"
pushbullet "github.com/mitsuse/pushbullet-go"
"github.com/mitsuse/pushbullet-go/requests"
@@ -16,88 +14,97 @@ import (
)
//MoveAndLeaveSymlink takes the file from the default download dir and moves it to the user specified directory and then leaves a symlink behind.
func MoveAndLeaveSymlink(config FullClientSettings, singleTorrent *torrent.Torrent, db *storm.DB) {
Logger.WithFields(logrus.Fields{"Torrent Name": singleTorrent.Name()}).Info("Move and Create symlink started for torrent")
tStorage := Storage.FetchTorrentFromStorage(db, singleTorrent.InfoHash().String())
oldFilePath := filepath.Join(config.TorrentConfig.DataDir, singleTorrent.Name())
newFilePath := filepath.Join(tStorage.StoragePath, singleTorrent.Name())
_, err := os.Stat(tStorage.StoragePath)
func MoveAndLeaveSymlink(config Settings.FullClientSettings, tHash string, db *storm.DB, moveDone bool, oldPath string) error { //moveDone and oldPath are for moving a completed torrent
tStorage := Storage.FetchTorrentFromStorage(db, tHash)
Logger.WithFields(logrus.Fields{"Torrent Name": tStorage.TorrentName}).Info("Move and Create symlink started for torrent")
var oldFilePath string
if moveDone { //only occurs on manual move
oldFilePathTemp := filepath.Join(oldPath, tStorage.TorrentName)
var err error
oldFilePath, err = filepath.Abs(oldFilePathTemp)
if err != nil {
Logger.WithFields(logrus.Fields{"Torrent Name": tStorage.TorrentName, "Filepath": oldFilePath}).Error("Cannot create absolute file path!")
moveDone = false
return err
}
} else {
oldFilePathTemp := filepath.Join(config.TorrentConfig.DataDir, tStorage.TorrentName)
var err error
oldFilePath, err = filepath.Abs(oldFilePathTemp)
if err != nil {
Logger.WithFields(logrus.Fields{"Torrent Name": tStorage.TorrentName, "Filepath": oldFilePath}).Error("Cannot create absolute file path!")
moveDone = false
return err
}
}
newFilePathTemp := filepath.Join(tStorage.StoragePath, tStorage.TorrentName)
newFilePath, err := filepath.Abs(newFilePathTemp)
if err != nil {
Logger.WithFields(logrus.Fields{"Torrent Name": tStorage.TorrentName, "Filepath": newFilePath}).Error("Cannot create absolute file path for new file path!")
moveDone = false
return err
}
_, err = os.Stat(tStorage.StoragePath)
if os.IsNotExist(err) {
err := os.MkdirAll(tStorage.StoragePath, 0755)
err := os.MkdirAll(tStorage.StoragePath, 0777)
if err != nil {
Logger.WithFields(logrus.Fields{"New File Path": newFilePath, "error": err}).Error("Cannot create new directory")
moveDone = false
return err
}
}
oldFileInfo, err := os.Stat(oldFilePath)
if err != nil {
Logger.WithFields(logrus.Fields{"Old File info": oldFileInfo, "error": err}).Error("Cannot find the old file to copy/symlink!")
return
Logger.WithFields(logrus.Fields{"Old File info": oldFileInfo, "Old File Path": oldFilePath, "error": err}).Error("Cannot find the old file to copy/symlink!")
moveDone = false
return err
}
if oldFilePath != newFilePath {
if runtime.GOOS == "windows" { //TODO the windows symlink is broken on windows 10 creator edition, so doing a copy for now until Go 1.11
if oldFileInfo.IsDir() {
os.Mkdir(newFilePath, 0755)
folderCopy.Copy(oldFilePath, newFilePath) //copy the folder to the new location
newFilePathDir := filepath.Dir(newFilePath)
os.Mkdir(newFilePathDir, 0777)
err := folderCopy.Copy(oldFilePath, newFilePath) //copy the folder to the new location
if err != nil {
Logger.WithFields(logrus.Fields{"Old File Path": oldFilePath, "New File Path": newFilePath, "error": err}).Error("Error Copying Folder!")
return err
}
os.Chmod(newFilePath, 0777)
notifyUser(tStorage, config, singleTorrent, db)
return
}
srcFile, err := os.Open(oldFilePath)
defer srcFile.Close()
if err != nil {
Logger.WithFields(logrus.Fields{"Old File Path": oldFilePath, "error": err}).Error("Windows: Cannot open old file for copy")
return
}
destFile, err := os.Create(newFilePath)
defer destFile.Close()
if err != nil {
Logger.WithFields(logrus.Fields{"New File Path": newFilePath, "error": err}).Error("Windows: Cannot open new file for copying into")
return
}
bytesWritten, err := io.Copy(destFile, srcFile)
if err != nil {
Logger.WithFields(logrus.Fields{"Old File Path": oldFilePath, "New File Path": newFilePath, "error": err}).Error("Windows: Cannot copy old file into new")
return
}
err = destFile.Sync()
if err != nil {
Logger.WithFields(logrus.Fields{"Old File Path": oldFilePath, "New File Path": newFilePath, "error": err}).Error("Windows: Error syncing new file to disk")
}
Logger.WithFields(logrus.Fields{"Old File Path": oldFilePath, "New File Path": newFilePath, "bytesWritten": bytesWritten}).Info("Windows Torrent Copy Completed")
notifyUser(tStorage, config, singleTorrent, db)
} else {
folderCopy.Copy(oldFilePath, newFilePath)
os.Chmod(newFilePath, 0777) //changing permissions on the new file to be permissive
/* if runtime.GOOS != "windows" { //TODO the windows symlink is broken on windows 10 creator edition, so on the other platforms create symlink (windows will copy) until Go1.11
os.RemoveAll(oldFilePath)
err := os.Symlink(newFilePath, oldFilePath) //For all other OS's create a symlink
err = os.Symlink(newFilePath, oldFilePath)
if err != nil {
Logger.WithFields(logrus.Fields{"Old File Path": oldFilePath, "New File Path": newFilePath, "error": err}).Error("Error creating symlink")
return
moveDone = false
return err
}
} */
if moveDone == false {
tStorage.TorrentMoved = true //TODO error handling instead of just saying torrent was moved when it was not
notifyUser(tStorage, config, db) //Only notify if we haven't moved yet, don't want to push notify user every time user uses change storage button
}
notifyUser(tStorage, config, singleTorrent, db)
Logger.WithFields(logrus.Fields{"Old File Path": oldFilePath, "New File Path": newFilePath}).Info("Moving completed torrent")
tStorage.StoragePath = filepath.Dir(newFilePath)
Storage.UpdateStorageTick(db, tStorage)
}
return nil
}
}
func notifyUser(tStorage Storage.TorrentLocal, config FullClientSettings, singleTorrent *torrent.Torrent, db *storm.DB) {
Logger.WithFields(logrus.Fields{"New File Path": tStorage.StoragePath, "Torrent Name": singleTorrent.Name()}).Info("Attempting to notify user..")
func notifyUser(tStorage Storage.TorrentLocal, config Settings.FullClientSettings, db *storm.DB) {
Logger.WithFields(logrus.Fields{"New File Path": tStorage.StoragePath, "Torrent Name": tStorage.TorrentName}).Info("Attempting to notify user..")
tStorage.TorrentMoved = true
Storage.AddTorrentLocalStorage(db, tStorage) //Updating the fact that we moved the torrent
//Storage.AddTorrentLocalStorage(db, tStorage) //Updating the fact that we moved the torrent
Storage.UpdateStorageTick(db, tStorage)
if config.PushBulletToken != "" {
pb := pushbullet.New(config.PushBulletToken)
n := requests.NewNote()
n.Title = singleTorrent.Name()
n.Title = tStorage.TorrentName
n.Body = "Completed and moved to " + tStorage.StoragePath
if _, err := pb.PostPushesNote(n); err != nil {
Logger.WithFields(logrus.Fields{"Torrent": singleTorrent.Name(), "New File Path": tStorage.StoragePath, "error": err}).Error("Error pushing PushBullet Note")
Logger.WithFields(logrus.Fields{"Torrent": tStorage.TorrentName, "New File Path": tStorage.StoragePath, "error": err}).Error("Error pushing PushBullet Note")
return
}
Logger.WithFields(logrus.Fields{"Torrent": singleTorrent.Name(), "New File Path": tStorage.StoragePath}).Info("Pushbullet note sent")
Logger.WithFields(logrus.Fields{"Torrent": tStorage.TorrentName, "New File Path": tStorage.StoragePath}).Info("Pushbullet note sent")
} else {
Logger.WithFields(logrus.Fields{"New File Path": tStorage.StoragePath, "Torrent Name": singleTorrent.Name()}).Info("No pushbullet API key set, not notifying")
Logger.WithFields(logrus.Fields{"New File Path": tStorage.StoragePath, "Torrent Name": tStorage.TorrentName}).Info("No pushbullet API key set, not notifying")
}
}

View File

@@ -0,0 +1,47 @@
package engine
import (
"testing"
"github.com/asdine/storm"
Settings "github.com/deranjer/goTorrent/settings"
Storage "github.com/deranjer/goTorrent/storage"
)
func TestMoveAndLeaveSymlink(t *testing.T) {
type args struct {
config Settings.FullClientSettings
tStorage Storage.TorrentLocal
db *storm.DB
}
tests := []struct {
name string
args args
}{
// TODO: Add test cases.
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
MoveAndLeaveSymlink(tt.args.config, tt.args.tStorage, tt.args.db)
})
}
}
func Test_notifyUser(t *testing.T) {
type args struct {
tStorage Storage.TorrentLocal
config Settings.FullClientSettings
db *storm.DB
}
tests := []struct {
name string
args args
}{
// TODO: Add test cases.
}
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
notifyUser(tt.args.tStorage, tt.args.config, tt.args.db)
})
}
}

View File

@@ -11,11 +11,27 @@ import (
"github.com/anacrolix/torrent"
"github.com/anacrolix/torrent/metainfo"
"github.com/asdine/storm"
Settings "github.com/deranjer/goTorrent/settings"
Storage "github.com/deranjer/goTorrent/storage"
"github.com/gorilla/websocket"
"github.com/mmcdole/gofeed"
"github.com/sirupsen/logrus"
)
//Logger is the injected variable for global logger
var Logger *logrus.Logger
//Config is the injected variable for the torrent config
var Config Settings.FullClientSettings
//Conn is the injected variable for the websocket connection
var Conn *websocket.Conn
//CreateServerPushMessage Pushes a message from the server to the client
func CreateServerPushMessage(message ServerPushMessage, conn *websocket.Conn) {
conn.WriteJSON(message)
}
//RefreshSingleRSSFeed refreshes a single RSS feed to send to the client (no database update), mainly updating the torrent list to display any changes //Todo.. duplicate as cron job... any way to merge these to reduce duplication?
func RefreshSingleRSSFeed(db *storm.DB, RSSFeed Storage.SingleRSSFeed) Storage.SingleRSSFeed { //Todo.. duplicate as cron job... any way to merge these to reduce duplication?
singleRSSFeed := Storage.SingleRSSFeed{URL: RSSFeed.URL, Name: RSSFeed.Name}
@@ -24,6 +40,7 @@ func RefreshSingleRSSFeed(db *storm.DB, RSSFeed Storage.SingleRSSFeed) Storage.S
feed, err := fp.ParseURL(RSSFeed.URL)
if err != nil {
Logger.WithFields(logrus.Fields{"RSSFeedURL": RSSFeed.URL, "error": err}).Error("Unable to parse URL")
CreateServerPushMessage(ServerPushMessage{MessageType: "serverPushMessage", MessageLevel: "error", Payload: "Unable to parse RSS URL"}, Conn)
}
for _, RSSTorrent := range feed.Items {
singleRSSTorrent.Link = RSSTorrent.Link
@@ -45,7 +62,8 @@ func ForceRSSRefresh(db *storm.DB, RSSFeedStore Storage.RSSFeedStore) { //Todo..
for _, singleFeed := range RSSFeedStore.RSSFeeds {
feed, err := fp.ParseURL(singleFeed.URL)
if err != nil {
Logger.WithFields(logrus.Fields{"RSSFeedURL": singleFeed.URL, "error": err}).Error("Unable to parse URL")
Logger.WithFields(logrus.Fields{"RSSFeedURL": singleFeed.URL, "error": err}).Error("Unable to parse RSS URL")
CreateServerPushMessage(ServerPushMessage{MessageType: "serverPushMessage", MessageLevel: "error", Payload: "Unable to parse RSS URL"}, Conn)
}
for _, RSSTorrent := range feed.Items {
singleRSSTorrent.Link = RSSTorrent.Link
@@ -69,26 +87,28 @@ func timeOutInfo(clientTorrent *torrent.Torrent, seconds time.Duration) (deleted
}()
select {
case <-clientTorrent.GotInfo(): //attempting to retrieve info for torrent
Logger.WithFields(logrus.Fields{"clientTorrentName": clientTorrent.Name()}).Debug("Recieved torrent info for torrent")
clientTorrent.DownloadAll()
Logger.WithFields(logrus.Fields{"clientTorrentName": clientTorrent.Name()}).Debug("Received torrent info for torrent")
return false
case <-timeout: // getting info for torrent has timed out so purging the torrent
Logger.WithFields(logrus.Fields{"clientTorrentName": clientTorrent.Name()}).Error("Forced to drop torrent from timeout waiting for info")
CreateServerPushMessage(ServerPushMessage{MessageType: "serverPushMessage", MessageLevel: "error", Payload: "Timeout waiting for torrent info... dropping"}, Conn)
clientTorrent.Drop()
return true
}
}
func readTorrentFileFromDB(element *Storage.TorrentLocal, tclient *torrent.Client, db *storm.DB) (singleTorrent *torrent.Torrent) {
func readTorrentFileFromDB(element *Storage.TorrentLocal, tclient *torrent.Client, db *storm.DB) (singleTorrent *torrent.Torrent, err error) {
tempFile, err := ioutil.TempFile("", "TorrentFileTemp")
if err != nil {
Logger.WithFields(logrus.Fields{"tempfile": tempFile, "err": err}).Error("Unable to create tempfile")
return nil, err
}
//defer tempFile.Close() //Todo.. if we remove this do we need to close it?
defer os.Remove(tempFile.Name())
if _, err := tempFile.Write(element.TorrentFile); err != nil { //writing out the entire file back into the temp dir from boltdb
Logger.WithFields(logrus.Fields{"tempfile": tempFile, "err": err}).Error("Unable to write to tempfile")
return nil, err
}
if err := tempFile.Close(); err != nil { //close the tempfile so that we can add it back into the torrent client
Logger.WithFields(logrus.Fields{"tempfile": tempFile, "err": err}).Error("Unable to close tempfile")
@@ -96,18 +116,22 @@ func readTorrentFileFromDB(element *Storage.TorrentLocal, tclient *torrent.Clien
_, err = os.Stat(element.TorrentFileName) //if we CAN find the torrent, add it
if err != nil {
Logger.WithFields(logrus.Fields{"tempfile": tempFile, "err": err}).Error("Unable to find file")
Storage.DelTorrentLocalStorage(db, element.Hash) //purge the torrent
return nil, err
}
singleTorrent, err = tclient.AddTorrentFromFile(element.TorrentFileName)
if err != nil {
Logger.WithFields(logrus.Fields{"tempfile": element.TorrentFileName, "err": err}).Error("Unable to add Torrent from file!")
CreateServerPushMessage(ServerPushMessage{MessageType: "serverPushMessage", MessageLevel: "error", Payload: "Unable to add Torrent from file!"}, Conn)
Storage.DelTorrentLocalStorage(db, element.Hash) //purge the torrent
return nil, err
}
return singleTorrent
return singleTorrent, nil
}
//StartTorrent creates the storage.db entry and starts A NEW TORRENT and adds to the running torrent array
func StartTorrent(clientTorrent *torrent.Torrent, torrentLocalStorage Storage.TorrentLocal, torrentDbStorage *storm.DB, dataDir string, torrentType string, torrentFileName string, torrentStoragePath string) {
timedOut := timeOutInfo(clientTorrent, 45) //seeing if adding the torrrent times out (giving 45 seconds)
func StartTorrent(clientTorrent *torrent.Torrent, torrentLocalStorage Storage.TorrentLocal, torrentDbStorage *storm.DB, torrentType, torrentFilePathAbs, torrentStoragePath, labelValue string, config Settings.FullClientSettings) {
timedOut := timeOutInfo(clientTorrent, 45) //seeing if adding the torrent times out (giving 45 seconds)
if timedOut { //if we fail to add the torrent return
return
}
@@ -116,20 +140,25 @@ func StartTorrent(clientTorrent *torrent.Torrent, torrentLocalStorage Storage.To
allStoredTorrents := Storage.FetchAllStoredTorrents(torrentDbStorage)
for _, runningTorrentHashes := range allStoredTorrents {
if runningTorrentHashes.Hash == TempHash.String() {
Logger.WithFields(logrus.Fields{"Hash": TempHash.String()}).Error("Torrent has duplicate hash to already running torrent... will not add to storage")
Logger.WithFields(logrus.Fields{"Hash": TempHash.String()}).Info("Torrent has duplicate hash to already running torrent... will not add to storage")
return
}
}
torrentLocalStorage.Hash = TempHash.String() // we will store the infohash to add it back later on client restart (if needed)
torrentLocalStorage.InfoBytes = clientTorrent.Metainfo().InfoBytes
torrentLocalStorage.Label = labelValue
torrentLocalStorage.DateAdded = time.Now().Format("Jan _2 2006")
torrentLocalStorage.StoragePath = torrentStoragePath
torrentLocalStorage.TempStoragePath = config.TorrentConfig.DataDir
torrentLocalStorage.TorrentName = clientTorrent.Name()
torrentLocalStorage.TorrentUploadLimit = true //by default all of the torrents will stop uploading after the global rate is set.
torrentLocalStorage.TorrentMoved = false //by default the torrent has not been moved.
torrentLocalStorage.TorrentStatus = "Running" //by default start all the torrents as downloading.
torrentLocalStorage.TorrentType = torrentType //either "file" or "magnet" maybe more in the future
torrentLocalStorage.TorrentSize = clientTorrent.Length() //Length will change as we cancel files so store it in DB
if torrentType == "file" { //if it is a file read the entire file into the database for us to spit out later
torrentLocalStorage.TorrentFileName = torrentFileName
torrentfile, err := ioutil.ReadFile(torrentFileName)
torrentfile, err := ioutil.ReadFile(torrentFilePathAbs)
torrentLocalStorage.TorrentFileName = torrentFilePathAbs
if err != nil {
Logger.WithFields(logrus.Fields{"torrentFile": torrentfile, "error": err}).Error("Unable to read the torrent file")
}
@@ -142,103 +171,164 @@ func StartTorrent(clientTorrent *torrent.Torrent, torrentLocalStorage Storage.To
var torrentFilePriority = Storage.TorrentFilePriority{}
torrentFilePriority.TorrentFilePath = singleFile.DisplayPath()
torrentFilePriority.TorrentFilePriority = "Normal"
torrentFilePriority.TorrentFileSize = singleFile.Length()
TorrentFilePriorityArray = append(TorrentFilePriorityArray, torrentFilePriority)
}
torrentLocalStorage.TorrentFilePriority = TorrentFilePriorityArray
Storage.AddTorrentLocalStorage(torrentDbStorage, torrentLocalStorage) //writing all of the data to the database
clientTorrent.DownloadAll() //starting the download
clientTorrent.DownloadAll() //set all pieces to download
NumPieces := clientTorrent.NumPieces() //find the number of pieces
clientTorrent.CancelPieces(1, NumPieces) //cancel all of the pieces to use file priority
for _, singleFile := range clientTorrent.Files() { //setting all of the file priorities to normal
singleFile.SetPriority(torrent.PiecePriorityNormal)
}
CreateServerPushMessage(ServerPushMessage{MessageType: "serverPushMessage", MessageLevel: "success", Payload: "Torrent added!"}, Conn)
}
//CreateRunningTorrentArray creates the entire torrent list to pass to client
func CreateRunningTorrentArray(tclient *torrent.Client, TorrentLocalArray []*Storage.TorrentLocal, PreviousTorrentArray []ClientDB, config FullClientSettings, db *storm.DB) (RunningTorrentArray []ClientDB) {
//CreateInitialTorrentArray adds all the torrents on program start from the database
func CreateInitialTorrentArray(tclient *torrent.Client, TorrentLocalArray []*Storage.TorrentLocal, db *storm.DB) {
for _, singleTorrentFromStorage := range TorrentLocalArray {
var singleTorrent *torrent.Torrent
var TempHash metainfo.Hash
tickUpdateStruct := Storage.TorrentLocal{} //we are shoving the tick updates into a torrentlocal struct to pass to storage happens at the end of the routine
fullClientDB := new(ClientDB)
//singleTorrentStorageInfo := Storage.FetchTorrentFromStorage(db, TempHash.String()) //pulling the single torrent info from storage ()
var err error
if singleTorrentFromStorage.TorrentType == "file" { //if it is a file pull it from the uploaded torrent folder
singleTorrent = readTorrentFileFromDB(singleTorrentFromStorage, tclient, db)
fullClientDB.SourceType = "Torrent File"
singleTorrent, err = readTorrentFileFromDB(singleTorrentFromStorage, tclient, db)
if err != nil {
continue
}
} else {
singleTorrentFromStorageMagnet := "magnet:?xt=urn:btih:" + singleTorrentFromStorage.Hash //For magnet links just need to prepend the magnet part to the hash to readd
singleTorrent, _ = tclient.AddMagnet(singleTorrentFromStorageMagnet)
fullClientDB.SourceType = "Magnet Link"
singleTorrent, err = tclient.AddMagnet(singleTorrentFromStorageMagnet)
if err != nil {
continue
}
}
if len(singleTorrentFromStorage.InfoBytes) == 0 { //TODO.. kind of a fringe scenario.. not sure if needed since the db should always have the infobytes
timeOut := timeOutInfo(singleTorrent, 45)
if timeOut == true { // if we did timeout then drop the torrent from the boltdb database
if timeOut == true { // if we did timeout then drop the torrent from the bolt.db database
Storage.DelTorrentLocalStorage(db, singleTorrentFromStorage.Hash) //purging torrent from the local database
continue
}
singleTorrentFromStorage.InfoBytes = singleTorrent.Metainfo().InfoBytes
}
err := singleTorrent.SetInfoBytes(singleTorrentFromStorage.InfoBytes) //setting the infobytes back into the torrent
err = singleTorrent.SetInfoBytes(singleTorrentFromStorage.InfoBytes) //setting the infobytes back into the torrent
if err != nil {
Logger.WithFields(logrus.Fields{"torrentFile": singleTorrent.Name(), "error": err}).Error("Unable to add infobytes to the torrent!")
}
//Logger.WithFields(logrus.Fields{"singleTorrent": singleTorrentFromStorage.TorrentName}).Info("Generating infohash")
TempHash = singleTorrent.InfoHash()
if singleTorrentFromStorage.TorrentStatus != "Completed" && singleTorrentFromStorage.TorrentStatus != "Stopped" {
singleTorrent.DownloadAll() //set all of the pieces to download (piece prio is NE to file prio)
NumPieces := singleTorrent.NumPieces() //find the number of pieces
singleTorrent.CancelPieces(1, NumPieces) //cancel all of the pieces to use file priority
for _, singleFile := range singleTorrent.Files() { //setting all of the file priorities to normal
singleFile.SetPriority(torrent.PiecePriorityNormal)
}
}
}
SetFilePriority(tclient, db) //Setting the desired file priority from storage
}
//CreateRunningTorrentArray creates the entire torrent list to pass to client
func CreateRunningTorrentArray(tclient *torrent.Client, TorrentLocalArray []*Storage.TorrentLocal, PreviousTorrentArray []ClientDB, config Settings.FullClientSettings, db *storm.DB) (RunningTorrentArray []ClientDB) {
for _, singleTorrentFromStorage := range TorrentLocalArray {
var singleTorrent *torrent.Torrent
var TempHash metainfo.Hash
for _, liveTorrent := range tclient.Torrents() { //matching the torrent from storage to the live torrent
if singleTorrentFromStorage.Hash == liveTorrent.InfoHash().String() {
singleTorrent = liveTorrent
}
}
tickUpdateStruct := Storage.TorrentLocal{} //we are shoving the tick updates into a torrentlocal struct to pass to storage happens at the end of the routine
fullClientDB := new(ClientDB)
//singleTorrentStorageInfo := Storage.FetchTorrentFromStorage(db, TempHash.String()) //pulling the single torrent info from storage ()
if singleTorrentFromStorage.TorrentStatus == "Dropped" {
Logger.WithFields(logrus.Fields{"selection": singleTorrentFromStorage.TorrentName}).Info("Deleting just the torrent")
singleTorrent.Drop()
Storage.DelTorrentLocalStorage(db, singleTorrentFromStorage.Hash)
}
if singleTorrentFromStorage.TorrentStatus == "DroppedData" {
Logger.WithFields(logrus.Fields{"selection": singleTorrentFromStorage.TorrentName}).Info("Deleting the torrent and the data")
singleTorrent.Drop()
Storage.DelTorrentLocalStorageAndFiles(db, singleTorrentFromStorage.Hash, Config.TorrentConfig.DataDir)
}
if singleTorrentFromStorage.TorrentType == "file" { //if it is a file pull it from the uploaded torrent folder
fullClientDB.SourceType = "Torrent File"
} else {
fullClientDB.SourceType = "Magnet Link"
}
calculatedTotalSize := CalculateDownloadSize(singleTorrentFromStorage, singleTorrent)
calculatedCompletedSize := CalculateCompletedSize(singleTorrentFromStorage, singleTorrent)
TempHash = singleTorrent.InfoHash()
if (calculatedCompletedSize == singleTorrentFromStorage.TorrentSize) && (singleTorrentFromStorage.TorrentMoved == false) { //if we are done downloading and haven't moved torrent yet
Logger.WithFields(logrus.Fields{"singleTorrent": singleTorrentFromStorage.TorrentName}).Info("Torrent Completed, moving...")
tStorage := Storage.FetchTorrentFromStorage(db, singleTorrent.InfoHash().String()) //Todo... find a better way to do this in the go-routine currently just to make sure it doesn't trigger multiple times
tStorage.TorrentMoved = true
Storage.UpdateStorageTick(db, tStorage)
go func() { //moving torrent in separate go-routine then verifying that the data is still there and correct
err := MoveAndLeaveSymlink(config, singleTorrent.InfoHash().String(), db, false, "") //can take some time to move file so running this in another thread TODO make this a goroutine and skip this block if the routine is still running
if err != nil { //If we fail, print the error and attempt a retry
Logger.WithFields(logrus.Fields{"singleTorrent": singleTorrentFromStorage.TorrentName, "error": err}).Error("Failed to move Torrent!")
VerifyData(singleTorrent)
tStorage.TorrentMoved = false
Storage.UpdateStorageTick(db, tStorage)
}
}()
if (singleTorrent.BytesCompleted() == singleTorrent.Length()) && (singleTorrentFromStorage.TorrentMoved == false) { //if we are done downloading and havent moved torrent yet
MoveAndLeaveSymlink(config, singleTorrent, db) //can take some time to move file so running this in another thread TODO make this a goroutine and skip this block if the routine is still running
}
fullStruct := singleTorrent.Stats()
activePeersString := strconv.Itoa(fullStruct.ActivePeers) //converting to strings
totalPeersString := fmt.Sprintf("%v", fullStruct.TotalPeers)
//fetching all the info from the database
fullClientDB.StoragePath = singleTorrentFromStorage.StoragePath
fullClientDB.StoragePath = singleTorrentFromStorage.StoragePath //grabbed from database
downloadedSizeHumanized := HumanizeBytes(float32(calculatedCompletedSize)) //convert size to GB if needed
totalSizeHumanized := HumanizeBytes(float32(calculatedTotalSize))
downloadedSizeHumanized := HumanizeBytes(float32(singleTorrent.BytesCompleted())) //convert size to GB if needed
totalSizeHumanized := HumanizeBytes(float32(singleTorrent.Length()))
//Logger.WithFields(logrus.Fields{"singleTorrent": singleTorrentFromStorage.TorrentName}).Info("Generated infohash")
//grabbed from torrent client
fullClientDB.DownloadedSize = downloadedSizeHumanized
fullClientDB.Size = totalSizeHumanized
PercentDone := fmt.Sprintf("%.2f", float32(singleTorrent.BytesCompleted())/float32(singleTorrent.Length()))
PercentDone := fmt.Sprintf("%.2f", float32(calculatedCompletedSize)/float32(calculatedTotalSize))
fullClientDB.TorrentHash = TempHash
fullClientDB.PercentDone = PercentDone
fullClientDB.DataBytesRead = fullStruct.ConnStats.DataBytesRead //used for calculations not passed to client calculating up/down speed
fullClientDB.DataBytesWritten = fullStruct.ConnStats.DataBytesWritten //used for calculations not passed to client calculating up/down speed
fullClientDB.DataBytesRead = fullStruct.ConnStats.BytesReadData //used for calculations not passed to client calculating up/down speed
fullClientDB.DataBytesWritten = fullStruct.ConnStats.BytesWrittenData //used for calculations not passed to client calculating up/down speed
fullClientDB.ActivePeers = activePeersString + " / (" + totalPeersString + ")"
fullClientDB.TorrentHashString = TempHash.String()
fullClientDB.StoragePath = singleTorrentFromStorage.StoragePath
fullClientDB.TorrentName = singleTorrentFromStorage.TorrentName
fullClientDB.DateAdded = singleTorrentFromStorage.DateAdded
fullClientDB.BytesCompleted = singleTorrent.BytesCompleted()
fullClientDB.TorrentLabel = singleTorrentFromStorage.Label
fullClientDB.BytesCompleted = calculatedCompletedSize
fullClientDB.NumberofFiles = len(singleTorrent.Files())
//ranging over the previous torrent array to calculate the speed for each torrent
if len(PreviousTorrentArray) > 0 { //if we actually have a previous array
if len(PreviousTorrentArray) > 0 { //if we actually have a previous array //ranging over the previous torrent array to calculate the speed for each torrent
for _, previousElement := range PreviousTorrentArray {
TempHash := singleTorrent.InfoHash()
if previousElement.TorrentHashString == TempHash.String() { //matching previous to new
CalculateTorrentSpeed(singleTorrent, fullClientDB, previousElement)
fullClientDB.TotalUploadedBytes = singleTorrentFromStorage.UploadedBytes + (fullStruct.ConnStats.DataBytesWritten - previousElement.DataBytesWritten)
CalculateTorrentSpeed(singleTorrent, fullClientDB, previousElement, calculatedCompletedSize)
fullClientDB.TotalUploadedBytes = singleTorrentFromStorage.UploadedBytes + (fullStruct.ConnStats.BytesWrittenData - previousElement.DataBytesWritten)
}
}
}
CalculateTorrentETA(singleTorrent, fullClientDB) //needs to be here since we need the speed calcuated before we can estimate the eta.
CalculateTorrentETA(singleTorrentFromStorage.TorrentSize, calculatedCompletedSize, fullClientDB) //needs to be here since we need the speed calculated before we can estimate the eta.
fullClientDB.TotalUploadedSize = HumanizeBytes(float32(fullClientDB.TotalUploadedBytes))
fullClientDB.UploadRatio = CalculateUploadRatio(singleTorrent, fullClientDB) //calculate the upload ratio
CalculateTorrentStatus(singleTorrent, fullClientDB, config, singleTorrentFromStorage)
CalculateTorrentStatus(singleTorrent, fullClientDB, config, singleTorrentFromStorage, calculatedCompletedSize, calculatedTotalSize)
tickUpdateStruct.UploadRatio = fullClientDB.UploadRatio
tickUpdateStruct.TorrentSize = calculatedTotalSize
tickUpdateStruct.UploadedBytes = fullClientDB.TotalUploadedBytes
tickUpdateStruct.TorrentStatus = fullClientDB.Status
tickUpdateStruct.Hash = fullClientDB.TorrentHashString //needed for index
Storage.UpdateStorageTick(db, tickUpdateStruct)
RunningTorrentArray = append(RunningTorrentArray, *fullClientDB)
@@ -247,8 +337,9 @@ func CreateRunningTorrentArray(tclient *torrent.Client, TorrentLocalArray []*Sto
}
//CreateFileListArray creates a file list for a single torrent that is selected and sent to the server
func CreateFileListArray(tclient *torrent.Client, selectedHash string) TorrentFileList {
func CreateFileListArray(tclient *torrent.Client, selectedHash string, db *storm.DB, config Settings.FullClientSettings) TorrentFileList {
runningTorrents := tclient.Torrents() //don't need running torrent array since we aren't adding or deleting from storage
torrentFileListStorage := Storage.FetchTorrentFromStorage(db, selectedHash)
TorrentFileListSelected := TorrentFileList{}
TorrentFileStruct := TorrentFile{}
for _, singleTorrent := range runningTorrents {
@@ -268,7 +359,12 @@ func CreateFileListArray(tclient *torrent.Client, selectedHash string) TorrentFi
}
}
TorrentFileStruct.FilePercent = fmt.Sprintf("%.2f", float32(downloadedBytes)/float32(singleFile.Length()))
TorrentFileStruct.FilePriority = "Normal" //TODO, figure out how to store this per file in storage and also tie a priority to a file
for i, specificFile := range torrentFileListStorage.TorrentFilePriority { //searching for that specific file in storage
if specificFile.TorrentFilePath == singleFile.DisplayPath() {
TorrentFileStruct.FilePriority = torrentFileListStorage.TorrentFilePriority[i].TorrentFilePriority
}
}
TorrentFileStruct.FileSize = HumanizeBytes(float32(singleFile.Length()))
TorrentFileListSelected.FileList = append(TorrentFileListSelected.FileList, TorrentFileStruct)
}

engine/engineHelpers.go Normal file
View File

@@ -0,0 +1,211 @@
package engine
import (
"fmt"
"io"
"os"
"time"
"github.com/anacrolix/torrent"
"github.com/asdine/storm"
Settings "github.com/deranjer/goTorrent/settings"
"github.com/deranjer/goTorrent/storage"
Storage "github.com/deranjer/goTorrent/storage"
"github.com/sirupsen/logrus"
)
func secondsToMinutes(inSeconds int64) string {
minutes := inSeconds / 60
seconds := inSeconds % 60
minutesString := fmt.Sprintf("%d", minutes)
secondsString := fmt.Sprintf("%d", seconds)
str := minutesString + " Min/ " + secondsString + " Sec"
return str
}
//VerifyData just verifies the data of a torrent by hash
func VerifyData(singleTorrent *torrent.Torrent) {
singleTorrent.VerifyData()
}
//MakeRange creates a range of pieces to set their priority based on a file
func MakeRange(min, max int) []int {
a := make([]int, max-min+1)
for i := range a {
a[i] = min + i
}
return a
}
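`MakeRange` produces the inclusive `[min, max]` slice of piece indices used when mapping a file's first and last piece to a priority range. A quick standalone illustration (lowercase copy for the example only):

```go
package main

import "fmt"

// makeRange mirrors MakeRange above: an inclusive [min, max] slice
// of ints, e.g. for turning a file's piece span into a piece list.
func makeRange(min, max int) []int {
	a := make([]int, max-min+1)
	for i := range a {
		a[i] = min + i
	}
	return a
}

func main() {
	fmt.Println(makeRange(3, 6)) // [3 4 5 6]
}
```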
//HumanizeBytes returns a nice humanized version of bytes in either GB or MB
func HumanizeBytes(bytes float32) string {
if bytes < 1000000 { //if we have less than 1MB in bytes convert to KB
pBytes := fmt.Sprintf("%.2f", bytes/1024)
pBytes = pBytes + " KB"
return pBytes
}
bytes = bytes / 1024 / 1024 //Converting bytes to a useful measure
if bytes > 1024 {
pBytes := fmt.Sprintf("%.2f", bytes/1024)
pBytes = pBytes + " GB"
return pBytes
}
pBytes := fmt.Sprintf("%.2f", bytes) //If not too big or too small leave it as MB
pBytes = pBytes + " MB"
return pBytes
}
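The thresholds in `HumanizeBytes` are worth noting: anything under 1,000,000 bytes is shown in KB, anything over 1 GiB (after conversion to MB) in GB, and everything between in MB, with all divisions by 1024. A standalone copy showing the three ranges:

```go
package main

import "fmt"

// humanizeBytes mirrors HumanizeBytes above: < 1,000,000 bytes → KB,
// > 1024 MB → GB, otherwise MB (all divisions by 1024).
func humanizeBytes(bytes float32) string {
	if bytes < 1000000 {
		return fmt.Sprintf("%.2f KB", bytes/1024)
	}
	bytes = bytes / 1024 / 1024
	if bytes > 1024 {
		return fmt.Sprintf("%.2f GB", bytes/1024)
	}
	return fmt.Sprintf("%.2f MB", bytes)
}

func main() {
	fmt.Println(humanizeBytes(512 * 1024))             // 512.00 KB
	fmt.Println(humanizeBytes(50 * 1024 * 1024))       // 50.00 MB
	fmt.Println(humanizeBytes(3 * 1024 * 1024 * 1024)) // 3.00 GB
}
```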
//CopyFile takes a source file string and a destination file string and copies the file
func CopyFile(srcFile string, destFile string) { //TODO move this to our imported copy repo
fileContents, err := os.Open(srcFile)
defer fileContents.Close()
if err != nil {
Logger.WithFields(logrus.Fields{"File": srcFile, "Error": err}).Error("Cannot open source file")
}
outfileContents, err := os.Create(destFile)
defer outfileContents.Close()
if err != nil {
Logger.WithFields(logrus.Fields{"File": destFile, "Error": err}).Error("Cannot open destination file")
}
_, err = io.Copy(outfileContents, fileContents)
if err != nil {
Logger.WithFields(logrus.Fields{"Source File": srcFile, "Destination File": destFile, "Error": err}).Error("Cannot write contents to destination file")
}
}
//SetFilePriority sets the priorities for all of the files in all of the torrents
func SetFilePriority(t *torrent.Client, db *storm.DB) {
storedTorrents := Storage.FetchAllStoredTorrents(db)
for _, singleTorrent := range t.Torrents() {
for _, storedTorrent := range storedTorrents {
if storedTorrent.Hash == singleTorrent.InfoHash().String() {
for _, file := range singleTorrent.Files() {
for _, storedFile := range storedTorrent.TorrentFilePriority {
if storedFile.TorrentFilePath == file.DisplayPath() {
switch storedFile.TorrentFilePriority {
case "High":
file.SetPriority(torrent.PiecePriorityHigh)
case "Normal":
file.SetPriority(torrent.PiecePriorityNormal)
case "Cancel":
file.SetPriority(torrent.PiecePriorityNone)
default:
file.SetPriority(torrent.PiecePriorityNormal)
}
}
}
}
}
}
}
}
//CalculateTorrentSpeed calculates the torrent upload and download speed over time; c is the current ClientDB, oc is the previous ClientDB used to compute the deltas
func CalculateTorrentSpeed(t *torrent.Torrent, c *ClientDB, oc ClientDB, completedSize int64) {
now := time.Now()
bytes := completedSize
bytesUpload := t.Stats().BytesWrittenData
dt := float32(now.Sub(oc.UpdatedAt)) // get the delta time length between now and last updated
db := float32(bytes - oc.BytesCompleted) //getting the delta bytes
rate := db * (float32(time.Second) / dt) // converting into seconds
dbU := float32(bytesUpload - oc.DataBytesWritten)
rateUpload := dbU * (float32(time.Second) / dt)
if rate >= 0 {
rateMB := rate / 1024 / 1024 //creating MB to calculate ETA
c.DownloadSpeed = fmt.Sprintf("%.2f", rateMB)
c.DownloadSpeed = c.DownloadSpeed + " MB/s"
c.downloadSpeedInt = int64(rate)
}
if rateUpload >= 0 {
rateUpload = rateUpload / 1024 / 1024
c.UploadSpeed = fmt.Sprintf("%.2f", rateUpload)
c.UploadSpeed = c.UploadSpeed + " MB/s"
}
c.UpdatedAt = now
}
//CalculateDownloadSize will calculate the download size once file priorities are sorted out
func CalculateDownloadSize(tFromStorage *Storage.TorrentLocal, activeTorrent *torrent.Torrent) int64 {
var totalLength int64
for _, file := range tFromStorage.TorrentFilePriority {
if file.TorrentFilePriority != "Cancel" {
totalLength = totalLength + file.TorrentFileSize
}
}
return totalLength
}
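In other words, the expected download size is the sum of the sizes of every file whose priority is not "Cancel". A self-contained sketch, with a hypothetical `filePriority` struct standing in for the stored priority records:

```go
package main

import "fmt"

// filePriority is a stand-in for the stored per-file priority record.
type filePriority struct {
	Priority string
	Size     int64
}

// downloadSize sums the sizes of all files the user has not canceled,
// mirroring CalculateDownloadSize above.
func downloadSize(files []filePriority) int64 {
	var total int64
	for _, f := range files {
		if f.Priority != "Cancel" {
			total += f.Size
		}
	}
	return total
}

func main() {
	files := []filePriority{
		{"High", 700}, {"Cancel", 300}, {"Normal", 1000},
	}
	fmt.Println(downloadSize(files)) // the canceled 300 bytes are excluded
}
```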
//CalculateCompletedSize calculates how much of the torrent is actually complete, excluding bytes belonging to canceled files (even if they have been partially downloaded)
func CalculateCompletedSize(tFromStorage *Storage.TorrentLocal, activeTorrent *torrent.Torrent) int64 {
var discardByteLength int64
for _, storageFile := range tFromStorage.TorrentFilePriority {
if storageFile.TorrentFilePriority == "Cancel" { //If the file is canceled don't count it as downloaded
for _, activeFile := range activeTorrent.Files() {
if activeFile.DisplayPath() == storageFile.TorrentFilePath { //match the file from storage to active
for _, piece := range activeFile.State() {
if piece.Partial || piece.Complete {
discardByteLength = discardByteLength + piece.Bytes
}
}
}
}
}
}
downloadedLength := activeTorrent.BytesCompleted() - discardByteLength
if downloadedLength < 0 {
downloadedLength = 0
}
return downloadedLength
}
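Stripped of the piece-state walk, the adjustment above reduces to: subtract whatever bytes the canceled files accumulated from the raw completed counter, clamping at zero. A reduced sketch (hypothetical `completedSize`):

```go
package main

import "fmt"

// completedSize subtracts bytes accumulated by canceled files from the
// raw completed counter, clamping at zero -- the same adjustment
// CalculateCompletedSize performs with real piece state.
func completedSize(bytesCompleted, canceledPieceBytes int64) int64 {
	got := bytesCompleted - canceledPieceBytes
	if got < 0 {
		got = 0
	}
	return got
}

func main() {
	fmt.Println(completedSize(5000, 1200)) // canceled bytes discounted
	fmt.Println(completedSize(100, 400))   // clamped to zero
}
```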
//CalculateTorrentETA estimates the remaining download time of the torrent from the current download speed
func CalculateTorrentETA(tSize int64, tBytesCompleted int64, c *ClientDB) {
missingBytes := tSize - tBytesCompleted
if missingBytes == 0 {
c.ETA = "Done"
} else if c.downloadSpeedInt == 0 {
c.ETA = "N/A"
} else {
ETASeconds := missingBytes / c.downloadSpeedInt
str := secondsToMinutes(ETASeconds) //converting seconds to minutes + seconds
c.ETA = str
}
}
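Combining the function above with the `secondsToMinutes` formatter, the whole ETA pipeline is remaining bytes divided by bytes/second, rendered as "M Min/ S Sec". A condensed sketch (hypothetical `etaString`):

```go
package main

import "fmt"

// etaString mirrors CalculateTorrentETA plus secondsToMinutes:
// remaining bytes over bytes/sec, rendered as "M Min/ S Sec".
func etaString(totalSize, completed, bytesPerSec int64) string {
	missing := totalSize - completed
	if missing == 0 {
		return "Done"
	}
	if bytesPerSec == 0 {
		return "N/A" // no speed sample yet, can't estimate
	}
	etaSeconds := missing / bytesPerSec
	return fmt.Sprintf("%d Min/ %d Sec", etaSeconds/60, etaSeconds%60)
}

func main() {
	fmt.Println(etaString(1000, 1000, 0)) // Done
	fmt.Println(etaString(1000, 500, 0))  // N/A
	fmt.Println(etaString(12500, 0, 100)) // 125s remaining
}
```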
//CalculateUploadRatio calculates the upload-to-download ratio so you can see if you are being a good seeder
func CalculateUploadRatio(t *torrent.Torrent, c *ClientDB) string {
if c.TotalUploadedBytes > 0 && t.BytesCompleted() > 0 { //If we have actually started uploading and downloading stuff start calculating our ratio
uploadRatio := fmt.Sprintf("%.2f", float64(c.TotalUploadedBytes)/float64(t.BytesCompleted()))
return uploadRatio
}
uploadRatio := "0.00" //we haven't uploaded anything so no upload ratio just pass a string directly
return uploadRatio
}
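The ratio logic reduces to a pure function of the two byte counters: "0.00" until both uploading and downloading have actually started, otherwise uploaded/downloaded to two decimal places. A sketch (hypothetical `uploadRatio`):

```go
package main

import "fmt"

// uploadRatio mirrors CalculateUploadRatio: uploaded/downloaded to two
// decimal places, "0.00" until both counters are non-zero.
func uploadRatio(uploaded, downloaded int64) string {
	if uploaded > 0 && downloaded > 0 {
		return fmt.Sprintf("%.2f", float64(uploaded)/float64(downloaded))
	}
	return "0.00"
}

func main() {
	fmt.Println(uploadRatio(0, 1000))    // nothing uploaded yet
	fmt.Println(uploadRatio(1500, 1000)) // a good seeder
}
```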
//CalculateTorrentStatus determines what the STATUS column of the frontend will display
func CalculateTorrentStatus(t *torrent.Torrent, c *ClientDB, config Settings.FullClientSettings, tFromStorage *storage.TorrentLocal, bytesCompleted int64, totalSize int64) {
	if (tFromStorage.TorrentStatus == "Stopped") || (float64(c.TotalUploadedBytes)/float64(bytesCompleted) >= config.SeedRatioStop && tFromStorage.TorrentUploadLimit == true) { //If storage shows the torrent stopped, or it has reached the seed ratio and this torrent enforces the ratio limit
c.Status = "Stopped"
c.MaxConnections = 0
t.SetMaxEstablishedConns(0)
} else { //Only has 2 states in storage, stopped or running, so we know it should be running, and the websocket request handled updating the database with connections and status
bytesMissing := totalSize - bytesCompleted
c.MaxConnections = 80
t.SetMaxEstablishedConns(80)
//t.DownloadAll() //ensure that we are setting the torrent to download
if t.Seeding() && t.Stats().ActivePeers > 0 && bytesMissing == 0 {
c.Status = "Seeding"
} else if t.Stats().ActivePeers > 0 && bytesMissing > 0 {
c.Status = "Downloading"
} else if t.Stats().ActivePeers == 0 && bytesMissing == 0 {
c.Status = "Completed"
} else if t.Stats().ActivePeers == 0 && bytesMissing > 0 {
c.Status = "Awaiting Peers"
} else {
c.Status = "Unknown"
}
}
}
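The running-state branches above can be condensed into a pure function over the three values they actually inspect: whether the client reports seeding, the active peer count, and the bytes still missing. A sketch (hypothetical `torrentStatus`) preserving the original branch order, including the fall-through to "Unknown":

```go
package main

import "fmt"

// torrentStatus condenses the branch logic of CalculateTorrentStatus
// (for a running torrent) into a pure function of its three inputs.
func torrentStatus(seeding bool, activePeers int, bytesMissing int64) string {
	switch {
	case seeding && activePeers > 0 && bytesMissing == 0:
		return "Seeding"
	case activePeers > 0 && bytesMissing > 0:
		return "Downloading"
	case activePeers == 0 && bytesMissing == 0:
		return "Completed"
	case activePeers == 0 && bytesMissing > 0:
		return "Awaiting Peers"
	default:
		return "Unknown" // e.g. peers connected, complete, but not seeding
	}
}

func main() {
	fmt.Println(torrentStatus(true, 4, 0))   // Seeding
	fmt.Println(torrentStatus(false, 4, 10)) // Downloading
	fmt.Println(torrentStatus(false, 0, 0))  // Completed
}
```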

View File

@@ -1,110 +0,0 @@
package engine
import (
"fmt"
"time"
"github.com/anacrolix/torrent"
"github.com/deranjer/goTorrent/storage"
)
func secondsToMinutes(inSeconds int64) string {
minutes := inSeconds / 60
seconds := inSeconds % 60
minutesString := fmt.Sprintf("%d", minutes)
secondsString := fmt.Sprintf("%d", seconds)
str := minutesString + " Min/ " + secondsString + " Sec"
return str
}
//HumanizeBytes returns a nice humanized version of bytes in either GB or MB
func HumanizeBytes(bytes float32) string {
if bytes < 1000000 { //if we have less than 1MB in bytes convert to KB
pBytes := fmt.Sprintf("%.2f", bytes/1024)
pBytes = pBytes + " KB"
return pBytes
}
bytes = bytes / 1024 / 1024 //Converting bytes to a useful measure
if bytes > 1024 {
pBytes := fmt.Sprintf("%.2f", bytes/1024)
pBytes = pBytes + " GB"
return pBytes
}
pBytes := fmt.Sprintf("%.2f", bytes) //If not too big or too small leave it as MB
pBytes = pBytes + " MB"
return pBytes
}
//CalculateTorrentSpeed is used to calculate the torrent upload and download speed over time c is current clientdb, oc is last client db to calculate speed over time
func CalculateTorrentSpeed(t *torrent.Torrent, c *ClientDB, oc ClientDB) {
now := time.Now()
bytes := t.BytesCompleted()
bytesUpload := t.Stats().DataBytesWritten
dt := float32(now.Sub(oc.UpdatedAt)) // get the delta time length between now and last updated
db := float32(bytes - oc.BytesCompleted) //getting the delta bytes
rate := db * (float32(time.Second) / dt) // converting into seconds
dbU := float32(bytesUpload - oc.DataBytesWritten)
rateUpload := dbU * (float32(time.Second) / dt)
if rate >= 0 {
rate = rate / 1024 / 1024 //creating integer to calculate ETA
c.DownloadSpeed = fmt.Sprintf("%.2f", rate)
c.DownloadSpeed = c.DownloadSpeed + " MB/s"
c.downloadSpeedInt = int64(rate)
}
if rateUpload >= 0 {
rateUpload = rateUpload / 1024 / 1024
c.UploadSpeed = fmt.Sprintf("%.2f", rateUpload)
c.UploadSpeed = c.UploadSpeed + " MB/s"
}
c.UpdatedAt = now
}
//CalculateTorrentETA is used to estimate the remaining dl time of the torrent based on the speed that the MB are being downloaded
func CalculateTorrentETA(t *torrent.Torrent, c *ClientDB) {
missingBytes := t.Length() - t.BytesCompleted()
missingMB := missingBytes / 1024 / 1024
if missingMB == 0 {
c.ETA = "Done"
} else if c.downloadSpeedInt == 0 {
c.ETA = "N/A"
} else {
ETASeconds := missingMB / c.downloadSpeedInt
str := secondsToMinutes(ETASeconds) //converting seconds to minutes + seconds
c.ETA = str
}
}
//CalculateUploadRatio calculates the download to upload ratio so you can see if you are being a good seeder
func CalculateUploadRatio(t *torrent.Torrent, c *ClientDB) string {
if c.TotalUploadedBytes > 0 && t.BytesCompleted() > 0 { //If we have actually started uploading and downloading stuff start calculating our ratio
uploadRatio := fmt.Sprintf("%.2f", float64(c.TotalUploadedBytes)/float64(t.BytesCompleted()))
return uploadRatio
}
uploadRatio := "0.00" //we haven't uploaded anything so no upload ratio just pass a string directly
return uploadRatio
}
//CalculateTorrentStatus is used to determine what the STATUS column of the frontend will display ll2
func CalculateTorrentStatus(t *torrent.Torrent, c *ClientDB, config FullClientSettings, tFromStorage *storage.TorrentLocal) { //TODO redo all of this to allow for stopped torrents
if (tFromStorage.TorrentStatus == "Stopped") || (float64(c.TotalUploadedBytes)/float64(t.BytesCompleted()) >= config.SeedRatioStop) {
c.Status = "Stopped"
c.MaxConnections = 0
t.SetMaxEstablishedConns(0)
} else { //Only has 2 states in storage, stopped or running, so we know it should be running, and the websocket request handled updating the database with connections and status
c.MaxConnections = 80
t.SetMaxEstablishedConns(80) //TODO this should not be needed but apparently is needed
t.DownloadAll() //ensure that we are setting the torrent to download
if t.Seeding() && t.Stats().ActivePeers > 0 && t.BytesMissing() == 0 {
c.Status = "Seeding"
} else if t.Stats().ActivePeers > 0 && t.BytesMissing() > 0 {
c.Status = "Downloading"
} else if t.Stats().ActivePeers == 0 && t.BytesMissing() == 0 {
c.Status = "Completed"
} else if t.Stats().ActivePeers == 0 && t.BytesMissing() > 0 {
c.Status = "Awaiting Peers"
} else {
c.Status = "Unknown"
}
}
}

View File

@@ -1,164 +0,0 @@
package engine
import (
"fmt"
"path/filepath"
"github.com/anacrolix/dht"
"github.com/anacrolix/torrent"
"github.com/sirupsen/logrus"
"github.com/spf13/viper"
"golang.org/x/time/rate"
)
//FullClientSettings contains all of the settings for our entire application
type FullClientSettings struct {
LoggingLevel logrus.Level
LoggingOutput string
HTTPAddr string
Version int
TorrentConfig torrent.Config
TFileUploadFolder string
SeedRatioStop float64
PushBulletToken string
DefaultMoveFolder string
}
//default is called if there is a parsing error
func defaultConfig() FullClientSettings {
var Config FullClientSettings
Config.Version = 1.0
Config.LoggingLevel = 3 //Warn level
Config.TorrentConfig.DataDir = "downloads" //the absolute or relative path of the default download directory for torrents
Config.TFileUploadFolder = "uploadedTorrents"
Config.TorrentConfig.Seed = true
Config.HTTPAddr = ":8000"
Config.SeedRatioStop = 1.50
Config.TorrentConfig.DHTConfig = dht.ServerConfig{
StartingNodes: dht.GlobalBootstrapAddrs,
}
return Config
}
func dhtServerSettings(dhtConfig dht.ServerConfig) dht.ServerConfig {
viper.UnmarshalKey("DHTConfig", &dhtConfig)
Logger.WithFields(logrus.Fields{"dhtConfig": dhtConfig}).Info("Displaying DHT Config")
return dhtConfig
}
//FullClientSettingsNew creates a new set of setting from config.toml
func FullClientSettingsNew() FullClientSettings {
viper.SetConfigName("config")
viper.AddConfigPath("./")
err := viper.ReadInConfig()
if err != nil {
fmt.Println("Error reading in config, using defaults", err)
FullClientSettings := defaultConfig()
return FullClientSettings
}
var httpAddr string
httpAddrIP := viper.GetString("serverConfig.ServerAddr")
httpAddrPort := viper.GetString("serverConfig.ServerPort")
seedRatioStop := viper.GetFloat64("serverConfig.SeedRatioStop")
httpAddr = httpAddrIP + httpAddrPort
pushBulletToken := viper.GetString("notifications.PushBulletToken")
defaultMoveFolder := filepath.ToSlash(viper.GetString("serverConfig.DefaultMoveFolder")) //Converting the string literal into a filepath
defaultMoveFolderAbs, err := filepath.Abs(defaultMoveFolder)
if err != nil {
fmt.Println("Failed creating absolute path for defaultMoveFolder", err)
}
dataDir := filepath.ToSlash(viper.GetString("torrentClientConfig.DownloadDir")) //Converting the string literal into a filepath
dataDirAbs, err := filepath.Abs(dataDir) //Converting to an absolute file path
if err != nil {
fmt.Println("Failed creating absolute path for dataDir", err)
}
listenAddr := viper.GetString("torrentClientConfig.ListenAddr")
disablePex := viper.GetBool("torrentClientConfig.DisablePEX")
noDHT := viper.GetBool("torrentClientConfig.NoDHT")
noUpload := viper.GetBool("torrentClientConfig.NoUpload")
seed := viper.GetBool("torrentClientConfig.Seed")
peerID := viper.GetString("torrentClientConfig.PeerID")
disableUTP := viper.GetBool("torrentClientConfig.DisableUTP")
disableTCP := viper.GetBool("torrentClientConfig.DisableTCP")
disableIPv6 := viper.GetBool("torrentClientConfig.DisableIPv6")
debug := viper.GetBool("torrentClientConfig.Debug")
logLevelString := viper.GetString("serverConfig.LogLevel")
logOutput := viper.GetString("serverConfig.LogOutput")
var logLevel logrus.Level
switch logLevelString { //Options = Debug 5, Info 4, Warn 3, Error 2, Fatal 1, Panic 0
case "Panic":
logLevel = 0
case "Fatal":
logLevel = 1
case "Error":
logLevel = 2
case "Warn":
logLevel = 3
case "Info":
logLevel = 4
case "Debug":
logLevel = 5
default:
logLevel = 3
}
dhtServerConfig := dht.ServerConfig{
StartingNodes: dht.GlobalBootstrapAddrs,
}
if viper.IsSet("DHTConfig") {
fmt.Println("Reading in custom DHT config")
dhtServerConfig = dhtServerSettings(dhtServerConfig)
}
uploadRateLimiter := new(rate.Limiter)
viper.UnmarshalKey("UploadRateLimiter", &uploadRateLimiter)
downloadRateLimiter := new(rate.Limiter)
viper.UnmarshalKey("DownloadRateLimiter", &downloadRateLimiter)
encryptionPolicy := torrent.EncryptionPolicy{
DisableEncryption: viper.GetBool("EncryptionPolicy.DisableEncryption"),
ForceEncryption: viper.GetBool("EncryptionPolicy.ForceEncryption"),
PreferNoEncryption: viper.GetBool("EncryptionPolicy.PreferNoEncryption"),
}
tConfig := torrent.Config{
DataDir: dataDirAbs,
ListenAddr: listenAddr,
DisablePEX: disablePex,
NoDHT: noDHT,
DHTConfig: dhtServerConfig,
NoUpload: noUpload,
Seed: seed,
//UploadRateLimiter: uploadRateLimiter,
//DownloadRateLimiter: downloadRateLimiter,
PeerID: peerID,
DisableUTP: disableUTP,
DisableTCP: disableTCP,
DisableIPv6: disableIPv6,
Debug: debug,
EncryptionPolicy: encryptionPolicy,
}
Config := FullClientSettings{
LoggingLevel: logLevel,
LoggingOutput: logOutput,
SeedRatioStop: seedRatioStop,
HTTPAddr: httpAddr,
TorrentConfig: tConfig,
TFileUploadFolder: "uploadedTorrents",
PushBulletToken: pushBulletToken,
DefaultMoveFolder: defaultMoveFolderAbs,
}
return Config
}

15
goTorrentWebUI/acorn Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/acorn/bin/acorn" "$@"
ret=$?
else
node "$basedir/node_modules/acorn/bin/acorn" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/acorn.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\acorn\bin\acorn" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\acorn\bin\acorn" %*
)

15
goTorrentWebUI/ansi-html Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/ansi-html/bin/ansi-html" "$@"
ret=$?
else
node "$basedir/node_modules/ansi-html/bin/ansi-html" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\ansi-html\bin\ansi-html" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\ansi-html\bin\ansi-html" %*
)

15
goTorrentWebUI/atob Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/atob/bin/atob.js" "$@"
ret=$?
else
node "$basedir/node_modules/atob/bin/atob.js" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/atob.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\atob\bin\atob.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\atob\bin\atob.js" %*
)

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/browserslist/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/browserslist/cli.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\browserslist\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\browserslist\cli.js" %*
)

15
goTorrentWebUI/cssesc Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/cssesc/bin/cssesc" "$@"
ret=$?
else
node "$basedir/node_modules/cssesc/bin/cssesc" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\cssesc\bin\cssesc" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\cssesc\bin\cssesc" %*
)

15
goTorrentWebUI/csso Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/csso/bin/csso" "$@"
ret=$?
else
node "$basedir/node_modules/csso/bin/csso" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/csso.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\csso\bin\csso" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\csso\bin\csso" %*
)

15
goTorrentWebUI/detect Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/detect-port-alt/bin/detect-port" "$@"
ret=$?
else
node "$basedir/node_modules/detect-port-alt/bin/detect-port" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/detect-port-alt/bin/detect-port" "$@"
ret=$?
else
node "$basedir/node_modules/detect-port-alt/bin/detect-port" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\detect-port-alt\bin\detect-port" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\detect-port-alt\bin\detect-port" %*
)

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\detect-port-alt\bin\detect-port" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\detect-port-alt\bin\detect-port" %*
)

15
goTorrentWebUI/errno Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/errno/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/errno/cli.js" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/errno.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\errno\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\errno\cli.js" %*
)

15
goTorrentWebUI/escodegen Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/escodegen/bin/escodegen.js" "$@"
ret=$?
else
node "$basedir/node_modules/escodegen/bin/escodegen.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\escodegen\bin\escodegen.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\escodegen\bin\escodegen.js" %*
)

15
goTorrentWebUI/esgenerate Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/escodegen/bin/esgenerate.js" "$@"
ret=$?
else
node "$basedir/node_modules/escodegen/bin/esgenerate.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\escodegen\bin\esgenerate.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\escodegen\bin\esgenerate.js" %*
)

15
goTorrentWebUI/eslint Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/eslint/bin/eslint.js" "$@"
ret=$?
else
node "$basedir/node_modules/eslint/bin/eslint.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\eslint\bin\eslint.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\eslint\bin\eslint.js" %*
)

15
goTorrentWebUI/esparse Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/esprima/bin/esparse.js" "$@"
ret=$?
else
node "$basedir/node_modules/esprima/bin/esparse.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\esprima\bin\esparse.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\esprima\bin\esparse.js" %*
)

15
goTorrentWebUI/esvalidate Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/esprima/bin/esvalidate.js" "$@"
ret=$?
else
node "$basedir/node_modules/esprima/bin/esvalidate.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\esprima\bin\esvalidate.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\esprima\bin\esvalidate.js" %*
)

15
goTorrentWebUI/handlebars Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/handlebars/bin/handlebars" "$@"
ret=$?
else
node "$basedir/node_modules/handlebars/bin/handlebars" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\handlebars\bin\handlebars" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\handlebars\bin\handlebars" %*
)

15
goTorrentWebUI/he Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/he/bin/he" "$@"
ret=$?
else
node "$basedir/node_modules/he/bin/he" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/he.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\he\bin\he" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\he\bin\he" %*
)

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/html-minifier/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/html-minifier/cli.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\html-minifier\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\html-minifier\cli.js" %*
)

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/import-local/fixtures/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/import-local/fixtures/cli.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\import-local\fixtures\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\import-local\fixtures\cli.js" %*
)

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/internal-ip/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/internal-ip/cli.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\internal-ip\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\internal-ip\cli.js" %*
)

15
goTorrentWebUI/is-ci Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/is-ci/bin.js" "$@"
ret=$?
else
node "$basedir/node_modules/is-ci/bin.js" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/is-ci.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\is-ci\bin.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\is-ci\bin.js" %*
)

15
goTorrentWebUI/jest Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/jest/bin/jest.js" "$@"
ret=$?
else
node "$basedir/node_modules/jest/bin/jest.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/jest-runtime/bin/jest-runtime.js" "$@"
ret=$?
else
node "$basedir/node_modules/jest-runtime/bin/jest-runtime.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\jest-runtime\bin\jest-runtime.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\jest-runtime\bin\jest-runtime.js" %*
)

7
goTorrentWebUI/jest.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\jest\bin\jest.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\jest\bin\jest.js" %*
)

15
goTorrentWebUI/js-yaml Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/js-yaml/bin/js-yaml.js" "$@"
ret=$?
else
node "$basedir/node_modules/js-yaml/bin/js-yaml.js" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\js-yaml\bin\js-yaml.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\js-yaml\bin\js-yaml.js" %*
)

View File

@@ -52,7 +52,7 @@ var RSSList = [];
var RSSTorrentList = [];
var torrentListRequest = {
messageType: "torrentListRequest"
MessageType: "torrentListRequest"
//websocket is started in kickwebsocket.js and is picked up here so "ws" is already defined 22
};ws.onmessage = function (evt) {
@@ -128,7 +128,7 @@ var torrentListRequest = {
console.log("Logger data requested");
break;
case "rssListRequest":
case "rssList":
console.log("RSSListRequest received", evt.data);
RSSList = [];
for (var i = 0; i < serverMessage.TotalRSSFeeds; i++) {
@@ -191,7 +191,7 @@ var BackendSocket = function (_React$Component) {
case 1:
var peerListHashes = {
MessageType: "torrentPeerListRequest",
Payload: selectionHashes
Payload: {"PeerListHash": selectionHashes}
};
console.log("Peers tab information requested", peerListHashes);
ws.send(JSON.stringify(peerListHashes));
@@ -199,7 +199,7 @@ var BackendSocket = function (_React$Component) {
case 2:
var fileListHashes = {
MessageType: "torrentFileListRequest",
Payload: selectionHashes
Payload: {"FileListHash": selectionHashes[0]}
};
console.log("Files tab information requested", fileListHashes);
ws.send(JSON.stringify(fileListHashes));
@@ -256,7 +256,7 @@ var BackendSocket = function (_React$Component) {
case 1:
var peerListHashes = {
MessageType: "torrentPeerListRequest",
Payload: this.props.selectionHashes
Payload: {"PeerListHash": this.props.selectionHashes}
};
ws.send(JSON.stringify(peerListHashes));
this.props.newPeerList(peerList);
@@ -264,7 +264,7 @@ var BackendSocket = function (_React$Component) {
case 2:
var fileListHashes = {
MessageType: "torrentFileListRequest",
Payload: this.props.selectionHashes
Payload: {"FileListHash": this.props.selectionHashes[0]}
};
ws.send(JSON.stringify(fileListHashes));
this.props.newFileList(fileList);

View File

@@ -116,9 +116,9 @@ var addTorrentFilePopup = function (_React$Component) {
console.log("Base64", base64data);
var torrentFileMessage = {
messageType: "torrentFileSubmit",
messageDetail: _this.state.torrentFileName,
messageDetailTwo: _this.state.storageValue,
MessageType: "torrentFileSubmit",
MessageDetail: this.state.torrentFileName,
MessageDetailTwo: this.state.storageValue,
Payload: [base64data]
};
console.log("Sending magnet link: ", torrentFileMessage);

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/miller-rabin/bin/miller-rabin" "$@"
ret=$?
else
node "$basedir/node_modules/miller-rabin/bin/miller-rabin" "$@"
ret=$?
fi
exit $ret

View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\miller-rabin\bin\miller-rabin" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\miller-rabin\bin\miller-rabin" %*
)

15
goTorrentWebUI/mime Normal file
View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/mime/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/mime/cli.js" "$@"
ret=$?
fi
exit $ret

7
goTorrentWebUI/mime.cmd Normal file
View File

@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\mime\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\mime\cli.js" %*
)

View File

@@ -0,0 +1,15 @@
#!/bin/sh
basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')")
case `uname` in
*CYGWIN*) basedir=`cygpath -w "$basedir"`;;
esac
if [ -x "$basedir/node" ]; then
"$basedir/node" "$basedir/node_modules/multicast-dns/cli.js" "$@"
ret=$?
else
node "$basedir/node_modules/multicast-dns/cli.js" "$@"
ret=$?
fi
exit $ret


@@ -0,0 +1,7 @@
@IF EXIST "%~dp0\node.exe" (
"%~dp0\node.exe" "%~dp0\node_modules\multicast-dns\cli.js" %*
) ELSE (
@SETLOCAL
@SET PATHEXT=%PATHEXT:;.JS;=;%
node "%~dp0\node_modules\multicast-dns\cli.js" %*
)


@@ -1,7 +1,7 @@
/**
* Bundle of @devexpress/dx-core
-* Generated: 2017-11-10
-* Version: 1.0.0-beta.1
+* Generated: 2018-03-02
+* Version: 1.0.3
* License: https://js.devexpress.com/Licensing
*/
@@ -186,22 +186,22 @@ var PluginHost = function () {
this.plugins.filter(function (plugin) {
return plugin.container;
}).forEach(function (plugin) {
-if (knownOptionals.has(plugin.pluginName)) {
-throw getDependencyError(knownOptionals.get(plugin.pluginName), plugin.pluginName);
+if (knownOptionals.has(plugin.name)) {
+throw getDependencyError(knownOptionals.get(plugin.name), plugin.name);
}
plugin.dependencies.forEach(function (dependency) {
-if (defined.has(dependency.pluginName)) return;
+if (defined.has(dependency.name)) return;
if (dependency.optional) {
-if (!knownOptionals.has(dependency.pluginName)) {
-knownOptionals.set(dependency.pluginName, plugin.pluginName);
+if (!knownOptionals.has(dependency.name)) {
+knownOptionals.set(dependency.name, plugin.name);
}
return;
}
-throw getDependencyError(plugin.pluginName, dependency.pluginName);
+throw getDependencyError(plugin.name, dependency.name);
});
-defined.add(plugin.pluginName);
+defined.add(plugin.name);
});
}
}, {

File diff suppressed because one or more lines are too long


@@ -1,7 +1,7 @@
/**
* Bundle of @devexpress/dx-core
-* Generated: 2017-11-10
-* Version: 1.0.0-beta.1
+* Generated: 2018-03-02
+* Version: 1.0.3
* License: https://js.devexpress.com/Licensing
*/
@@ -192,22 +192,22 @@ var PluginHost = function () {
this.plugins.filter(function (plugin) {
return plugin.container;
}).forEach(function (plugin) {
-if (knownOptionals.has(plugin.pluginName)) {
-throw getDependencyError(knownOptionals.get(plugin.pluginName), plugin.pluginName);
+if (knownOptionals.has(plugin.name)) {
+throw getDependencyError(knownOptionals.get(plugin.name), plugin.name);
}
plugin.dependencies.forEach(function (dependency) {
-if (defined.has(dependency.pluginName)) return;
+if (defined.has(dependency.name)) return;
if (dependency.optional) {
-if (!knownOptionals.has(dependency.pluginName)) {
-knownOptionals.set(dependency.pluginName, plugin.pluginName);
+if (!knownOptionals.has(dependency.name)) {
+knownOptionals.set(dependency.name, plugin.name);
}
return;
}
-throw getDependencyError(plugin.pluginName, dependency.pluginName);
+throw getDependencyError(plugin.name, dependency.name);
});
-defined.add(plugin.pluginName);
+defined.add(plugin.name);
});
}
}, {

File diff suppressed because one or more lines are too long


@@ -1,8 +1,8 @@
{
"_from": "@devexpress/dx-core",
-"_id": "@devexpress/dx-core@1.0.0-beta.1",
+"_id": "@devexpress/dx-core@1.0.3",
"_inBundle": false,
-"_integrity": "sha512-4Kv5RTlmlK7o2DF5BB5r2yWgshvFrUSHWzJzdSyBtFxsQzvI3vJqS0Z0mAplZCyYfRk4xh9SRp6I9DML66v0EQ==",
+"_integrity": "sha512-M1Kjju074ddAQmaFuKypM/LdhCZsDISqhGj4LST2ZGQPlGpH89BMBEV8p+8MedFQQCG/svuS25AKip1Gs9KJgA==",
"_location": "/@devexpress/dx-core",
"_phantomChildren": {},
"_requested": {
@@ -19,10 +19,10 @@
"_requiredBy": [
"#USER"
],
-"_resolved": "https://registry.npmjs.org/@devexpress/dx-core/-/dx-core-1.0.0-beta.1.tgz",
-"_shasum": "63383ec2bd3903d9a163c1316706cde32227d6b4",
+"_resolved": "https://registry.npmjs.org/@devexpress/dx-core/-/dx-core-1.0.3.tgz",
+"_shasum": "c310b540229f83d6be5797fb2a5da5491757d21b",
"_spec": "@devexpress/dx-core",
-"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project",
+"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI",
"author": {
"name": "Developer Express Inc.",
"url": "https://www.devexpress.com/"
@@ -35,20 +35,20 @@
"description": "Core library for DevExtreme Reactive Components",
"devDependencies": {
"babel-core": "^6.26.0",
-"babel-jest": "^21.2.0",
+"babel-jest": "^22.1.0",
"babel-plugin-external-helpers": "^6.22.0",
"babel-plugin-transform-object-rest-spread": "^6.26.0",
"babel-plugin-transform-runtime": "^6.23.0",
"babel-preset-es2015": "^6.24.1",
-"core-js": "^2.5.1",
-"eslint": "^4.10.0",
+"core-js": "^2.5.3",
+"eslint": "^4.16.0",
"eslint-config-airbnb-base": "^12.1.0",
"eslint-plugin-filenames": "^1.2.0",
"eslint-plugin-import": "^2.8.0",
-"eslint-plugin-jest": "^21.3.0",
-"jest": "^21.2.1",
+"eslint-plugin-jest": "^21.7.0",
+"jest": "^22.1.4",
"rollup": "0.50.0",
-"rollup-plugin-babel": "^3.0.2",
+"rollup-plugin-babel": "^3.0.3",
"rollup-plugin-license": "^0.5.0"
},
"files": [
@@ -81,5 +81,5 @@
"test:coverage": "jest --coverage",
"test:watch": "jest --watch"
},
-"version": "1.0.0-beta.1"
+"version": "1.0.3"
}


@@ -1,7 +1,7 @@
/**
* Bundle of @devexpress/dx-grid-core
-* Generated: 2017-11-10
-* Version: 1.0.0-beta.1
+* Generated: 2018-03-02
+* Version: 1.0.3
* License: https://js.devexpress.com/Licensing
*/
@@ -19,10 +19,12 @@ var rowIdGetter = function rowIdGetter(getRowId, rows) {
return getRowId;
};
-var cellValueGetter = function cellValueGetter(getCellValue, columns) {
-if (getCellValue) {
-return getCellValue;
-}
+var defaultGetCellValue = function defaultGetCellValue(row, columnName) {
+return row[columnName];
+};
+var cellValueGetter = function cellValueGetter() {
+var getCellValue = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : defaultGetCellValue;
+var columns = arguments[1];
var useFastAccessor = true;
var map = columns.reduce(function (acc, column) {
@@ -33,28 +35,29 @@ var cellValueGetter = function cellValueGetter(getCellValue, columns) {
return acc;
}, {});
-return useFastAccessor ? function (row, columnName) {
-return row[columnName];
-} : function (row, columnName) {
-return map[columnName] ? map[columnName](row, columnName) : row[columnName];
+if (useFastAccessor) {
+return getCellValue;
+}
+return function (row, columnName) {
+return map[columnName] ? map[columnName](row, columnName) : getCellValue(row, columnName);
+};
};
-var setColumnSorting = function setColumnSorting(state, _ref) {
+var changeColumnSorting = function changeColumnSorting(state, _ref) {
var columnName = _ref.columnName,
direction = _ref.direction,
keepOther = _ref.keepOther,
cancel = _ref.cancel,
sortIndex = _ref.sortIndex;
var sorting = state.sorting;
var nextSorting = [];
if (keepOther === true) {
-nextSorting = Array.from(sorting).slice();
+nextSorting = sorting.slice();
}
if (Array.isArray(keepOther)) {
-nextSorting = Array.from(sorting).filter(function (columnSorting) {
+nextSorting = sorting.slice().filter(function (columnSorting) {
return keepOther.indexOf(columnSorting.columnName) > -1;
});
}
@@ -72,7 +75,7 @@ var setColumnSorting = function setColumnSorting(state, _ref) {
nextSorting.splice(columnSortingIndex, 1);
}
-if (!cancel) {
+if (direction !== null) {
var newIndexFallback = columnSortingIndex > -1 ? columnSortingIndex : nextSorting.length;
var newIndex = sortIndex !== undefined ? sortIndex : newIndexFallback;
nextSorting.splice(newIndex, 0, newColumnSorting);
@@ -446,7 +449,7 @@ var defaultCompare = function defaultCompare(a, b) {
};
var createCompare = function createCompare(sorting, getColumnCompare, getComparableValue) {
-return Array.from(sorting).reverse().reduce(function (prevCompare, columnSorting) {
+return sorting.slice().reverse().reduce(function (prevCompare, columnSorting) {
var columnName = columnSorting.columnName;
var inverse = columnSorting.direction === 'desc';
@@ -495,7 +498,7 @@ var sortedRows = function sortedRows(rows, sorting, getCellValue, getColumnCompa
if (!getRowLevelKey) {
var _compare = createCompare(sorting, getColumnCompare, getCellValue);
-return mergeSort(Array.from(rows), _compare);
+return mergeSort(rows.slice(), _compare);
}
var compare = createCompare(sorting, getColumnCompare, function (row, columnName) {
@@ -510,14 +513,14 @@ var sortedRows = function sortedRows(rows, sorting, getCellValue, getColumnCompa
return sortHierarchicalRows(rows, compare, getRowLevelKey);
};
-var setColumnFilter = function setColumnFilter(filters, _ref) {
+var changeColumnFilter = function changeColumnFilter(filters, _ref) {
var columnName = _ref.columnName,
config = _ref.config;
var filterIndex = filters.findIndex(function (f) {
return f.columnName === columnName;
});
-var nextState = Array.from(filters);
+var nextState = filters.slice();
if (config) {
var filter = _extends({ columnName: columnName }, config);
@@ -624,56 +627,67 @@ var filteredRows = function filteredRows(rows, filters, getCellValue, getColumnP
var GROUP_KEY_SEPARATOR = '|';
-var groupByColumn = function groupByColumn(state, _ref) {
+var applyColumnGrouping = function applyColumnGrouping(grouping, _ref) {
var columnName = _ref.columnName,
groupIndex = _ref.groupIndex;
-var grouping = Array.from(state.grouping);
-var groupingIndex = grouping.findIndex(function (g) {
+var nextGrouping = grouping.slice();
+var groupingIndex = nextGrouping.findIndex(function (g) {
return g.columnName === columnName;
});
var targetIndex = groupIndex;
if (groupingIndex > -1) {
-grouping.splice(groupingIndex, 1);
+nextGrouping.splice(groupingIndex, 1);
} else if (groupIndex === undefined) {
-targetIndex = grouping.length;
+targetIndex = nextGrouping.length;
}
if (targetIndex > -1) {
-grouping.splice(targetIndex, 0, {
+nextGrouping.splice(targetIndex, 0, {
columnName: columnName
});
}
-var ungroupedColumnIndex = state.grouping.findIndex(function (group, index) {
-return !grouping[index] || group.columnName !== grouping[index].columnName;
+return nextGrouping;
+};
+var changeColumnGrouping = function changeColumnGrouping(_ref2, _ref3) {
+var grouping = _ref2.grouping,
+expandedGroups = _ref2.expandedGroups;
+var columnName = _ref3.columnName,
+groupIndex = _ref3.groupIndex;
+var nextGrouping = applyColumnGrouping(grouping, { columnName: columnName, groupIndex: groupIndex });
+var ungroupedColumnIndex = grouping.findIndex(function (group, index) {
+return !nextGrouping[index] || group.columnName !== nextGrouping[index].columnName;
});
if (ungroupedColumnIndex === -1) {
return {
-grouping: grouping
+grouping: nextGrouping
};
}
-var filteredExpandedGroups = state.expandedGroups.filter(function (group) {
+var filteredExpandedGroups = expandedGroups.filter(function (group) {
return group.split(GROUP_KEY_SEPARATOR).length <= ungroupedColumnIndex;
});
-if (filteredExpandedGroups.length === state.expandedGroups.length) {
+if (filteredExpandedGroups.length === expandedGroups.length) {
return {
-grouping: grouping
+grouping: nextGrouping
};
}
return {
-grouping: grouping,
+grouping: nextGrouping,
expandedGroups: filteredExpandedGroups
};
};
-var toggleExpandedGroups = function toggleExpandedGroups(state, _ref2) {
-var groupKey = _ref2.groupKey;
+var toggleExpandedGroups = function toggleExpandedGroups(state, _ref4) {
+var groupKey = _ref4.groupKey;
-var expandedGroups = Array.from(state.expandedGroups);
+var expandedGroups = state.expandedGroups.slice();
var groupKeyIndex = expandedGroups.indexOf(groupKey);
if (groupKeyIndex > -1) {
@@ -687,40 +701,20 @@ var toggleExpandedGroups = function toggleExpandedGroups(state, _ref2) {
};
};
var draftGroupingChange = function draftGroupingChange(state, _ref3) {
var columnName = _ref3.columnName,
groupIndex = _ref3.groupIndex;
return { groupingChange: { columnName: columnName, groupIndex: groupIndex } };
var draftColumnGrouping = function draftColumnGrouping(_ref5, _ref6) {
var grouping = _ref5.grouping,
draftGrouping = _ref5.draftGrouping;
var columnName = _ref6.columnName,
groupIndex = _ref6.groupIndex;
return {
draftGrouping: applyColumnGrouping(draftGrouping || grouping, { columnName: columnName, groupIndex: groupIndex })
};
};
var cancelGroupingChange = function cancelGroupingChange() {
return { groupingChange: null };
var cancelColumnGroupingDraft = function cancelColumnGroupingDraft() {
return {
draftGrouping: null
};
var draftGrouping = function draftGrouping(grouping, groupingChange) {
if (!groupingChange) return grouping;
var columnName = groupingChange.columnName,
groupIndex = groupingChange.groupIndex;
var result = Array.from(grouping);
if (groupIndex !== -1) {
result = result.filter(function (g) {
return g.columnName !== columnName;
});
result.splice(groupIndex, 0, {
columnName: columnName,
draft: true,
mode: grouping.length > result.length ? 'reorder' : 'add'
});
} else {
result = result.map(function (g) {
return g.columnName === columnName ? { columnName: columnName, draft: true, mode: 'remove' } : g;
});
}
return result;
};
var GRID_GROUP_TYPE = 'group';
@@ -735,26 +729,26 @@ var groupRowLevelKeyGetter = function groupRowLevelKeyGetter(row) {
return row[GRID_GROUP_LEVEL_KEY];
};
-var defaultColumnIdentity = function defaultColumnIdentity(value) {
+var defaultColumnCriteria = function defaultColumnCriteria(value) {
return {
key: String(value),
value: value
};
};
-var groupedRows = function groupedRows(rows, grouping, getCellValue, getColumnIdentity) {
+var groupedRows = function groupedRows(rows, grouping, getCellValue, getColumnCriteria) {
var keyPrefix = arguments.length > 4 && arguments[4] !== undefined ? arguments[4] : '';
if (!grouping.length) return rows;
var columnName = grouping[0].columnName;
-var groupIdentity = getColumnIdentity && getColumnIdentity(columnName) || defaultColumnIdentity;
+var groupCriteria = getColumnCriteria && getColumnCriteria(columnName) || defaultColumnCriteria;
var groups = rows.reduce(function (acc, row) {
-var _groupIdentity = groupIdentity(getCellValue(row, columnName), row),
-key = _groupIdentity.key,
-_groupIdentity$value = _groupIdentity.value,
-value = _groupIdentity$value === undefined ? key : _groupIdentity$value;
+var _groupCriteria = groupCriteria(getCellValue(row, columnName), row),
+key = _groupCriteria.key,
+_groupCriteria$value = _groupCriteria.value,
+value = _groupCriteria$value === undefined ? key : _groupCriteria$value;
var sameKeyItems = acc.get(key);
@@ -778,7 +772,7 @@ var groupedRows = function groupedRows(rows, grouping, getCellValue, getColumnId
var compoundKey = '' + keyPrefix + key;
acc.push((_acc$push = {}, defineProperty(_acc$push, GRID_GROUP_CHECK, true), defineProperty(_acc$push, GRID_GROUP_LEVEL_KEY, GRID_GROUP_TYPE + '_' + groupedBy), defineProperty(_acc$push, 'groupedBy', groupedBy), defineProperty(_acc$push, 'compoundKey', compoundKey), defineProperty(_acc$push, 'key', key), defineProperty(_acc$push, 'value', value), _acc$push));
-acc.push.apply(acc, toConsumableArray(groupedRows(items, nestedGrouping, getCellValue, getColumnIdentity, '' + compoundKey + GROUP_KEY_SEPARATOR)));
+acc.push.apply(acc, toConsumableArray(groupedRows(items, nestedGrouping, getCellValue, getColumnCriteria, '' + compoundKey + GROUP_KEY_SEPARATOR)));
return acc;
}, []);
};
@@ -789,6 +783,7 @@ var expandedGroupRows = function expandedGroupRows(rows, grouping, expandedGroup
var groupingColumnNames = grouping.map(function (columnGrouping) {
return columnGrouping.columnName;
});
+var expandedGroupsSet = new Set(expandedGroups);
var currentGroupExpanded = true;
var currentGroupLevel = 0;
@@ -807,7 +802,7 @@ var expandedGroupRows = function expandedGroupRows(rows, grouping, expandedGroup
return acc;
}
-currentGroupExpanded = expandedGroups.has(row.compoundKey);
+currentGroupExpanded = expandedGroupsSet.has(row.compoundKey);
currentGroupLevel = groupLevel;
if (currentGroupExpanded) {
@@ -864,19 +859,34 @@ var customGroupingRowIdGetter = function customGroupingRowIdGetter(getRowId, row
};
};
-var groupingPanelItems = function groupingPanelItems(columns, grouping) {
-return grouping.map(function (_ref) {
-var columnName = _ref.columnName,
-draft = _ref.draft;
-var column = columns.find(function (c) {
-return c.name === columnName;
-});
+var groupingPanelItems = function groupingPanelItems(columns, grouping, draftGrouping) {
+var items = draftGrouping.map(function (_ref) {
+var columnName = _ref.columnName;
return {
-column: column,
-draft: draft
+column: columns.find(function (c) {
+return c.name === columnName;
+}),
+draft: !grouping.some(function (columnGrouping) {
+return columnGrouping.columnName === columnName;
+})
};
});
+grouping.forEach(function (_ref2, index) {
+var columnName = _ref2.columnName;
+if (draftGrouping.some(function (columnGrouping) {
+return columnGrouping.columnName === columnName;
+})) return;
+items.splice(index, 0, {
+column: columns.find(function (c) {
+return c.name === columnName;
+}),
+draft: true
+});
+});
+return items;
};
var setCurrentPage = function setCurrentPage(prevPage, page) {
@@ -955,47 +965,21 @@ var lastRowOnPage = function lastRowOnPage(currentPage, pageSize, totalRowCount)
return result;
};
-var setRowSelection = function setRowSelection(selection, _ref) {
-var rowId = _ref.rowId,
-selected = _ref.selected;
-var selectedRows = Array.from(selection);
-var selectedIndex = selectedRows.indexOf(rowId);
-var isRowSelected = selected;
-if (isRowSelected === undefined) {
-isRowSelected = selectedIndex === -1;
-}
-if (selectedIndex > -1 && !isRowSelected) {
-selectedRows.splice(selectedIndex, 1);
-} else if (selectedIndex === -1 && isRowSelected) {
-selectedRows.push(rowId);
-}
-return selectedRows;
-};
-var setRowsSelection = function setRowsSelection(selection, _ref2) {
-var rowIds = _ref2.rowIds,
-selected = _ref2.selected;
-if (rowIds.length === 1) {
-return setRowSelection(selection, { rowId: rowIds[0], selected: selected });
-}
+var toggleSelection = function toggleSelection(selection, _ref) {
+var rowIds = _ref.rowIds,
+state = _ref.state;
var rowIdsSet = new Set(rowIds);
-var isRowsSelected = selected;
-if (isRowsSelected === undefined) {
+var rowsState = state;
+if (rowsState === undefined) {
var availableSelection = selection.filter(function (rowId) {
return rowIdsSet.has(rowId);
});
-isRowsSelected = availableSelection.length !== rowIdsSet.size;
+rowsState = availableSelection.length !== rowIdsSet.size;
}
-if (isRowsSelected) {
+if (rowsState) {
var selectionSet = new Set(selection);
return [].concat(toConsumableArray(selection), toConsumableArray(rowIds.filter(function (rowId) {
return !selectionSet.has(rowId);
@@ -1007,41 +991,63 @@ var setRowsSelection = function setRowsSelection(selection, _ref2) {
});
};
-var getAvailableToSelect = function getAvailableToSelect(rows, getRowId, isGroupRow) {
+var rowsWithAvailableToSelect = function rowsWithAvailableToSelect(rows, getRowId, isGroupRow) {
var dataRows = rows;
if (isGroupRow) {
dataRows = dataRows.filter(function (row) {
return !isGroupRow(row);
});
}
return dataRows.map(function (row) {
return { rows: rows, availableToSelect: dataRows.map(function (row) {
return getRowId(row);
}) };
};
var someSelected = function someSelected(_ref, selection) {
var availableToSelect = _ref.availableToSelect;
var selectionSet = new Set(selection);
return availableToSelect.length !== 0 && selectionSet.size !== 0 && availableToSelect.some(function (elem) {
return selectionSet.has(elem);
}) && availableToSelect.some(function (elem) {
return !selectionSet.has(elem);
});
};
var getAvailableSelection = function getAvailableSelection(selection, availableToSelect) {
var availableToSelectSet = new Set(availableToSelect);
return selection.filter(function (selected) {
return availableToSelectSet.has(selected);
var allSelected = function allSelected(_ref2, selection) {
var availableToSelect = _ref2.availableToSelect;
var selectionSet = new Set(selection);
return selectionSet.size !== 0 && availableToSelect.length !== 0 && !availableToSelect.some(function (elem) {
return !selectionSet.has(elem);
});
};
-var startEditRows = function startEditRows(prevEditingRows, _ref) {
+var unwrapSelectedRows = function unwrapSelectedRows(_ref3) {
+var rows = _ref3.rows;
+return rows;
+};
+var startEditRows = function startEditRows(prevEditingRowIds, _ref) {
var rowIds = _ref.rowIds;
-return [].concat(toConsumableArray(prevEditingRows), toConsumableArray(rowIds));
+return [].concat(toConsumableArray(prevEditingRowIds), toConsumableArray(rowIds));
};
-var stopEditRows = function stopEditRows(prevEditingRows, _ref2) {
+var stopEditRows = function stopEditRows(prevEditingRowIds, _ref2) {
var rowIds = _ref2.rowIds;
var rowIdSet = new Set(rowIds);
-return prevEditingRows.filter(function (id) {
+return prevEditingRowIds.filter(function (id) {
return !rowIdSet.has(id);
});
};
-var addRow = function addRow(addedRows, _ref3) {
-var row = _ref3.row;
+var addRow = function addRow(addedRows) {
+var _ref3 = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : { row: {} },
+row = _ref3.row;
return [].concat(toConsumableArray(addedRows), [row]);
};
@@ -1049,7 +1055,7 @@ var changeAddedRow = function changeAddedRow(addedRows, _ref4) {
var rowId = _ref4.rowId,
change = _ref4.change;
-var result = Array.from(addedRows);
+var result = addedRows.slice();
result[rowId] = _extends({}, result[rowId], change);
return result;
};
@@ -1067,34 +1073,34 @@ var cancelAddedRows = function cancelAddedRows(addedRows, _ref5) {
return result;
};
-var changeRow = function changeRow(prevChangedRows, _ref6) {
+var changeRow = function changeRow(prevRowChanges, _ref6) {
var rowId = _ref6.rowId,
change = _ref6.change;
-var prevChange = prevChangedRows[rowId] || {};
-return _extends({}, prevChangedRows, defineProperty({}, rowId, _extends({}, prevChange, change)));
+var prevChange = prevRowChanges[rowId] || {};
+return _extends({}, prevRowChanges, defineProperty({}, rowId, _extends({}, prevChange, change)));
};
-var cancelChanges = function cancelChanges(prevChangedRows, _ref7) {
+var cancelChanges = function cancelChanges(prevRowChanges, _ref7) {
var rowIds = _ref7.rowIds;
-var result = _extends({}, prevChangedRows);
+var result = _extends({}, prevRowChanges);
rowIds.forEach(function (rowId) {
delete result[rowId];
});
return result;
};
-var deleteRows = function deleteRows(deletedRows, _ref8) {
+var deleteRows = function deleteRows(deletedRowIds, _ref8) {
var rowIds = _ref8.rowIds;
-return [].concat(toConsumableArray(deletedRows), toConsumableArray(rowIds));
+return [].concat(toConsumableArray(deletedRowIds), toConsumableArray(rowIds));
};
-var cancelDeletedRows = function cancelDeletedRows(deletedRows, _ref9) {
+var cancelDeletedRows = function cancelDeletedRows(deletedRowIds, _ref9) {
var rowIds = _ref9.rowIds;
var rowIdSet = new Set(rowIds);
-return deletedRows.filter(function (rowId) {
+return deletedRowIds.filter(function (rowId) {
return !rowIdSet.has(rowId);
});
};
@@ -1118,21 +1124,30 @@ var addedRowsByIds = function addedRowsByIds(addedRows, rowIds) {
return result;
};
-var computedCreateRowChange = function computedCreateRowChange(columns) {
-var map = columns.reduce(function (acc, column) {
-if (column.createRowChange) {
-acc[column.name] = column.createRowChange;
+var defaultCreateRowChange = function defaultCreateRowChange(row, value, columnName) {
+return defineProperty({}, columnName, value);
+};
+var createRowChangeGetter = function createRowChangeGetter() {
+var createRowChange = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : defaultCreateRowChange;
+var columnExtensions = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : [];
+var map = columnExtensions.reduce(function (acc, columnExtension) {
+if (columnExtension.createRowChange) {
+acc[columnExtension.columnName] = columnExtension.createRowChange;
}
return acc;
}, {});
-return function (row, columnName, value) {
-return map[columnName] ? map[columnName](row, value, columnName) : defineProperty({}, columnName, value);
+return function (row, value, columnName) {
+if (map[columnName]) {
+return map[columnName](row, value, columnName);
+}
+return createRowChange(row, value, columnName);
};
};
-var getRowChange = function getRowChange(changedRows, rowId) {
-return changedRows[rowId] || {};
+var getRowChange = function getRowChange(rowChanges, rowId) {
+return rowChanges[rowId] || {};
};
var TABLE_REORDERING_TYPE = 'reordering';
@@ -1143,7 +1158,7 @@ var changeColumnOrder = function changeColumnOrder(order, _ref) {
var sourceColumnIndex = order.indexOf(sourceColumnName);
var targetColumnIndex = order.indexOf(targetColumnName);
-var newOrder = Array.from(order);
+var newOrder = order.slice();
newOrder.splice(sourceColumnIndex, 1);
newOrder.splice(targetColumnIndex, 0, sourceColumnName);
@@ -1154,7 +1169,7 @@ var TABLE_DATA_TYPE = 'data';
var TABLE_NODATA_TYPE = 'nodata';
var orderedColumns = function orderedColumns(tableColumns, order) {
-var result = Array.from(tableColumns);
+var result = tableColumns.slice();
result.sort(function (a, b) {
if (a.type !== TABLE_DATA_TYPE || b.type !== TABLE_DATA_TYPE) return 0;
@@ -1194,7 +1209,11 @@ var tableColumnsWithWidths = function tableColumnsWithWidths(tableColumns, colum
return tableColumns.reduce(function (acc, tableColumn) {
if (tableColumn.type === 'data') {
var columnName = tableColumn.column.name;
-var width = draftColumnWidths[columnName] || columnWidths[columnName];
+var isCurrentColumn = function isCurrentColumn(elem) {
+return elem.columnName === columnName;
+};
+var column = draftColumnWidths.find(isCurrentColumn) || columnWidths.find(isCurrentColumn);
+var width = column && column.width;
if (width === undefined) {
throw new Error(UNSET_COLUMN_WIDTH_ERROR.replace('$1', columnName));
}
@@ -1208,36 +1227,43 @@ var tableColumnsWithWidths = function tableColumnsWithWidths(tableColumns, colum
var MIN_SIZE = 40;
-var changeTableColumnWidths = function changeTableColumnWidths(state, _ref) {
-var shifts = _ref.shifts;
+var changeTableColumnWidth = function changeTableColumnWidth(state, _ref) {
+var columnName = _ref.columnName,
+shift = _ref.shift;
var columnWidths = state.columnWidths;
-var updatedColumnWidths = Object.keys(shifts).reduce(function (acc, columnName) {
-var size = Math.max(MIN_SIZE, columnWidths[columnName] + shifts[columnName]);
-return Object.assign(acc, defineProperty({}, columnName, size));
-}, {});
-return _extends({}, state, {
-columnWidths: _extends({}, columnWidths, updatedColumnWidths),
-draftColumnWidths: {}
+var nextColumnWidth = columnWidths.slice();
+var index = nextColumnWidth.findIndex(function (elem) {
+return elem.columnName === columnName;
+});
+var updatedColumn = nextColumnWidth[index];
+var size = Math.max(MIN_SIZE, updatedColumn.width + shift);
+nextColumnWidth.splice(index, 1, { columnName: columnName, width: size });
+return {
+columnWidths: nextColumnWidth
};
};
-var changeDraftTableColumnWidths = function changeDraftTableColumnWidths(state, _ref2) {
-var shifts = _ref2.shifts;
-var columnWidths = state.columnWidths,
-draftColumnWidths = state.draftColumnWidths;
+var draftTableColumnWidth = function draftTableColumnWidth(state, _ref2) {
+var columnName = _ref2.columnName,
+shift = _ref2.shift;
+var columnWidths = state.columnWidths;
-var updatedDraftColumnWidths = Object.keys(shifts).reduce(function (acc, columnName) {
-if (shifts[columnName] === null) {
-delete acc[columnName];
-return acc;
-}
-var size = Math.max(MIN_SIZE, columnWidths[columnName] + shifts[columnName]);
-return Object.assign(acc, defineProperty({}, columnName, size));
-}, Object.assign({}, draftColumnWidths));
-return _extends({}, state, {
-draftColumnWidths: updatedDraftColumnWidths
+var updatedColumn = columnWidths.find(function (elem) {
+return elem.columnName === columnName;
+});
+var size = Math.max(MIN_SIZE, updatedColumn.width + shift);
+return {
+draftColumnWidths: [{ columnName: updatedColumn.columnName, width: size }]
};
};
+var cancelTableColumnWidthDraft = function cancelTableColumnWidthDraft() {
+return {
+draftColumnWidths: []
+};
+};
var TABLE_EDIT_COMMAND_TYPE = 'editCommand';
@@ -1268,8 +1294,8 @@ var isEditTableRow = function isEditTableRow(tableRow) {
return tableRow.type === TABLE_EDIT_TYPE;
};
-var tableRowsWithEditing = function tableRowsWithEditing(tableRows, editingRows, addedRows, rowHeight) {
-var rowIds = new Set(editingRows);
+var tableRowsWithEditing = function tableRowsWithEditing(tableRows, editingRowIds, addedRows, rowHeight) {
+var rowIds = new Set(editingRowIds);
var editedTableRows = tableRows.map(function (tableRow) {
return tableRow.type === TABLE_DATA_TYPE && rowIds.has(tableRow.rowId) ? _extends({}, tableRow, {
type: TABLE_EDIT_TYPE,
@@ -1315,37 +1341,44 @@ var isGroupTableRow = function isGroupTableRow(tableRow) {
return tableRow.type === TABLE_GROUP_TYPE;
};
-var tableColumnsWithDraftGrouping = function tableColumnsWithDraftGrouping(tableColumns, draftGrouping, showColumnWhenGrouped) {
+var tableColumnsWithDraftGrouping = function tableColumnsWithDraftGrouping(tableColumns, grouping, draftGrouping, showColumnWhenGrouped) {
return tableColumns.reduce(function (acc, tableColumn) {
-var isDataColumn = tableColumn.type === TABLE_DATA_TYPE;
-var tableColumnName = isDataColumn ? tableColumn.column.name : '';
-var columnDraftGrouping = draftGrouping.find(function (grouping) {
-return grouping.columnName === tableColumnName;
+if (tableColumn.type !== TABLE_DATA_TYPE) {
+acc.push(tableColumn);
+return acc;
+}
+var columnName = tableColumn.column.name;
+var columnGroupingExists = grouping.some(function (columnGrouping) {
+return columnGrouping.columnName === columnName;
});
+var columnDraftGroupingExists = draftGrouping.some(function (columnGrouping) {
+return columnGrouping.columnName === columnName;
+});
-if (!columnDraftGrouping || showColumnWhenGrouped(tableColumnName)) {
-return [].concat(toConsumableArray(acc), [tableColumn]);
-} else if (columnDraftGrouping.mode === 'remove' || columnDraftGrouping.mode === 'add') {
-return [].concat(toConsumableArray(acc), [_extends({}, tableColumn, {
+if (!columnGroupingExists && !columnDraftGroupingExists || showColumnWhenGrouped(columnName)) {
+acc.push(tableColumn);
+} else if (!columnGroupingExists && columnDraftGroupingExists || columnGroupingExists && !columnDraftGroupingExists) {
+acc.push(_extends({}, tableColumn, {
draft: true
-})]);
+}));
}
return acc;
}, []);
};
-var tableColumnsWithGrouping = function tableColumnsWithGrouping(tableColumns, grouping, draftGrouping, groupIndentColumnWidth, showColumnWhenGrouped) {
+var tableColumnsWithGrouping = function tableColumnsWithGrouping(columns, tableColumns, grouping, draftGrouping, indentColumnWidth, showColumnWhenGrouped) {
return [].concat(toConsumableArray(grouping.map(function (columnGrouping) {
-var groupedColumn = tableColumns.find(function (tableColumn) {
-return tableColumn.type === TABLE_DATA_TYPE && tableColumn.column.name === columnGrouping.columnName;
-}).column;
+var groupedColumn = columns.find(function (column) {
+return column.name === columnGrouping.columnName;
+});
return {
key: TABLE_GROUP_TYPE + '_' + groupedColumn.name,
type: TABLE_GROUP_TYPE,
column: groupedColumn,
-width: groupIndentColumnWidth
+width: indentColumnWidth
};
-})), toConsumableArray(tableColumnsWithDraftGrouping(tableColumns, draftGrouping, showColumnWhenGrouped)));
+})), toConsumableArray(tableColumnsWithDraftGrouping(tableColumns, grouping, draftGrouping, showColumnWhenGrouped)));
};
var tableRowsWithGrouping = function tableRowsWithGrouping(tableRows, isGroupRow) {
@@ -1375,8 +1408,8 @@ var tableRowsWithHeading = function tableRowsWithHeading(headerRows) {
var TABLE_DETAIL_TYPE = 'detail';
-var isDetailRowExpanded = function isDetailRowExpanded(expandedRows, rowId) {
-return expandedRows.indexOf(rowId) > -1;
+var isDetailRowExpanded = function isDetailRowExpanded(expandedDetailRowIds, rowId) {
+return expandedDetailRowIds.indexOf(rowId) > -1;
};
var isDetailToggleTableCell = function isDetailToggleTableCell(tableRow, tableColumn) {
return tableColumn.type === TABLE_DETAIL_TYPE && tableRow.type === TABLE_DATA_TYPE;
@@ -1385,26 +1418,26 @@ var isDetailTableRow = function isDetailTableRow(tableRow) {
return tableRow.type === TABLE_DETAIL_TYPE;
};
-var setDetailRowExpanded = function setDetailRowExpanded(prevExpanded, _ref) {
+var toggleDetailRowExpanded = function toggleDetailRowExpanded(prevExpanded, _ref) {
var rowId = _ref.rowId,
-isExpanded = _ref.isExpanded;
+state = _ref.state;
-var expandedRows = Array.from(prevExpanded);
-var expandedIndex = expandedRows.indexOf(rowId);
-var isRowExpanded = isExpanded !== undefined ? isExpanded : expandedIndex === -1;
+var expandedDetailRowIds = prevExpanded.slice();
+var expandedIndex = expandedDetailRowIds.indexOf(rowId);
+var rowState = state !== undefined ? state : expandedIndex === -1;
if (expandedIndex > -1 && !rowState) {
expandedDetailRowIds.splice(expandedIndex, 1);
} else if (expandedIndex === -1 && rowState) {
expandedDetailRowIds.push(rowId);
}
return expandedDetailRowIds;
};
var tableRowsWithExpandedDetail = function tableRowsWithExpandedDetail(tableRows, expandedDetailRowIds, rowHeight) {
var result = tableRows;
expandedDetailRowIds.forEach(function (expandedRowId) {
var rowIndex = result.findIndex(function (tableRow) {
return tableRow.type === TABLE_DATA_TYPE && tableRow.rowId === expandedRowId;
});
@@ -1426,8 +1459,8 @@ var tableRowsWithExpandedDetail = function tableRowsWithExpandedDetail(tableRows
return result;
};
var tableColumnsWithDetail = function tableColumnsWithDetail(tableColumns, toggleColumnWidth) {
return [{ key: TABLE_DETAIL_TYPE, type: TABLE_DETAIL_TYPE, width: toggleColumnWidth }].concat(toConsumableArray(tableColumns));
};
var TABLE_SELECT_TYPE = 'select';
@@ -1456,12 +1489,29 @@ var isDataTableRow = function isDataTableRow(tableRow) {
return tableRow.type === TABLE_DATA_TYPE;
};
var getColumnExtension = function getColumnExtension(columnExtensions, columnName) {
if (!columnExtensions) {
return {};
}
var columnExtension = columnExtensions.find(function (extension) {
return extension.columnName === columnName;
});
if (!columnExtension) {
return {};
}
return columnExtension;
};
var tableColumnsWithDataRows = function tableColumnsWithDataRows(columns, columnExtensions) {
return columns.map(function (column) {
var name = column.name;
var columnExtension = getColumnExtension(columnExtensions, name);
return {
key: TABLE_DATA_TYPE + '_' + name,
type: TABLE_DATA_TYPE,
width: columnExtension.width,
align: columnExtension.align,
column: column
};
});
@@ -1479,20 +1529,26 @@ var tableRowsWithDataRows = function tableRowsWithDataRows(rows, getRowId) {
});
};
var visibleTableColumns = function visibleTableColumns(tableColumns, hiddenColumnNames) {
return tableColumns.filter(function (tableColumn) {
return tableColumn.type !== TABLE_DATA_TYPE || hiddenColumnNames.indexOf(tableColumn.column.name) === -1;
});
};
var tableDataColumnsExist = function tableDataColumnsExist(tableColumns) {
return tableColumns.some(function (column) {
return column.type === TABLE_DATA_TYPE;
});
};
var columnChooserItems = function columnChooserItems(columns, hiddenColumnNames) {
return columns.map(function (column) {
return { column: column, hidden: hiddenColumnNames.indexOf(column.name) !== -1 };
});
};
var toggleColumn = function toggleColumn(hiddenColumnNames, columnName) {
return hiddenColumnNames.indexOf(columnName) === -1 ? [].concat(toConsumableArray(hiddenColumnNames), [columnName]) : hiddenColumnNames.filter(function (hiddenColumn) {
return hiddenColumn !== columnName;
});
};
@@ -1652,15 +1708,36 @@ var isOnTheSameLine = function isOnTheSameLine(geometry, y) {
return y >= geometry.top && y <= geometry.bottom;
};
var rectToObject = function rectToObject(_ref) {
var top = _ref.top,
right = _ref.right,
bottom = _ref.bottom,
left = _ref.left;
return {
top: top, right: right, bottom: bottom, left: left
};
};
var collapseGapsBetweenItems = function collapseGapsBetweenItems(geometries) {
return geometries.map(function (geometry, index) {
if (index !== geometries.length - 1 && geometry.top === geometries[index + 1].top) {
return _extends({}, geometry, {
right: geometries[index + 1].left
});
}
return geometry;
});
};
var getGroupCellTargetIndex = function getGroupCellTargetIndex(geometries, sourceIndex, _ref2) {
var x = _ref2.x,
y = _ref2.y;
if (geometries.length === 0) return 0;
var targetGeometries = sourceIndex !== -1 ? getTargetColumnGeometries(geometries, sourceIndex) : geometries.map(rectToObject);
var targetIndex = collapseGapsBetweenItems(targetGeometries).findIndex(function (geometry, index) {
var inVerticalBounds = isOnTheSameLine(geometry, y);
var inHorizontalBounds = x >= geometry.left && x <= geometry.right;
var shouldGoFirst = index === 0 && x < geometry.left;
@@ -1692,5 +1769,5 @@ var getMessagesFormatter = function getMessagesFormatter(messages) {
};
};
export { getColumnExtension, getTableRowColumnsWithColSpan, getTableColumnGeometries, getTableTargetColumnIndex, getAnimations, filterActiveAnimations, evalAnimations, getGroupCellTargetIndex, getMessagesFormatter, rowIdGetter, cellValueGetter, changeColumnSorting, getColumnSortingDirection, sortedRows, changeColumnFilter, getColumnFilterConfig, filteredRows, GROUP_KEY_SEPARATOR, changeColumnGrouping, toggleExpandedGroups, draftColumnGrouping, cancelColumnGroupingDraft, groupRowChecker, groupRowLevelKeyGetter, groupedRows, expandedGroupRows, customGroupedRows, customGroupingRowIdGetter, groupingPanelItems, setCurrentPage, setPageSize, paginatedRows, rowsWithPageHeaders, pageCount, rowCount, firstRowOnPage, lastRowOnPage, toggleSelection, rowsWithAvailableToSelect, someSelected, allSelected, unwrapSelectedRows, startEditRows, stopEditRows, addRow, changeAddedRow, cancelAddedRows, changeRow, cancelChanges, deleteRows, cancelDeletedRows, changedRowsByIds, addedRowsByIds, createRowChangeGetter, getRowChange, TABLE_REORDERING_TYPE, changeColumnOrder, orderedColumns, tableHeaderRowsWithReordering, draftOrder, tableColumnsWithWidths, changeTableColumnWidth, draftTableColumnWidth, cancelTableColumnWidthDraft, TABLE_EDIT_COMMAND_TYPE, isHeadingEditCommandsTableCell, isEditCommandsTableCell, tableColumnsWithEditing, TABLE_ADDED_TYPE, TABLE_EDIT_TYPE, isEditTableCell, isAddedTableRow, isEditTableRow, tableRowsWithEditing, TABLE_FILTER_TYPE, isFilterTableCell, isFilterTableRow, tableHeaderRowsWithFilter, TABLE_GROUP_TYPE, isGroupTableCell, isGroupIndentTableCell, isGroupTableRow, tableColumnsWithGrouping, tableRowsWithGrouping, TABLE_HEADING_TYPE, isHeadingTableCell, isHeadingTableRow, tableRowsWithHeading, TABLE_DETAIL_TYPE, isDetailRowExpanded, isDetailToggleTableCell, isDetailTableRow, toggleDetailRowExpanded, tableRowsWithExpandedDetail, tableColumnsWithDetail, TABLE_SELECT_TYPE, isSelectTableCell, isSelectAllTableCell, tableColumnsWithSelection, TABLE_DATA_TYPE, 
TABLE_NODATA_TYPE, isNoDataTableRow, isDataTableCell, isHeaderStubTableCell, isDataTableRow, tableColumnsWithDataRows, tableRowsWithDataRows, visibleTableColumns, tableDataColumnsExist, columnChooserItems, toggleColumn };
//# sourceMappingURL=dx-grid-core.es.js.map

File diff suppressed because one or more lines are too long


@@ -1,7 +1,7 @@
/**
* Bundle of @devexpress/dx-grid-core
* Generated: 2018-03-02
* Version: 1.0.3
* License: https://js.devexpress.com/Licensing
*/
@@ -23,10 +23,12 @@ var rowIdGetter = function rowIdGetter(getRowId, rows) {
return getRowId;
};
var defaultGetCellValue = function defaultGetCellValue(row, columnName) {
return row[columnName];
};
var cellValueGetter = function cellValueGetter() {
var getCellValue = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : defaultGetCellValue;
var columns = arguments[1];
var useFastAccessor = true;
var map = columns.reduce(function (acc, column) {
@@ -37,28 +39,29 @@ var cellValueGetter = function cellValueGetter(getCellValue, columns) {
return acc;
}, {});
if (useFastAccessor) {
return getCellValue;
}
return function (row, columnName) {
return map[columnName] ? map[columnName](row, columnName) : getCellValue(row, columnName);
};
};
var changeColumnSorting = function changeColumnSorting(state, _ref) {
var columnName = _ref.columnName,
direction = _ref.direction,
keepOther = _ref.keepOther,
cancel = _ref.cancel,
sortIndex = _ref.sortIndex;
var sorting = state.sorting;
var nextSorting = [];
if (keepOther === true) {
nextSorting = sorting.slice();
}
if (Array.isArray(keepOther)) {
nextSorting = sorting.slice().filter(function (columnSorting) {
return keepOther.indexOf(columnSorting.columnName) > -1;
});
}
@@ -76,7 +79,7 @@ var setColumnSorting = function setColumnSorting(state, _ref) {
nextSorting.splice(columnSortingIndex, 1);
}
if (direction !== null) {
var newIndexFallback = columnSortingIndex > -1 ? columnSortingIndex : nextSorting.length;
var newIndex = sortIndex !== undefined ? sortIndex : newIndexFallback;
nextSorting.splice(newIndex, 0, newColumnSorting);
@@ -450,7 +453,7 @@ var defaultCompare = function defaultCompare(a, b) {
};
var createCompare = function createCompare(sorting, getColumnCompare, getComparableValue) {
return sorting.slice().reverse().reduce(function (prevCompare, columnSorting) {
var columnName = columnSorting.columnName;
var inverse = columnSorting.direction === 'desc';
@@ -499,7 +502,7 @@ var sortedRows = function sortedRows(rows, sorting, getCellValue, getColumnCompa
if (!getRowLevelKey) {
var _compare = createCompare(sorting, getColumnCompare, getCellValue);
return mergeSort(rows.slice(), _compare);
}
var compare = createCompare(sorting, getColumnCompare, function (row, columnName) {
@@ -514,14 +517,14 @@ var sortedRows = function sortedRows(rows, sorting, getCellValue, getColumnCompa
return sortHierarchicalRows(rows, compare, getRowLevelKey);
};
var changeColumnFilter = function changeColumnFilter(filters, _ref) {
var columnName = _ref.columnName,
config = _ref.config;
var filterIndex = filters.findIndex(function (f) {
return f.columnName === columnName;
});
var nextState = filters.slice();
if (config) {
var filter = _extends({ columnName: columnName }, config);
@@ -628,56 +631,67 @@ var filteredRows = function filteredRows(rows, filters, getCellValue, getColumnP
var GROUP_KEY_SEPARATOR = '|';
var applyColumnGrouping = function applyColumnGrouping(grouping, _ref) {
var columnName = _ref.columnName,
groupIndex = _ref.groupIndex;
var nextGrouping = grouping.slice();
var groupingIndex = nextGrouping.findIndex(function (g) {
return g.columnName === columnName;
});
var targetIndex = groupIndex;
if (groupingIndex > -1) {
nextGrouping.splice(groupingIndex, 1);
} else if (groupIndex === undefined) {
targetIndex = nextGrouping.length;
}
if (targetIndex > -1) {
nextGrouping.splice(targetIndex, 0, {
columnName: columnName
});
}
return nextGrouping;
};
var changeColumnGrouping = function changeColumnGrouping(_ref2, _ref3) {
var grouping = _ref2.grouping,
expandedGroups = _ref2.expandedGroups;
var columnName = _ref3.columnName,
groupIndex = _ref3.groupIndex;
var nextGrouping = applyColumnGrouping(grouping, { columnName: columnName, groupIndex: groupIndex });
var ungroupedColumnIndex = grouping.findIndex(function (group, index) {
return !nextGrouping[index] || group.columnName !== nextGrouping[index].columnName;
});
if (ungroupedColumnIndex === -1) {
return {
grouping: nextGrouping
};
}
var filteredExpandedGroups = expandedGroups.filter(function (group) {
return group.split(GROUP_KEY_SEPARATOR).length <= ungroupedColumnIndex;
});
if (filteredExpandedGroups.length === expandedGroups.length) {
return {
grouping: nextGrouping
};
}
return {
grouping: nextGrouping,
expandedGroups: filteredExpandedGroups
};
};
var toggleExpandedGroups = function toggleExpandedGroups(state, _ref4) {
var groupKey = _ref4.groupKey;
var expandedGroups = state.expandedGroups.slice();
var groupKeyIndex = expandedGroups.indexOf(groupKey);
if (groupKeyIndex > -1) {
@@ -691,40 +705,20 @@ var toggleExpandedGroups = function toggleExpandedGroups(state, _ref2) {
};
};
var draftColumnGrouping = function draftColumnGrouping(_ref5, _ref6) {
var grouping = _ref5.grouping,
draftGrouping = _ref5.draftGrouping;
var columnName = _ref6.columnName,
groupIndex = _ref6.groupIndex;
return {
draftGrouping: applyColumnGrouping(draftGrouping || grouping, { columnName: columnName, groupIndex: groupIndex })
};
};
var cancelColumnGroupingDraft = function cancelColumnGroupingDraft() {
return {
draftGrouping: null
};
};
var GRID_GROUP_TYPE = 'group';
@@ -739,26 +733,26 @@ var groupRowLevelKeyGetter = function groupRowLevelKeyGetter(row) {
return row[GRID_GROUP_LEVEL_KEY];
};
var defaultColumnCriteria = function defaultColumnCriteria(value) {
return {
key: String(value),
value: value
};
};
var groupedRows = function groupedRows(rows, grouping, getCellValue, getColumnCriteria) {
var keyPrefix = arguments.length > 4 && arguments[4] !== undefined ? arguments[4] : '';
if (!grouping.length) return rows;
var columnName = grouping[0].columnName;
var groupCriteria = getColumnCriteria && getColumnCriteria(columnName) || defaultColumnCriteria;
var groups = rows.reduce(function (acc, row) {
var _groupCriteria = groupCriteria(getCellValue(row, columnName), row),
key = _groupCriteria.key,
_groupCriteria$value = _groupCriteria.value,
value = _groupCriteria$value === undefined ? key : _groupCriteria$value;
var sameKeyItems = acc.get(key);
@@ -782,7 +776,7 @@ var groupedRows = function groupedRows(rows, grouping, getCellValue, getColumnId
var compoundKey = '' + keyPrefix + key;
acc.push((_acc$push = {}, defineProperty(_acc$push, GRID_GROUP_CHECK, true), defineProperty(_acc$push, GRID_GROUP_LEVEL_KEY, GRID_GROUP_TYPE + '_' + groupedBy), defineProperty(_acc$push, 'groupedBy', groupedBy), defineProperty(_acc$push, 'compoundKey', compoundKey), defineProperty(_acc$push, 'key', key), defineProperty(_acc$push, 'value', value), _acc$push));
acc.push.apply(acc, toConsumableArray(groupedRows(items, nestedGrouping, getCellValue, getColumnCriteria, '' + compoundKey + GROUP_KEY_SEPARATOR)));
return acc;
}, []);
};
@@ -793,6 +787,7 @@ var expandedGroupRows = function expandedGroupRows(rows, grouping, expandedGroup
var groupingColumnNames = grouping.map(function (columnGrouping) {
return columnGrouping.columnName;
});
var expandedGroupsSet = new Set(expandedGroups);
var currentGroupExpanded = true;
var currentGroupLevel = 0;
@@ -811,7 +806,7 @@ var expandedGroupRows = function expandedGroupRows(rows, grouping, expandedGroup
return acc;
}
currentGroupExpanded = expandedGroupsSet.has(row.compoundKey);
currentGroupLevel = groupLevel;
if (currentGroupExpanded) {
@@ -868,19 +863,34 @@ var customGroupingRowIdGetter = function customGroupingRowIdGetter(getRowId, row
};
};
var groupingPanelItems = function groupingPanelItems(columns, grouping, draftGrouping) {
var items = draftGrouping.map(function (_ref) {
var columnName = _ref.columnName;
return {
column: columns.find(function (c) {
return c.name === columnName;
}),
draft: !grouping.some(function (columnGrouping) {
return columnGrouping.columnName === columnName;
})
};
});
grouping.forEach(function (_ref2, index) {
var columnName = _ref2.columnName;
if (draftGrouping.some(function (columnGrouping) {
return columnGrouping.columnName === columnName;
})) return;
items.splice(index, 0, {
column: columns.find(function (c) {
return c.name === columnName;
}),
draft: true
});
});
return items;
};
var setCurrentPage = function setCurrentPage(prevPage, page) {
@@ -959,47 +969,21 @@ var lastRowOnPage = function lastRowOnPage(currentPage, pageSize, totalRowCount)
return result;
};
var toggleSelection = function toggleSelection(selection, _ref) {
var rowIds = _ref.rowIds,
state = _ref.state;
var rowIdsSet = new Set(rowIds);
var rowsState = state;
if (rowsState === undefined) {
var availableSelection = selection.filter(function (rowId) {
return rowIdsSet.has(rowId);
});
rowsState = availableSelection.length !== rowIdsSet.size;
}
if (rowsState) {
var selectionSet = new Set(selection);
return [].concat(toConsumableArray(selection), toConsumableArray(rowIds.filter(function (rowId) {
return !selectionSet.has(rowId);
@@ -1011,41 +995,63 @@ var setRowsSelection = function setRowsSelection(selection, _ref2) {
});
};
var rowsWithAvailableToSelect = function rowsWithAvailableToSelect(rows, getRowId, isGroupRow) {
var dataRows = rows;
if (isGroupRow) {
dataRows = dataRows.filter(function (row) {
return !isGroupRow(row);
});
}
return { rows: rows, availableToSelect: dataRows.map(function (row) {
return getRowId(row);
}) };
};
var someSelected = function someSelected(_ref, selection) {
var availableToSelect = _ref.availableToSelect;
var selectionSet = new Set(selection);
return availableToSelect.length !== 0 && selectionSet.size !== 0 && availableToSelect.some(function (elem) {
return selectionSet.has(elem);
}) && availableToSelect.some(function (elem) {
return !selectionSet.has(elem);
});
};
var allSelected = function allSelected(_ref2, selection) {
var availableToSelect = _ref2.availableToSelect;
var selectionSet = new Set(selection);
return selectionSet.size !== 0 && availableToSelect.length !== 0 && !availableToSelect.some(function (elem) {
return !selectionSet.has(elem);
});
};
var unwrapSelectedRows = function unwrapSelectedRows(_ref3) {
var rows = _ref3.rows;
return rows;
};
var startEditRows = function startEditRows(prevEditingRowIds, _ref) {
var rowIds = _ref.rowIds;
return [].concat(toConsumableArray(prevEditingRowIds), toConsumableArray(rowIds));
};
var stopEditRows = function stopEditRows(prevEditingRowIds, _ref2) {
var rowIds = _ref2.rowIds;
var rowIdSet = new Set(rowIds);
return prevEditingRowIds.filter(function (id) {
return !rowIdSet.has(id);
});
};
var addRow = function addRow(addedRows) {
var _ref3 = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : { row: {} },
row = _ref3.row;
return [].concat(toConsumableArray(addedRows), [row]);
};
@@ -1053,7 +1059,7 @@ var changeAddedRow = function changeAddedRow(addedRows, _ref4) {
var rowId = _ref4.rowId,
change = _ref4.change;
var result = addedRows.slice();
result[rowId] = _extends({}, result[rowId], change);
return result;
};
@@ -1071,34 +1077,34 @@ var cancelAddedRows = function cancelAddedRows(addedRows, _ref5) {
return result;
};
var changeRow = function changeRow(prevRowChanges, _ref6) {
var rowId = _ref6.rowId,
change = _ref6.change;
var prevChange = prevRowChanges[rowId] || {};
return _extends({}, prevRowChanges, defineProperty({}, rowId, _extends({}, prevChange, change)));
};
var cancelChanges = function cancelChanges(prevRowChanges, _ref7) {
var rowIds = _ref7.rowIds;
var result = _extends({}, prevRowChanges);
rowIds.forEach(function (rowId) {
delete result[rowId];
});
return result;
};
var deleteRows = function deleteRows(deletedRowIds, _ref8) {
var rowIds = _ref8.rowIds;
return [].concat(toConsumableArray(deletedRowIds), toConsumableArray(rowIds));
};
var cancelDeletedRows = function cancelDeletedRows(deletedRowIds, _ref9) {
var rowIds = _ref9.rowIds;
var rowIdSet = new Set(rowIds);
return deletedRowIds.filter(function (rowId) {
return !rowIdSet.has(rowId);
});
};
@@ -1122,21 +1128,30 @@ var addedRowsByIds = function addedRowsByIds(addedRows, rowIds) {
return result;
};
var defaultCreateRowChange = function defaultCreateRowChange(row, value, columnName) {
return defineProperty({}, columnName, value);
};
var createRowChangeGetter = function createRowChangeGetter() {
var createRowChange = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : defaultCreateRowChange;
var columnExtensions = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : [];
var map = columnExtensions.reduce(function (acc, columnExtension) {
if (columnExtension.createRowChange) {
acc[columnExtension.columnName] = columnExtension.createRowChange;
}
return acc;
}, {});
return function (row, value, columnName) {
if (map[columnName]) {
return map[columnName](row, value, columnName);
}
return createRowChange(row, value, columnName);
};
};
var getRowChange = function getRowChange(rowChanges, rowId) {
return rowChanges[rowId] || {};
};
var TABLE_REORDERING_TYPE = 'reordering';
@@ -1147,7 +1162,7 @@ var changeColumnOrder = function changeColumnOrder(order, _ref) {
var sourceColumnIndex = order.indexOf(sourceColumnName);
var targetColumnIndex = order.indexOf(targetColumnName);
var newOrder = order.slice();
newOrder.splice(sourceColumnIndex, 1);
newOrder.splice(targetColumnIndex, 0, sourceColumnName);
@@ -1158,7 +1173,7 @@ var TABLE_DATA_TYPE = 'data';
var TABLE_NODATA_TYPE = 'nodata';
var orderedColumns = function orderedColumns(tableColumns, order) {
var result = tableColumns.slice();
result.sort(function (a, b) {
if (a.type !== TABLE_DATA_TYPE || b.type !== TABLE_DATA_TYPE) return 0;
@@ -1198,7 +1213,11 @@ var tableColumnsWithWidths = function tableColumnsWithWidths(tableColumns, colum
return tableColumns.reduce(function (acc, tableColumn) {
if (tableColumn.type === 'data') {
var columnName = tableColumn.column.name;
var isCurrentColumn = function isCurrentColumn(elem) {
return elem.columnName === columnName;
};
var column = draftColumnWidths.find(isCurrentColumn) || columnWidths.find(isCurrentColumn);
var width = column && column.width;
if (width === undefined) {
throw new Error(UNSET_COLUMN_WIDTH_ERROR.replace('$1', columnName));
}
@@ -1212,36 +1231,43 @@ var tableColumnsWithWidths = function tableColumnsWithWidths(tableColumns, colum
var MIN_SIZE = 40;
var changeTableColumnWidth = function changeTableColumnWidth(state, _ref) {
var columnName = _ref.columnName,
shift = _ref.shift;
var columnWidths = state.columnWidths;
var nextColumnWidth = columnWidths.slice();
var index = nextColumnWidth.findIndex(function (elem) {
return elem.columnName === columnName;
});
var updatedColumn = nextColumnWidth[index];
var size = Math.max(MIN_SIZE, updatedColumn.width + shift);
nextColumnWidth.splice(index, 1, { columnName: columnName, width: size });
return {
columnWidths: nextColumnWidth
};
};
var draftTableColumnWidth = function draftTableColumnWidth(state, _ref2) {
var columnName = _ref2.columnName,
shift = _ref2.shift;
var columnWidths = state.columnWidths;
var updatedColumn = columnWidths.find(function (elem) {
return elem.columnName === columnName;
});
var size = Math.max(MIN_SIZE, updatedColumn.width + shift);
return {
draftColumnWidths: [{ columnName: updatedColumn.columnName, width: size }]
};
};
var cancelTableColumnWidthDraft = function cancelTableColumnWidthDraft() {
return {
draftColumnWidths: []
};
};
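The rewritten resize reducers above operate on an array of `{ columnName, width }` objects instead of a keyed map, and clamp every resize to `MIN_SIZE`. A self-contained sketch of the new `changeTableColumnWidth` behavior (state shape as shown in the diff):

```javascript
var MIN_SIZE = 40;

// Sketch of the single-column resize reducer shown above.
// state.columnWidths is an array of { columnName, width } objects.
function changeTableColumnWidth(state, payload) {
  var columnName = payload.columnName;
  var shift = payload.shift;
  var nextColumnWidths = state.columnWidths.slice();
  var index = nextColumnWidths.findIndex(function (elem) {
    return elem.columnName === columnName;
  });
  var updatedColumn = nextColumnWidths[index];
  // Never shrink a column below MIN_SIZE pixels.
  var size = Math.max(MIN_SIZE, updatedColumn.width + shift);
  nextColumnWidths.splice(index, 1, { columnName: columnName, width: size });
  return { columnWidths: nextColumnWidths };
}

var state = { columnWidths: [{ columnName: 'name', width: 100 }] };
var next = changeTableColumnWidth(state, { columnName: 'name', shift: -80 });
// next.columnWidths[0].width === 40 (clamped), original state untouched
```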
var TABLE_EDIT_COMMAND_TYPE = 'editCommand';
@@ -1272,8 +1298,8 @@ var isEditTableRow = function isEditTableRow(tableRow) {
return tableRow.type === TABLE_EDIT_TYPE;
};
var tableRowsWithEditing = function tableRowsWithEditing(tableRows, editingRows, addedRows, rowHeight) {
var rowIds = new Set(editingRows);
var tableRowsWithEditing = function tableRowsWithEditing(tableRows, editingRowIds, addedRows, rowHeight) {
var rowIds = new Set(editingRowIds);
var editedTableRows = tableRows.map(function (tableRow) {
return tableRow.type === TABLE_DATA_TYPE && rowIds.has(tableRow.rowId) ? _extends({}, tableRow, {
type: TABLE_EDIT_TYPE,
@@ -1319,37 +1345,44 @@ var isGroupTableRow = function isGroupTableRow(tableRow) {
return tableRow.type === TABLE_GROUP_TYPE;
};
var tableColumnsWithDraftGrouping = function tableColumnsWithDraftGrouping(tableColumns, draftGrouping, showColumnWhenGrouped) {
var tableColumnsWithDraftGrouping = function tableColumnsWithDraftGrouping(tableColumns, grouping, draftGrouping, showColumnWhenGrouped) {
return tableColumns.reduce(function (acc, tableColumn) {
var isDataColumn = tableColumn.type === TABLE_DATA_TYPE;
var tableColumnName = isDataColumn ? tableColumn.column.name : '';
var columnDraftGrouping = draftGrouping.find(function (grouping) {
return grouping.columnName === tableColumnName;
if (tableColumn.type !== TABLE_DATA_TYPE) {
acc.push(tableColumn);
return acc;
}
var columnName = tableColumn.column.name;
var columnGroupingExists = grouping.some(function (columnGrouping) {
return columnGrouping.columnName === columnName;
});
var columnDraftGroupingExists = draftGrouping.some(function (columnGrouping) {
return columnGrouping.columnName === columnName;
});
if (!columnDraftGrouping || showColumnWhenGrouped(tableColumnName)) {
return [].concat(toConsumableArray(acc), [tableColumn]);
} else if (columnDraftGrouping.mode === 'remove' || columnDraftGrouping.mode === 'add') {
return [].concat(toConsumableArray(acc), [_extends({}, tableColumn, {
if (!columnGroupingExists && !columnDraftGroupingExists || showColumnWhenGrouped(columnName)) {
acc.push(tableColumn);
} else if (!columnGroupingExists && columnDraftGroupingExists || columnGroupingExists && !columnDraftGroupingExists) {
acc.push(_extends({}, tableColumn, {
draft: true
})]);
}));
}
return acc;
}, []);
};
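The branching above reduces to three outcomes per data column: kept visible, kept with a `draft` flag while a grouping change is pending, or dropped entirely once grouped in both the committed and draft state. A condensed sketch of that decision (the function name and string results are illustrative, not from the library):

```javascript
// Sketch of the draft-grouping visibility rules shown above for a
// single data column.
function draftGroupingState(isGrouped, isDraftGrouped, showWhenGrouped) {
  if ((!isGrouped && !isDraftGrouped) || showWhenGrouped) return 'visible';
  if (isGrouped !== isDraftGrouped) return 'draft';   // being added or removed
  return 'hidden';                                    // grouped in both states
}

draftGroupingState(false, false, false); // 'visible' — not involved in grouping
draftGroupingState(true, false, false);  // 'draft'   — grouping change pending
draftGroupingState(true, true, false);   // 'hidden'  — grouped, column removed
```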
var tableColumnsWithGrouping = function tableColumnsWithGrouping(tableColumns, grouping, draftGrouping, groupIndentColumnWidth, showColumnWhenGrouped) {
var tableColumnsWithGrouping = function tableColumnsWithGrouping(columns, tableColumns, grouping, draftGrouping, indentColumnWidth, showColumnWhenGrouped) {
return [].concat(toConsumableArray(grouping.map(function (columnGrouping) {
var groupedColumn = tableColumns.find(function (tableColumn) {
return tableColumn.type === TABLE_DATA_TYPE && tableColumn.column.name === columnGrouping.columnName;
}).column;
var groupedColumn = columns.find(function (column) {
return column.name === columnGrouping.columnName;
});
return {
key: TABLE_GROUP_TYPE + '_' + groupedColumn.name,
type: TABLE_GROUP_TYPE,
column: groupedColumn,
width: groupIndentColumnWidth
width: indentColumnWidth
};
})), toConsumableArray(tableColumnsWithDraftGrouping(tableColumns, draftGrouping, showColumnWhenGrouped)));
})), toConsumableArray(tableColumnsWithDraftGrouping(tableColumns, grouping, draftGrouping, showColumnWhenGrouped)));
};
var tableRowsWithGrouping = function tableRowsWithGrouping(tableRows, isGroupRow) {
@@ -1379,8 +1412,8 @@ var tableRowsWithHeading = function tableRowsWithHeading(headerRows) {
var TABLE_DETAIL_TYPE = 'detail';
var isDetailRowExpanded = function isDetailRowExpanded(expandedRows, rowId) {
return expandedRows.indexOf(rowId) > -1;
var isDetailRowExpanded = function isDetailRowExpanded(expandedDetailRowIds, rowId) {
return expandedDetailRowIds.indexOf(rowId) > -1;
};
var isDetailToggleTableCell = function isDetailToggleTableCell(tableRow, tableColumn) {
return tableColumn.type === TABLE_DETAIL_TYPE && tableRow.type === TABLE_DATA_TYPE;
@@ -1389,26 +1422,26 @@ var isDetailTableRow = function isDetailTableRow(tableRow) {
return tableRow.type === TABLE_DETAIL_TYPE;
};
var setDetailRowExpanded = function setDetailRowExpanded(prevExpanded, _ref) {
var toggleDetailRowExpanded = function toggleDetailRowExpanded(prevExpanded, _ref) {
var rowId = _ref.rowId,
isExpanded = _ref.isExpanded;
state = _ref.state;
var expandedRows = Array.from(prevExpanded);
var expandedIndex = expandedRows.indexOf(rowId);
var isRowExpanded = isExpanded !== undefined ? isExpanded : expandedIndex === -1;
var expandedDetailRowIds = prevExpanded.slice();
var expandedIndex = expandedDetailRowIds.indexOf(rowId);
var rowState = state !== undefined ? state : expandedIndex === -1;
if (expandedIndex > -1 && !isRowExpanded) {
expandedRows.splice(expandedIndex, 1);
} else if (expandedIndex === -1 && isRowExpanded) {
expandedRows.push(rowId);
if (expandedIndex > -1 && !rowState) {
expandedDetailRowIds.splice(expandedIndex, 1);
} else if (expandedIndex === -1 && rowState) {
expandedDetailRowIds.push(rowId);
}
return expandedRows;
return expandedDetailRowIds;
};
var tableRowsWithExpandedDetail = function tableRowsWithExpandedDetail(tableRows, expandedRows, rowHeight) {
var tableRowsWithExpandedDetail = function tableRowsWithExpandedDetail(tableRows, expandedDetailRowIds, rowHeight) {
var result = tableRows;
expandedRows.forEach(function (expandedRowId) {
expandedDetailRowIds.forEach(function (expandedRowId) {
var rowIndex = result.findIndex(function (tableRow) {
return tableRow.type === TABLE_DATA_TYPE && tableRow.rowId === expandedRowId;
});
@@ -1430,8 +1463,8 @@ var tableRowsWithExpandedDetail = function tableRowsWithExpandedDetail(tableRows
return result;
};
var tableColumnsWithDetail = function tableColumnsWithDetail(tableColumns, detailToggleCellWidth) {
return [{ key: TABLE_DETAIL_TYPE, type: TABLE_DETAIL_TYPE, width: detailToggleCellWidth }].concat(toConsumableArray(tableColumns));
var tableColumnsWithDetail = function tableColumnsWithDetail(tableColumns, toggleColumnWidth) {
return [{ key: TABLE_DETAIL_TYPE, type: TABLE_DETAIL_TYPE, width: toggleColumnWidth }].concat(toConsumableArray(tableColumns));
};
var TABLE_SELECT_TYPE = 'select';
@@ -1460,12 +1493,29 @@ var isDataTableRow = function isDataTableRow(tableRow) {
return tableRow.type === TABLE_DATA_TYPE;
};
var tableColumnsWithDataRows = function tableColumnsWithDataRows(columns) {
var getColumnExtension = function getColumnExtension(columnExtensions, columnName) {
if (!columnExtensions) {
return {};
}
var columnExtension = columnExtensions.find(function (extension) {
return extension.columnName === columnName;
});
if (!columnExtension) {
return {};
}
return columnExtension;
};
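The new `getColumnExtension` helper above is what lets `tableColumnsWithDataRows` read `width` and `align` from a separate `columnExtensions` array: both a missing extension list and a missing entry fall back to an empty object. A self-contained sketch:

```javascript
// Sketch of the per-column extension lookup shown above.
function getColumnExtension(columnExtensions, columnName) {
  if (!columnExtensions) return {};
  var columnExtension = columnExtensions.find(function (extension) {
    return extension.columnName === columnName;
  });
  return columnExtension || {};
}

// Column names here are illustrative.
var extensions = [{ columnName: 'amount', width: 120, align: 'right' }];
getColumnExtension(extensions, 'amount').width; // 120
getColumnExtension(extensions, 'name').width;   // undefined — no entry
getColumnExtension(undefined, 'name');          // {} — no extension list
```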
var tableColumnsWithDataRows = function tableColumnsWithDataRows(columns, columnExtensions) {
return columns.map(function (column) {
var name = column.name;
var columnExtension = getColumnExtension(columnExtensions, name);
return {
key: TABLE_DATA_TYPE + '_' + column.name,
key: TABLE_DATA_TYPE + '_' + name,
type: TABLE_DATA_TYPE,
width: column.width,
width: columnExtension.width,
align: columnExtension.align,
column: column
};
});
@@ -1483,20 +1533,26 @@ var tableRowsWithDataRows = function tableRowsWithDataRows(rows, getRowId) {
});
};
var visibleTableColumns = function visibleTableColumns(tableColumns, hiddenColumns) {
var visibleTableColumns = function visibleTableColumns(tableColumns, hiddenColumnNames) {
return tableColumns.filter(function (tableColumn) {
return hiddenColumns.indexOf(tableColumn.column.name) === -1;
return tableColumn.type !== TABLE_DATA_TYPE || hiddenColumnNames.indexOf(tableColumn.column.name) === -1;
});
};
var columnChooserItems = function columnChooserItems(columns, hiddenColumns) {
var tableDataColumnsExist = function tableDataColumnsExist(tableColumns) {
return tableColumns.some(function (column) {
return column.type === TABLE_DATA_TYPE;
});
};
var columnChooserItems = function columnChooserItems(columns, hiddenColumnNames) {
return columns.map(function (column) {
return { column: column, hidden: hiddenColumns.indexOf(column.name) !== -1 };
return { column: column, hidden: hiddenColumnNames.indexOf(column.name) !== -1 };
});
};
var toggleColumn = function toggleColumn(hiddenColumns, columnName) {
return hiddenColumns.indexOf(columnName) === -1 ? [].concat(toConsumableArray(hiddenColumns), [columnName]) : hiddenColumns.filter(function (hiddenColumn) {
var toggleColumn = function toggleColumn(hiddenColumnNames, columnName) {
return hiddenColumnNames.indexOf(columnName) === -1 ? [].concat(toConsumableArray(hiddenColumnNames), [columnName]) : hiddenColumnNames.filter(function (hiddenColumn) {
return hiddenColumn !== columnName;
});
};
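`toggleColumn` above flips a column name in and out of the hidden list. A self-contained sketch without the `toConsumableArray` helper:

```javascript
// Sketch of the hidden-column toggle shown above: absent names are
// appended, present names are filtered out.
function toggleColumn(hiddenColumnNames, columnName) {
  return hiddenColumnNames.indexOf(columnName) === -1
    ? hiddenColumnNames.concat([columnName])
    : hiddenColumnNames.filter(function (hiddenColumn) {
        return hiddenColumn !== columnName;
      });
}

var hidden = toggleColumn([], 'discount'); // ['discount']
hidden = toggleColumn(hidden, 'discount'); // []
```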
@@ -1656,15 +1712,36 @@ var isOnTheSameLine = function isOnTheSameLine(geometry, y) {
return y >= geometry.top && y <= geometry.bottom;
};
var getGroupCellTargetIndex = function getGroupCellTargetIndex(geometries, sourceIndex, _ref) {
var x = _ref.x,
y = _ref.y;
var rectToObject = function rectToObject(_ref) {
var top = _ref.top,
right = _ref.right,
bottom = _ref.bottom,
left = _ref.left;
return {
top: top, right: right, bottom: bottom, left: left
};
};
var collapseGapsBetweenItems = function collapseGapsBetweenItems(geometries) {
return geometries.map(function (geometry, index) {
if (index !== geometries.length - 1 && geometry.top === geometries[index + 1].top) {
return _extends({}, geometry, {
right: geometries[index + 1].left
});
}
return geometry;
});
};
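The new `collapseGapsBetweenItems` helper above extends each rect's right edge to its same-row neighbour's left edge, so the hit-test in `getGroupCellTargetIndex` has no dead zones between group items. A self-contained sketch with illustrative geometry:

```javascript
// Sketch of the gap-collapsing helper shown above: for items on the
// same row (equal `top`), stretch each rect to meet its neighbour.
function collapseGapsBetweenItems(geometries) {
  return geometries.map(function (geometry, index) {
    if (index !== geometries.length - 1
        && geometry.top === geometries[index + 1].top) {
      return Object.assign({}, geometry, { right: geometries[index + 1].left });
    }
    return geometry;
  });
}

var rects = [
  { top: 0, bottom: 20, left: 0, right: 50 },
  { top: 0, bottom: 20, left: 60, right: 110 },
];
var collapsed = collapseGapsBetweenItems(rects);
// collapsed[0].right === 60 — the 50..60 gap is absorbed;
// the last rect is returned unchanged
```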
var getGroupCellTargetIndex = function getGroupCellTargetIndex(geometries, sourceIndex, _ref2) {
var x = _ref2.x,
y = _ref2.y;
if (geometries.length === 0) return 0;
var targetGeometries = sourceIndex !== -1 ? getTargetColumnGeometries(geometries, sourceIndex) : geometries;
var targetGeometries = sourceIndex !== -1 ? getTargetColumnGeometries(geometries, sourceIndex) : geometries.map(rectToObject);
var targetIndex = targetGeometries.findIndex(function (geometry, index) {
var targetIndex = collapseGapsBetweenItems(targetGeometries).findIndex(function (geometry, index) {
var inVerticalBounds = isOnTheSameLine(geometry, y);
var inHorizontalBounds = x >= geometry.left && x <= geometry.right;
var shouldGoFirst = index === 0 && x < geometry.left;
@@ -1696,6 +1773,7 @@ var getMessagesFormatter = function getMessagesFormatter(messages) {
};
};
exports.getColumnExtension = getColumnExtension;
exports.getTableRowColumnsWithColSpan = getTableRowColumnsWithColSpan;
exports.getTableColumnGeometries = getTableColumnGeometries;
exports.getTableTargetColumnIndex = getTableTargetColumnIndex;
@@ -1706,17 +1784,17 @@ exports.getGroupCellTargetIndex = getGroupCellTargetIndex;
exports.getMessagesFormatter = getMessagesFormatter;
exports.rowIdGetter = rowIdGetter;
exports.cellValueGetter = cellValueGetter;
exports.setColumnSorting = setColumnSorting;
exports.changeColumnSorting = changeColumnSorting;
exports.getColumnSortingDirection = getColumnSortingDirection;
exports.sortedRows = sortedRows;
exports.setColumnFilter = setColumnFilter;
exports.changeColumnFilter = changeColumnFilter;
exports.getColumnFilterConfig = getColumnFilterConfig;
exports.filteredRows = filteredRows;
exports.groupByColumn = groupByColumn;
exports.GROUP_KEY_SEPARATOR = GROUP_KEY_SEPARATOR;
exports.changeColumnGrouping = changeColumnGrouping;
exports.toggleExpandedGroups = toggleExpandedGroups;
exports.draftGroupingChange = draftGroupingChange;
exports.cancelGroupingChange = cancelGroupingChange;
exports.draftGrouping = draftGrouping;
exports.draftColumnGrouping = draftColumnGrouping;
exports.cancelColumnGroupingDraft = cancelColumnGroupingDraft;
exports.groupRowChecker = groupRowChecker;
exports.groupRowLevelKeyGetter = groupRowLevelKeyGetter;
exports.groupedRows = groupedRows;
@@ -1732,9 +1810,11 @@ exports.pageCount = pageCount;
exports.rowCount = rowCount;
exports.firstRowOnPage = firstRowOnPage;
exports.lastRowOnPage = lastRowOnPage;
exports.setRowsSelection = setRowsSelection;
exports.getAvailableToSelect = getAvailableToSelect;
exports.getAvailableSelection = getAvailableSelection;
exports.toggleSelection = toggleSelection;
exports.rowsWithAvailableToSelect = rowsWithAvailableToSelect;
exports.someSelected = someSelected;
exports.allSelected = allSelected;
exports.unwrapSelectedRows = unwrapSelectedRows;
exports.startEditRows = startEditRows;
exports.stopEditRows = stopEditRows;
exports.addRow = addRow;
@@ -1746,7 +1826,7 @@ exports.deleteRows = deleteRows;
exports.cancelDeletedRows = cancelDeletedRows;
exports.changedRowsByIds = changedRowsByIds;
exports.addedRowsByIds = addedRowsByIds;
exports.computedCreateRowChange = computedCreateRowChange;
exports.createRowChangeGetter = createRowChangeGetter;
exports.getRowChange = getRowChange;
exports.TABLE_REORDERING_TYPE = TABLE_REORDERING_TYPE;
exports.changeColumnOrder = changeColumnOrder;
@@ -1754,8 +1834,9 @@ exports.orderedColumns = orderedColumns;
exports.tableHeaderRowsWithReordering = tableHeaderRowsWithReordering;
exports.draftOrder = draftOrder;
exports.tableColumnsWithWidths = tableColumnsWithWidths;
exports.changeTableColumnWidths = changeTableColumnWidths;
exports.changeDraftTableColumnWidths = changeDraftTableColumnWidths;
exports.changeTableColumnWidth = changeTableColumnWidth;
exports.draftTableColumnWidth = draftTableColumnWidth;
exports.cancelTableColumnWidthDraft = cancelTableColumnWidthDraft;
exports.TABLE_EDIT_COMMAND_TYPE = TABLE_EDIT_COMMAND_TYPE;
exports.isHeadingEditCommandsTableCell = isHeadingEditCommandsTableCell;
exports.isEditCommandsTableCell = isEditCommandsTableCell;
@@ -1784,7 +1865,7 @@ exports.TABLE_DETAIL_TYPE = TABLE_DETAIL_TYPE;
exports.isDetailRowExpanded = isDetailRowExpanded;
exports.isDetailToggleTableCell = isDetailToggleTableCell;
exports.isDetailTableRow = isDetailTableRow;
exports.setDetailRowExpanded = setDetailRowExpanded;
exports.toggleDetailRowExpanded = toggleDetailRowExpanded;
exports.tableRowsWithExpandedDetail = tableRowsWithExpandedDetail;
exports.tableColumnsWithDetail = tableColumnsWithDetail;
exports.TABLE_SELECT_TYPE = TABLE_SELECT_TYPE;
@@ -1800,6 +1881,7 @@ exports.isDataTableRow = isDataTableRow;
exports.tableColumnsWithDataRows = tableColumnsWithDataRows;
exports.tableRowsWithDataRows = tableRowsWithDataRows;
exports.visibleTableColumns = visibleTableColumns;
exports.tableDataColumnsExist = tableDataColumnsExist;
exports.columnChooserItems = columnChooserItems;
exports.toggleColumn = toggleColumn;

File diff suppressed because one or more lines are too long



@@ -1,8 +1,8 @@
{
"_from": "@devexpress/dx-grid-core",
"_id": "@devexpress/dx-grid-core@1.0.0-beta.1",
"_id": "@devexpress/dx-grid-core@1.0.3",
"_inBundle": false,
"_integrity": "sha512-3hKM7JUKKHJGJ8C/B20SDfCbkxr7R6ADhKb/IfkWrepJQ78uPDce9wWxwkjl8EqSd8r1jKMWbg5dgXMU6zQwWw==",
"_integrity": "sha512-k+mzGd1Gjqbq92BwZdr+UMQcTFfezk2usEaSRqBO30b6+THYYAIx5kFzCbkcv1H37CtFNju29t52ZTNDJZixVQ==",
"_location": "/@devexpress/dx-grid-core",
"_phantomChildren": {},
"_requested": {
@@ -17,12 +17,13 @@
"fetchSpec": "latest"
},
"_requiredBy": [
"#USER"
"#USER",
"/@devexpress/dx-react-grid"
],
"_resolved": "https://registry.npmjs.org/@devexpress/dx-grid-core/-/dx-grid-core-1.0.0-beta.1.tgz",
"_shasum": "48f76255c7192e7727f2c9b97efb2bf70774471d",
"_resolved": "https://registry.npmjs.org/@devexpress/dx-grid-core/-/dx-grid-core-1.0.3.tgz",
"_shasum": "e6b2708593c10c6dfab2cbc4c2c3f82b5ab910c2",
"_spec": "@devexpress/dx-grid-core",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI",
"author": {
"name": "Developer Express Inc.",
"url": "https://www.devexpress.com/"
@@ -34,24 +35,23 @@
"deprecated": false,
"description": "Core library for the DevExtreme Reactive Grid component",
"devDependencies": {
"@devexpress/dx-core": "1.0.0-beta.1",
"@devexpress/dx-core": "1.0.3",
"babel-core": "^6.26.0",
"babel-jest": "^21.2.0",
"babel-jest": "^22.1.0",
"babel-plugin-external-helpers": "^6.22.0",
"babel-plugin-transform-object-rest-spread": "^6.26.0",
"babel-plugin-transform-runtime": "^6.23.0",
"babel-preset-es2015": "^6.24.1",
"core-js": "^2.5.1",
"eslint": "^4.10.0",
"core-js": "^2.5.3",
"eslint": "^4.16.0",
"eslint-config-airbnb-base": "^12.1.0",
"eslint-plugin-filenames": "^1.2.0",
"eslint-plugin-import": "^2.8.0",
"eslint-plugin-jest": "^21.3.0",
"jest": "^21.2.1",
"eslint-plugin-jest": "^21.7.0",
"jest": "^22.1.4",
"rollup": "0.50.0",
"rollup-plugin-babel": "^3.0.2",
"rollup-plugin-license": "^0.5.0",
"seamless-immutable": "^7.1.2"
"rollup-plugin-babel": "^3.0.3",
"rollup-plugin-license": "^0.5.0"
},
"files": [
"dist"
@@ -68,7 +68,7 @@
"module": "dist/dx-grid-core.es.js",
"name": "@devexpress/dx-grid-core",
"peerDependencies": {
"@devexpress/dx-core": "1.0.0-beta.1"
"@devexpress/dx-core": "1.0.3"
},
"publishConfig": {
"access": "public"
@@ -86,5 +86,5 @@
"test:coverage": "jest --coverage",
"test:watch": "jest --watch"
},
"version": "1.0.0-beta.1"
"version": "1.0.3"
}

File diff suppressed because it is too large Load Diff

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large Load Diff

File diff suppressed because one or more lines are too long


@@ -1,7 +1,7 @@
/**
* Bundle of @devexpress/dx-core
* Generated: 2017-11-10
* Version: 1.0.0-beta.1
* Generated: 2018-03-02
* Version: 1.0.3
* License: https://js.devexpress.com/Licensing
*/
@@ -186,22 +186,22 @@ var PluginHost = function () {
this.plugins.filter(function (plugin) {
return plugin.container;
}).forEach(function (plugin) {
if (knownOptionals.has(plugin.pluginName)) {
throw getDependencyError(knownOptionals.get(plugin.pluginName), plugin.pluginName);
if (knownOptionals.has(plugin.name)) {
throw getDependencyError(knownOptionals.get(plugin.name), plugin.name);
}
plugin.dependencies.forEach(function (dependency) {
if (defined.has(dependency.pluginName)) return;
if (defined.has(dependency.name)) return;
if (dependency.optional) {
if (!knownOptionals.has(dependency.pluginName)) {
knownOptionals.set(dependency.pluginName, plugin.pluginName);
if (!knownOptionals.has(dependency.name)) {
knownOptionals.set(dependency.name, plugin.name);
}
return;
}
throw getDependencyError(plugin.pluginName, dependency.pluginName);
throw getDependencyError(plugin.name, dependency.name);
});
defined.add(plugin.pluginName);
defined.add(plugin.name);
});
}
}, {
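The `PluginHost` excerpt above (with dependencies now compared by `name` rather than `pluginName`) enforces ordering: a container plugin's required dependencies must appear earlier in the plugin list, and an optional dependency may be absent but must not show up *after* a plugin that declared it. A simplified self-contained sketch (the `container` filter is omitted, and plain `Error`s replace `getDependencyError`; plugin names are illustrative):

```javascript
// Sketch of the dependency check shown above.
function checkDependencies(plugins) {
  var defined = new Set();
  var knownOptionals = new Map();
  plugins.forEach(function (plugin) {
    // An optional dependency registered earlier must not appear later.
    if (knownOptionals.has(plugin.name)) {
      throw new Error(knownOptionals.get(plugin.name)
        + ' must appear after ' + plugin.name);
    }
    (plugin.dependencies || []).forEach(function (dependency) {
      if (defined.has(dependency.name)) return;
      if (dependency.optional) {
        if (!knownOptionals.has(dependency.name)) {
          knownOptionals.set(dependency.name, plugin.name);
        }
        return;
      }
      throw new Error(plugin.name + ' requires ' + dependency.name);
    });
    defined.add(plugin.name);
  });
}

// Passes: the required dependency is defined first.
checkDependencies([
  { name: 'SortingState', dependencies: [] },
  { name: 'IntegratedSorting', dependencies: [{ name: 'SortingState' }] },
]);
```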

File diff suppressed because one or more lines are too long


@@ -1,7 +1,7 @@
/**
* Bundle of @devexpress/dx-core
* Generated: 2017-11-10
* Version: 1.0.0-beta.1
* Generated: 2018-03-02
* Version: 1.0.3
* License: https://js.devexpress.com/Licensing
*/
@@ -192,22 +192,22 @@ var PluginHost = function () {
this.plugins.filter(function (plugin) {
return plugin.container;
}).forEach(function (plugin) {
if (knownOptionals.has(plugin.pluginName)) {
throw getDependencyError(knownOptionals.get(plugin.pluginName), plugin.pluginName);
if (knownOptionals.has(plugin.name)) {
throw getDependencyError(knownOptionals.get(plugin.name), plugin.name);
}
plugin.dependencies.forEach(function (dependency) {
if (defined.has(dependency.pluginName)) return;
if (defined.has(dependency.name)) return;
if (dependency.optional) {
if (!knownOptionals.has(dependency.pluginName)) {
knownOptionals.set(dependency.pluginName, plugin.pluginName);
if (!knownOptionals.has(dependency.name)) {
knownOptionals.set(dependency.name, plugin.name);
}
return;
}
throw getDependencyError(plugin.pluginName, dependency.pluginName);
throw getDependencyError(plugin.name, dependency.name);
});
defined.add(plugin.pluginName);
defined.add(plugin.name);
});
}
}, {

File diff suppressed because one or more lines are too long


@@ -1,28 +1,28 @@
{
"_from": "@devexpress/dx-core@1.0.0-beta.1",
"_id": "@devexpress/dx-core@1.0.0-beta.1",
"_from": "@devexpress/dx-core@1.0.3",
"_id": "@devexpress/dx-core@1.0.3",
"_inBundle": false,
"_integrity": "sha512-4Kv5RTlmlK7o2DF5BB5r2yWgshvFrUSHWzJzdSyBtFxsQzvI3vJqS0Z0mAplZCyYfRk4xh9SRp6I9DML66v0EQ==",
"_integrity": "sha512-M1Kjju074ddAQmaFuKypM/LdhCZsDISqhGj4LST2ZGQPlGpH89BMBEV8p+8MedFQQCG/svuS25AKip1Gs9KJgA==",
"_location": "/@devexpress/dx-react-core/@devexpress/dx-core",
"_phantomChildren": {},
"_requested": {
"type": "version",
"registry": true,
"raw": "@devexpress/dx-core@1.0.0-beta.1",
"raw": "@devexpress/dx-core@1.0.3",
"name": "@devexpress/dx-core",
"escapedName": "@devexpress%2fdx-core",
"scope": "@devexpress",
"rawSpec": "1.0.0-beta.1",
"rawSpec": "1.0.3",
"saveSpec": null,
"fetchSpec": "1.0.0-beta.1"
"fetchSpec": "1.0.3"
},
"_requiredBy": [
"/@devexpress/dx-react-core"
],
"_resolved": "https://registry.npmjs.org/@devexpress/dx-core/-/dx-core-1.0.0-beta.1.tgz",
"_shasum": "63383ec2bd3903d9a163c1316706cde32227d6b4",
"_spec": "@devexpress/dx-core@1.0.0-beta.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core",
"_resolved": "https://registry.npmjs.org/@devexpress/dx-core/-/dx-core-1.0.3.tgz",
"_shasum": "c310b540229f83d6be5797fb2a5da5491757d21b",
"_spec": "@devexpress/dx-core@1.0.3",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core",
"author": {
"name": "Developer Express Inc.",
"url": "https://www.devexpress.com/"
@@ -35,20 +35,20 @@
"description": "Core library for DevExtreme Reactive Components",
"devDependencies": {
"babel-core": "^6.26.0",
"babel-jest": "^21.2.0",
"babel-jest": "^22.1.0",
"babel-plugin-external-helpers": "^6.22.0",
"babel-plugin-transform-object-rest-spread": "^6.26.0",
"babel-plugin-transform-runtime": "^6.23.0",
"babel-preset-es2015": "^6.24.1",
"core-js": "^2.5.1",
"eslint": "^4.10.0",
"core-js": "^2.5.3",
"eslint": "^4.16.0",
"eslint-config-airbnb-base": "^12.1.0",
"eslint-plugin-filenames": "^1.2.0",
"eslint-plugin-import": "^2.8.0",
"eslint-plugin-jest": "^21.3.0",
"jest": "^21.2.1",
"eslint-plugin-jest": "^21.7.0",
"jest": "^22.1.4",
"rollup": "0.50.0",
"rollup-plugin-babel": "^3.0.2",
"rollup-plugin-babel": "^3.0.3",
"rollup-plugin-license": "^0.5.0"
},
"files": [
@@ -81,5 +81,5 @@
"test:coverage": "jest --coverage",
"test:watch": "jest --watch"
},
"version": "1.0.0-beta.1"
"version": "1.0.3"
}


@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/asap/-/asap-2.0.6.tgz",
"_shasum": "e50347611d7e690943208bbdafebcbc2fb866d46",
"_spec": "asap@~2.0.3",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\promise",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\promise",
"browser": {
"./asap": "./browser-asap.js",
"./asap.js": "./browser-asap.js",


@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/core-js/-/core-js-1.2.7.tgz",
"_shasum": "652294c14651db28fa93bd2d5ff2983a4f08c636",
"_spec": "core-js@^1.0.0",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\fbjs",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\fbjs",
"bugs": {
"url": "https://github.com/zloirock/core-js/issues"
},


@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/encoding/-/encoding-0.1.12.tgz",
"_shasum": "538b66f3ee62cd1ab51ec323829d1f9480c74beb",
"_spec": "encoding@^0.1.11",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\node-fetch",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\node-fetch",
"author": {
"name": "Andris Reinman"
},


@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/fbjs/-/fbjs-0.8.16.tgz",
"_shasum": "5e67432f550dc41b572bf55847b8aca64e5337db",
"_spec": "fbjs@^0.8.16",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\prop-types",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\prop-types",
"browserify": {
"transform": [
"loose-envify"


@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.4.19.tgz",
"_shasum": "f7468f60135f5e5dad3399c0a81be9a1603a082b",
"_spec": "iconv-lite@~0.4.13",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\encoding",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\encoding",
"author": {
"name": "Alexander Shtuchkin",
"email": "ashtuchkin@gmail.com"


@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/is-stream/-/is-stream-1.1.0.tgz",
"_shasum": "12d4a3dd4e68e0b79ceb8dbc84173ae80d91ca44",
"_spec": "is-stream@^1.0.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\node-fetch",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\node-fetch",
"author": {
"name": "Sindre Sorhus",
"email": "sindresorhus@gmail.com",


@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/isomorphic-fetch/-/isomorphic-fetch-2.2.1.tgz",
"_shasum": "611ae1acf14f5e81f729507472819fe9733558a9",
"_spec": "isomorphic-fetch@^2.1.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\fbjs",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\fbjs",
"author": {
"name": "Matt Andrews",
"email": "matt@mattandre.ws"


@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-3.0.2.tgz",
"_shasum": "9866df395102130e38f7f996bceb65443209c25b",
"_spec": "js-tokens@^3.0.0",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\loose-envify",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\loose-envify",
"author": {
"name": "Simon Lydell"
},


@@ -22,7 +22,7 @@
"_resolved": "https://registry.npmjs.org/loose-envify/-/loose-envify-1.3.1.tgz",
"_shasum": "d1a8ad33fa9ce0e713d65fdd0ac8b748d478c848",
"_spec": "loose-envify@^1.3.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\prop-types",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\prop-types",
"author": {
"name": "Andres Suarez",
"email": "zertosh@gmail.com"


@@ -21,7 +21,7 @@
"_resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-1.7.3.tgz",
"_shasum": "980f6f72d85211a5347c6b2bc18c5b84c3eb47ef",
"_spec": "node-fetch@^1.0.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\isomorphic-fetch",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\isomorphic-fetch",
"author": {
"name": "David Frank"
},


@@ -22,7 +22,7 @@
"_resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz",
"_shasum": "2109adc7965887cfc05cbbd442cac8bfbb360863",
"_spec": "object-assign@^4.1.1",
"_where": "C:\\Users\\deranjer\\GoglandProjects\\torrent-project\\torrent-project\\node_modules\\@devexpress\\dx-react-core\\node_modules\\prop-types",
"_where": "C:\\Users\\deranjer\\go\\src\\github.com\\deranjer\\goTorrent\\goTorrentWebUI\\node_modules\\@devexpress\\dx-react-core\\node_modules\\prop-types",
"author": {
"name": "Sindre Sorhus",
"email": "sindresorhus@gmail.com",

Some files were not shown because too many files have changed in this diff.