Compare commits
103 Commits
v2.0.6-bet
...
v2.0.16-be
Author | SHA1 | Date | |
---|---|---|---|
![]() |
d15223fb1a | ||
![]() |
d29a12b6db | ||
![]() |
9100e25a21 | ||
![]() |
7672f1955e | ||
![]() |
5f52171fc4 | ||
![]() |
31ac82ad71 | ||
![]() |
38ca4e37a6 | ||
![]() |
3c55550702 | ||
![]() |
7dff6b121b | ||
![]() |
d77d889695 | ||
![]() |
318a21438f | ||
![]() |
7175b57a28 | ||
![]() |
e1e5a050c2 | ||
![]() |
58996c1115 | ||
![]() |
7301fe5f6e | ||
![]() |
a27c423569 | ||
![]() |
19680d3bc7 | ||
![]() |
ecaca4e5dc | ||
![]() |
191de0b577 | ||
![]() |
ebcc073b32 | ||
![]() |
043b3fd57b | ||
![]() |
dd50502dcb | ||
![]() |
f159a1014d | ||
![]() |
abb801535c | ||
![]() |
2732dbf1b1 | ||
![]() |
095d893005 | ||
![]() |
5d8455d141 | ||
![]() |
aa3450bfcc | ||
![]() |
45c2ccdffe | ||
![]() |
fc14c3165f | ||
![]() |
0fad245148 | ||
![]() |
79609c384e | ||
![]() |
09054ddb4b | ||
![]() |
6f912d4aa2 | ||
![]() |
96033a8214 | ||
![]() |
5ca65f4797 | ||
![]() |
d2fccbde68 | ||
![]() |
e6b48d7baf | ||
![]() |
3e51310511 | ||
![]() |
32b43202c2 | ||
![]() |
446170f8de | ||
![]() |
c5a9ecd4ac | ||
![]() |
2af5f817a3 | ||
![]() |
4e55cf3cd4 | ||
![]() |
eeb0478813 | ||
![]() |
33739f1cb2 | ||
![]() |
515e6a8071 | ||
![]() |
2b22f8eb4f | ||
![]() |
e9725a0081 | ||
![]() |
8fd159d2fe | ||
![]() |
3d7e6c8b2c | ||
![]() |
0c048d61b1 | ||
![]() |
f05b8e5cd1 | ||
![]() |
0b38fec827 | ||
![]() |
547dc9ed33 | ||
![]() |
896a37bea9 | ||
![]() |
3f90037db3 | ||
![]() |
380ca11ced | ||
![]() |
ab3a288e49 | ||
![]() |
638e225f80 | ||
![]() |
5089ede207 | ||
![]() |
a3e6e76158 | ||
![]() |
7c4c7bfc90 | ||
![]() |
644fea6665 | ||
![]() |
a1349ff8a6 | ||
![]() |
71c20002b8 | ||
![]() |
157af84226 | ||
![]() |
9b4536f132 | ||
![]() |
29ab470e42 | ||
![]() |
c67fa480a7 | ||
![]() |
0a1a691c73 | ||
![]() |
48588f23bf | ||
![]() |
cf14fbc3f0 | ||
![]() |
e471d5207d | ||
![]() |
5722a52082 | ||
![]() |
08c32e875e | ||
![]() |
7d3ee3afb3 | ||
![]() |
def8600f5c | ||
![]() |
74a68f3c7d | ||
![]() |
64c9247dd1 | ||
![]() |
1bfcd34247 | ||
![]() |
19864e97e6 | ||
![]() |
ec5c5e1420 | ||
![]() |
803f4e14ca | ||
![]() |
6cc254b80a | ||
![]() |
59593ab1aa | ||
![]() |
65a0a0eb7d | ||
![]() |
f4206b401f | ||
![]() |
99f8d24b3e | ||
![]() |
26b06e453d | ||
![]() |
54ab646048 | ||
![]() |
12c9aa3d6a | ||
![]() |
1ae8544f2d | ||
![]() |
eae9e66c75 | ||
![]() |
ad041a1691 | ||
![]() |
1aee3b6c8f | ||
![]() |
04d4ffb63d | ||
![]() |
80b318b45c | ||
![]() |
19969a8b1f | ||
![]() |
b84888356f | ||
![]() |
c9436195f3 | ||
![]() |
98cfb50571 | ||
![]() |
b67884ea7f |
1
.gitignore
vendored
1
.gitignore
vendored
@@ -12,6 +12,7 @@
|
||||
*.db*
|
||||
*.db-journal
|
||||
*.ini
|
||||
release.lock
|
||||
version.lock
|
||||
logs/*
|
||||
cache/*
|
||||
|
110
CHANGELOG.md
110
CHANGELOG.md
@@ -1,5 +1,113 @@
|
||||
# Changelog
|
||||
|
||||
## v2.0.16-beta (2018-01-30)
|
||||
|
||||
* Monitoring:
|
||||
* Fix: Timestamp sometimes showing as "0:60" on the activity cards.
|
||||
* Fix: Incorrect session information being shown for playback of synced content.
|
||||
* Fix: Sessions not being stopped when "Playback Stopped" notifications were enabled.
|
||||
* UI:
|
||||
* Fix: Stream resolution showing up as "unknown" on the graphs.
|
||||
* New: Added user filter to the Synced Items table.
|
||||
* Other:
|
||||
* New: Option to use the Plex server update channel when checking for updates.
|
||||
|
||||
|
||||
## v2.0.15-beta (2018-01-27)
|
||||
|
||||
* Monitoring:
|
||||
* Fix: Live TV sessions not being stopped in History.
|
||||
* Fix: Stream location showing as "unknown" on the activity cards.
|
||||
* New: Improved Live TV details on the activity cards.
|
||||
* Notifications:
|
||||
* New: Added labels and collections to notification parameters.
|
||||
* New: Added more server details to notification parameters.
|
||||
* Change: Renamed "PlexPy" update notification parameters to "Tautulli".
|
||||
|
||||
|
||||
## v2.0.14-beta (2018-01-20)
|
||||
|
||||
* Monitoring:
|
||||
* Change: Added "Cellular" bandwidth to "WAN" in activity header.
|
||||
* Notifications:
|
||||
* Fix: Plex Web URL for tracks now go to the album page.
|
||||
* Fix: Recently added notifications being sent for the entire library when DVR EPG data was refreshed.
|
||||
* Fix: Notifier settings not loading with an apostrophe in the custom condition values.
|
||||
* Fix: Custom email addresses not being saved when closing the notifier settings.
|
||||
* Change: Re-enabled Browser notifications.
|
||||
* Change: Renamed "PlexPy" update notification parameters to "Tautulli".
|
||||
* Change: Emails no longer automatically insert HTML line breaks.
|
||||
* Change: "Date" header added to email notifications.
|
||||
* UI:
|
||||
* Change: Show all changelogs since the previous version when updating.
|
||||
|
||||
|
||||
## v2.0.13-beta (2018-01-13)
|
||||
|
||||
* Notifications:
|
||||
* New: Added dropdown selection for email addresses of shared users.
|
||||
* New: Added more notification options for Join.
|
||||
* Change: Show "OR" between custom condition values.
|
||||
* Other:
|
||||
* New: Use JSON Web Tokens for authentication. Login now works with SSO applications.
|
||||
* New: Allow the Plex server admin to login as a Tautulli admin using their Plex.tv account.
|
||||
|
||||
|
||||
## v2.0.12-beta (2018-01-07)
|
||||
|
||||
* Notifications:
|
||||
* Fix: Incorrect Plex URL parameter value.
|
||||
* Change: Custom condition logic is now optional. An implicit "and" is applied between all conditions if the logic is blank.
|
||||
* UI:
|
||||
* New: Added separate required LAN/WAN bandwidth in the activity header.
|
||||
* API:
|
||||
* Fix: Notify API command not sending notifications.
|
||||
|
||||
|
||||
## v2.0.11-beta (2018-01-05)
|
||||
|
||||
* Notifications:
|
||||
* Fix: Some notification parameters showing up blank.
|
||||
* UI:
|
||||
* Fix: Stream data showing up as "None" for pre-v2 history.
|
||||
* Other:
|
||||
* Fix: Ability to login using the hashed password.
|
||||
|
||||
|
||||
## v2.0.10-beta (2018-01-04)
|
||||
|
||||
* Monitoring:
|
||||
* Fix: HW transcoding indicator on activity cards incorrect after refreshing.
|
||||
* Notifications:
|
||||
* Remove: Notification toggles from library and user settings. Use custom conditions to filter out notifications instead.
|
||||
* UI:
|
||||
* Fix: Incorrect examples for some date format options. Also added a few missing date format options. (Thanks @Tommatheussen)
|
||||
|
||||
|
||||
## v2.0.9-beta (2018-01-03)
|
||||
|
||||
* Notifications:
|
||||
* Fix: Notifications failing due to incorrect season/episode number types.
|
||||
|
||||
|
||||
## v2.0.8-beta (2018-01-03)
|
||||
|
||||
* Monitoring:
|
||||
* Fix: Incorrect HW transcoding indicator on activity cards.
|
||||
* Fix: Long product/player names hidden behind platform icon on activity cards.
|
||||
* Notifications:
|
||||
* Fix: Notifications failing due to some missing notification parameters.
|
||||
|
||||
|
||||
## v2.0.7-beta (2018-01-01)
|
||||
|
||||
* Monitoring:
|
||||
* Fix: Incorrect LAN/WAN location on activity cards.
|
||||
* Fix: Paused time not recording correctly.
|
||||
* Other:
|
||||
* Fix: Failed to retrieve synced items when there are special characters in the title.
|
||||
|
||||
|
||||
## v2.0.6-beta (2017-12-31)
|
||||
|
||||
* Monitoring:
|
||||
@@ -17,8 +125,8 @@
|
||||
* Fix: Error sending Join notifications.
|
||||
* UI:
|
||||
* New: Added total required bandwidth in the activity header.
|
||||
* Fix: Failing to retrieve releases from GitHub.
|
||||
* Other:
|
||||
* Fix: Failing to retrieve releases from GitHub.
|
||||
* Fix: CherryPy SSL connection warning. (Thanks @felixbuenemann)
|
||||
* Fix: Sanitize script output in logs.
|
||||
* Change: Login sessions persists across server restarts.
|
||||
|
@@ -1,6 +1,6 @@
|
||||
# Tautulli
|
||||
|
||||
[](https://discord.gg/36ggawe)
|
||||
[](https://discord.gg/tQcWEUp)
|
||||
[](https://www.reddit.com/r/Tautulli/)
|
||||
[](https://forums.plex.tv/discussion/169591/plexpy-another-plex-monitoring-program)
|
||||
|
||||
@@ -49,7 +49,7 @@ This project is based on code from [Headphones](https://github.com/rembo10/headp
|
||||
- Checking the [Wiki](https://github.com/JonnyWong16/plexpy/wiki) for
|
||||
[ [Installation] ](https://github.com/JonnyWong16/plexpy/wiki/Installation) and
|
||||
[ [FAQs] ](https://github.com/JonnyWong16/plexpy/wiki/Frequently-Asked-Questions-(FAQ)).
|
||||
- For basic questions try asking on [Discord](https://discord.gg/36ggawe), [Reddit](https://www.reddit.com/r/Tautulli), or the [Plex Forums](https://forums.plex.tv/discussion/169591/plexpy-another-plex-monitoring-program) first before opening an issue.
|
||||
- For basic questions try asking on [Discord](https://discord.gg/tQcWEUp), [Reddit](https://www.reddit.com/r/Tautulli), or the [Plex Forums](https://forums.plex.tv/discussion/169591/plexpy-another-plex-monitoring-program) first before opening an issue.
|
||||
|
||||
##### If nothing has worked:
|
||||
|
||||
|
@@ -2,6 +2,7 @@
|
||||
import plexpy
|
||||
from plexpy import version
|
||||
from plexpy.helpers import anon_url
|
||||
from plexpy.notifiers import BROWSER_NOTIFIERS
|
||||
%>
|
||||
<!doctype html>
|
||||
|
||||
@@ -64,7 +65,7 @@
|
||||
<span class="icon-bar"></span>
|
||||
<span class="icon-bar"></span>
|
||||
</button>
|
||||
<a class="navbar-brand svg" href="home">
|
||||
<a class="navbar-brand" href="home" title="Tautulli">
|
||||
<object data="${http_root}images/logo-tautulli.svg" type="image/svg+xml" style="height: 45px;"></object>
|
||||
</a>
|
||||
</div>
|
||||
@@ -138,7 +139,7 @@
|
||||
<li><a href="#" data-target="#admin-login-modal" data-toggle="modal"><i class="fa fa-fw fa-lock"></i> Admin Login</a></li>
|
||||
<li role="separator" class="divider"></li>
|
||||
% endif
|
||||
% if _session['expiry']:
|
||||
% if _session['exp']:
|
||||
<li><a href="${http_root}auth/logout"><i class="fa fa-fw fa-sign-out"></i> Sign Out</a></li>
|
||||
% endif
|
||||
</ul>
|
||||
@@ -161,7 +162,7 @@ ${next.modalIncludes()}
|
||||
<div id="admin-login-modal" class="modal fade" tabindex="-1" role="dialog" aria-labelledby="admin-login-modal">
|
||||
<div class="modal-dialog" role="document">
|
||||
<div class="modal-content">
|
||||
<form action="${http_root}auth/login" method="post">
|
||||
<form id="login-form">
|
||||
<div class="modal-header">
|
||||
<button type="button" class="close" data-dismiss="modal" aria-hidden="true"><i class="fa fa-remove"></i></button>
|
||||
<h4 class="modal-title">Admin Login</h4>
|
||||
@@ -190,7 +191,8 @@ ${next.modalIncludes()}
|
||||
</div>
|
||||
</div>
|
||||
<div class="modal-footer">
|
||||
<button type="submit" class="btn btn-bright login-button"><i class="fa fa-sign-in"></i> Sign In</button>
|
||||
<span id="incorrect-login" style="padding-right: 25px; display: none;">Incorrect username or password.</span>
|
||||
<button id="sign-in" type="submit" class="btn btn-bright login-button"><i class="fa fa-sign-in"></i> Sign In</button>
|
||||
</div>
|
||||
<input type="hidden" id="admin_login" name="admin_login" value="1" />
|
||||
</form>
|
||||
@@ -282,6 +284,9 @@ ${next.modalIncludes()}
|
||||
<script src="${http_root}js/pnotify.custom.min.js"></script>
|
||||
<script src="${http_root}js/script.js${cache_param}"></script>
|
||||
<script src="${http_root}js/jquery.qrcode.min.js"></script>
|
||||
% if _session['user_group'] == 'admin' and BROWSER_NOTIFIERS:
|
||||
<script src="${http_root}js/ajaxNotifications.js"></script>
|
||||
% endif
|
||||
<script>
|
||||
% if _session['user_group'] == 'admin':
|
||||
$('#updateDismiss').click(function() {
|
||||
@@ -386,6 +391,29 @@ ${next.modalIncludes()}
|
||||
$('#admin-login-modal').on('shown.bs.modal', function () {
|
||||
$('#admin-login-modal #username').focus()
|
||||
})
|
||||
|
||||
$('#login-form').submit(function(event) {
|
||||
event.preventDefault();
|
||||
$('#sign-in').prop('disabled', true).html('<i class="fa fa-refresh fa-spin"></i> Sign In');
|
||||
$.ajax({
|
||||
url: '${http_root}auth/signin',
|
||||
type: 'POST',
|
||||
data: $(this).serialize(),
|
||||
dataType: 'json',
|
||||
statusCode: {
|
||||
200: function() {
|
||||
window.location = "${http_root}";
|
||||
},
|
||||
401: function() {
|
||||
$('#incorrect-login').show();
|
||||
$('#username').focus();
|
||||
}
|
||||
},
|
||||
complete: function() {
|
||||
$('#sign-in').prop('disabled', false).html('<i class="fa fa-sign-in"></i> Sign In');
|
||||
}
|
||||
});
|
||||
});
|
||||
% endif
|
||||
</script>
|
||||
${next.javascriptIncludes()}
|
||||
|
@@ -84,7 +84,7 @@ DOCUMENTATION :: END
|
||||
<tr>
|
||||
<td>Support:</td>
|
||||
<td>
|
||||
<a class="no-highlight support-modal-link" href="${anon_url('https://discord.gg/36ggawe')}" target="_blank">Tautulli Discord Server</a> |
|
||||
<a class="no-highlight support-modal-link" href="${anon_url('https://discord.gg/tQcWEUp')}" target="_blank">Tautulli Discord Server</a> |
|
||||
<a class="no-highlight support-modal-link" href="${anon_url('https://www.reddit.com/r/Tautulli')}" target="_blank">Tautulli Subreddit</a> |
|
||||
<a class="no-highlight support-modal-link" href="${anon_url('https://forums.plex.tv/discussion/169591/plexpy-another-plex-monitoring-program')}" target="_blank">Plex Forums</a>
|
||||
</td>
|
||||
|
@@ -13,18 +13,6 @@ a:focus {
|
||||
text-decoration: none;
|
||||
outline: none;
|
||||
}
|
||||
a.svg {
|
||||
position: relative;
|
||||
display: inline-block;
|
||||
}
|
||||
a.svg:after {
|
||||
content: "";
|
||||
position: absolute;
|
||||
top: 0;
|
||||
right: 0;
|
||||
bottom: 0;
|
||||
left: 0;
|
||||
}
|
||||
select, .react-selectize.bootstrap3.root-node .react-selectize-control {
|
||||
margin: 5px 0 5px 0;
|
||||
border: 2px solid #444;
|
||||
@@ -83,11 +71,20 @@ select.form-control {
|
||||
border-radius: 3px;
|
||||
transition: background-color .3s;
|
||||
}
|
||||
.react-selectize.root-node .react-selectize-control {
|
||||
.react-selectize.root-node .react-selectize-control,
|
||||
.selectize-control.form-control .selectize-input {
|
||||
color: #fff !important;
|
||||
border: 0px solid #444 !important;
|
||||
background: #555 !important;
|
||||
padding: 1px 2px;
|
||||
transition: background-color .3s;
|
||||
}
|
||||
.selectize-control.form-control .selectize-input {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
flex-wrap: wrap;
|
||||
margin-bottom: 4px;
|
||||
padding-left: 5px;
|
||||
}
|
||||
.react-selectize.root-node .react-selectize-control .react-selectize-placeholder {
|
||||
color: #fff !important;
|
||||
@@ -95,20 +92,86 @@ select.form-control {
|
||||
.react-selectize.root-node .react-selectize-control .react-selectize-toggle-button path {
|
||||
fill: #fff !important;
|
||||
}
|
||||
.react-selectize.root-node .simple-value,
|
||||
.selectize-control.multi .selectize-input > div {
|
||||
background: #444444 !important;
|
||||
color: #ffffff !important;
|
||||
padding-bottom: 2px !important;
|
||||
transition: background-color .3s;
|
||||
}
|
||||
.react-selectize.root-node .simple-value span {
|
||||
padding-bottom: 2px !important;
|
||||
}
|
||||
.react-selectize.root-node .react-selectize-control .react-selectize-search-field-and-selected-values .resizable-input{
|
||||
.react-selectize.root-node .react-selectize-control .react-selectize-search-field-and-selected-values .value-wrapper:not(:first-child):before {
|
||||
content: "or";
|
||||
padding: 0 3px;
|
||||
text-transform: uppercase;
|
||||
font-size: 10px;
|
||||
}
|
||||
.react-selectize.root-node .react-selectize-control .react-selectize-search-field-and-selected-values .resizable-input {
|
||||
padding-top: 3px !important;
|
||||
padding-bottom: 3px !important;
|
||||
}
|
||||
select.form-control:focus {
|
||||
select.form-control:focus,
|
||||
.react-selectize.root-node.open .react-selectize-control,
|
||||
.selectize-control.form-control .selectize-input.focus {
|
||||
outline: 0;
|
||||
outline: thin dotted \9;
|
||||
color: #555;
|
||||
background-color: #fff;
|
||||
color: #555 !important;
|
||||
background-color: #fff !important;
|
||||
transition: background-color .3s;
|
||||
}
|
||||
.react-selectize.root-node.open .simple-value,
|
||||
.selectize-control.multi .selectize-input.focus > div,
|
||||
.selectize-control.multi .selectize-input > div.active {
|
||||
background: #efefef !important;
|
||||
color: #333333 !important;
|
||||
transition: background-color .3s;
|
||||
}
|
||||
.react-selectize.root-node.open .react-selectize-control .react-selectize-toggle-button path {
|
||||
fill: #999 !important;
|
||||
}
|
||||
.selectize-control .selectize-input > div .item-value {
|
||||
opacity: 0.8;
|
||||
font-size: 12px;
|
||||
}
|
||||
.selectize-control .selectize-input > div .item-text + .item-value {
|
||||
margin-left: 5px;
|
||||
}
|
||||
.selectize-control .selectize-input > div .item-value:before {
|
||||
content: '<';
|
||||
opacity: 0.8;
|
||||
font-size: 12px;
|
||||
}
|
||||
.selectize-control .selectize-input > div .item-value:after {
|
||||
content: '>';
|
||||
opacity: 0.8;
|
||||
font-size: 12px;
|
||||
}
|
||||
.selectize-control .selectize-dropdown .caption {
|
||||
font-size: 12px;
|
||||
display: block;
|
||||
color: #a0a0a0;
|
||||
}
|
||||
.selectize-control .selectize-dropdown .select-all,
|
||||
.selectize-control .selectize-dropdown .remove-all {
|
||||
font-weight: bold;
|
||||
}
|
||||
.selectize-control .selectize-dropdown .border-all {
|
||||
pointer-events: none;
|
||||
display: block;
|
||||
height: 1px;
|
||||
margin: 9px -12px 9px -12px;
|
||||
padding: 0 !important;
|
||||
overflow: hidden;
|
||||
background-color: #e5e5e5;
|
||||
}
|
||||
.selectize-control .selectize-dropdown .border-all:last-child {
|
||||
display: none;
|
||||
}
|
||||
.selectize-dropdown .optgroup-header {
|
||||
font-weight: bold;
|
||||
}
|
||||
select.form-control option {
|
||||
color: #555;
|
||||
background-color: #fff;
|
||||
@@ -118,6 +181,9 @@ img {
|
||||
-moz-box-sizing: content-box;
|
||||
box-sizing: content-box;
|
||||
}
|
||||
object {
|
||||
pointer-events: none;
|
||||
}
|
||||
.navbar {
|
||||
background: #000;
|
||||
-webkit-box-shadow: 0 0 0 3px rgba(0,0,0,.2);
|
||||
@@ -154,7 +220,7 @@ img {
|
||||
}
|
||||
.nav .open > a, .nav .open > a:hover, .nav .open > a:focus {
|
||||
background-color: #2f2f2f;
|
||||
border-color: none;
|
||||
border-color: unset;
|
||||
}
|
||||
.dropdown-menu {
|
||||
background-color: #282828;
|
||||
@@ -640,8 +706,8 @@ a .users-poster-face:hover {
|
||||
height: 290px;
|
||||
min-width: 350px;
|
||||
max-width: 500px;
|
||||
margin-right: 20px;
|
||||
margin-bottom: 20px;
|
||||
margin-right: 25px;
|
||||
margin-bottom: 25px;
|
||||
}
|
||||
.dashboard-activity-container {
|
||||
height: 240px;
|
||||
@@ -853,6 +919,18 @@ a .users-poster-face:hover {
|
||||
-webkit-flex-grow: 1;
|
||||
flex-grow: 1;
|
||||
}
|
||||
.dashboard-activity-info-item .sub-value.platform-right {
|
||||
margin-right: 55px;
|
||||
text-overflow: ellipsis;
|
||||
overflow: hidden;
|
||||
white-space: nowrap;
|
||||
}
|
||||
.dashboard-activity-info-item .sub-value.time-right {
|
||||
margin-right: 60px;
|
||||
text-overflow: ellipsis;
|
||||
overflow: hidden;
|
||||
white-space: nowrap;
|
||||
}
|
||||
.dashboard-activity-info-item .sub-value .ip-container {
|
||||
display: inline-flex;
|
||||
}
|
||||
@@ -910,7 +988,6 @@ a .users-poster-face:hover {
|
||||
background-image: -o-linear-gradient(top, #fbb450, #f89406);
|
||||
background-image: linear-gradient(to bottom, #fbb450, #f89406);
|
||||
background-repeat: repeat-x;
|
||||
filter: progid:DXImageTransform.Microsoft.gradient(startColorstr='#fffbb450', endColorstr='#fff89406', GradientType=0);
|
||||
position: absolute;
|
||||
height: 100%;
|
||||
max-width: 100%;
|
||||
@@ -1047,8 +1124,8 @@ a .dashboard-activity-metadata-user-thumb:hover {
|
||||
height: 160px;
|
||||
min-width: 350px;
|
||||
max-width: 500px;
|
||||
margin-right: 20px;
|
||||
margin-bottom: 20px;
|
||||
margin-right: 25px;
|
||||
margin-bottom: 25px;
|
||||
}
|
||||
.dashboard-stats-container {
|
||||
height: 160px;
|
||||
@@ -1270,7 +1347,7 @@ a .dashboard-activity-metadata-user-thumb:hover {
|
||||
.dashboard-stats-info {
|
||||
width: 100%;
|
||||
font-size: 12px;
|
||||
padding: 3px 0 5px 15px;
|
||||
padding: 3px 0 0 15px;
|
||||
position: relative;
|
||||
}
|
||||
.dashboard-stats-info-list {
|
||||
@@ -1665,7 +1742,6 @@ a:hover .dashboard-recent-media-cover {
|
||||
background-image: -moz-linear-gradient(top,rgba(0,0,0,.7) 0,rgba(0,0,0,.9) 100%);
|
||||
background-image: linear-gradient(to bottom,rgba(0,0,0,.7) 0,rgba(0,0,0,.9) 100%);
|
||||
background-repeat: repeat-x;
|
||||
filter: progid:DXImageTransform.Microsoft.gradient(startColorstr='#b3000000', endColorstr='#e6000000', GradientType=0);
|
||||
webkit-box-shadow: inset 0 0 0 2px #e9a049;
|
||||
-moz-box-shadow: inset 0 0 0 2px #e9a049;
|
||||
box-shadow: inset 0 0 0 2px #e9a049;
|
||||
@@ -1683,6 +1759,18 @@ a:hover .dashboard-recent-media-cover {
|
||||
opacity: 0;
|
||||
transition: opacity .3s;
|
||||
}
|
||||
.summary-poster-face-overlay span:before {
|
||||
content: "View On";
|
||||
color: #999;
|
||||
font-size: 13px;
|
||||
font-weight: bold;
|
||||
text-transform: uppercase;
|
||||
text-align: center;
|
||||
display: block;
|
||||
position: absolute;
|
||||
top: calc(50% - 34px);
|
||||
width: 100%;
|
||||
}
|
||||
a:hover .summary-poster-face .summary-poster-face-overlay,
|
||||
a:hover .summary-poster-face-episode .summary-poster-face-overlay,
|
||||
a:hover .summary-poster-face-track .summary-poster-face-overlay,
|
||||
@@ -3719,7 +3807,11 @@ a:hover .overlay-refresh-image:hover {
|
||||
.no-image {
|
||||
background-image: none !important;
|
||||
}
|
||||
|
||||
#info-modal .stream-info-current {
|
||||
color: #aaa;
|
||||
text-align: center;
|
||||
padding-bottom: 10px;
|
||||
}
|
||||
#info-modal .stream-info-item {
|
||||
display: flex;
|
||||
flex-direction: row;
|
||||
|
@@ -64,6 +64,7 @@ DOCUMENTATION :: END
|
||||
from collections import defaultdict
|
||||
from urllib import quote
|
||||
from plexpy import helpers
|
||||
from plexpy.common import VIDEO_RESOLUTION_OVERRIDES, AUDIO_CODEC_OVERRIDES
|
||||
import plexpy
|
||||
%>
|
||||
<% data = defaultdict(lambda: 'Unknown', **session) %>
|
||||
@@ -134,15 +135,15 @@ DOCUMENTATION :: END
|
||||
<ul class="list-unstyled dashboard-activity-info-list">
|
||||
<li class="dashboard-activity-info-item">
|
||||
<div class="sub-heading">Product</div>
|
||||
<div class="sub-value">${data['product']}</div>
|
||||
<div class="sub-value platform-right">${data['product']}</div>
|
||||
</li>
|
||||
<li class="dashboard-activity-info-item">
|
||||
<div class="sub-heading">Player</div>
|
||||
<div class="sub-value">${data['player']}</div>
|
||||
<div class="sub-value platform-right">${data['player']}</div>
|
||||
</li>
|
||||
<li class="dashboard-activity-info-item">
|
||||
<div class="sub-heading">Quality</div>
|
||||
<div class="sub-value" id="stream_quality-${sk}">
|
||||
<div class="sub-value platform-right" id="stream_quality-${sk}">
|
||||
% if data['media_type'] != 'photo' and data['quality_profile'] != 'Unknown':
|
||||
<%
|
||||
br = helpers.cast_to_int(data['stream_bitrate']) or ''
|
||||
@@ -200,8 +201,8 @@ DOCUMENTATION :: END
|
||||
<li class="dashboard-activity-info-item">
|
||||
<div class="sub-heading">Container</div>
|
||||
<div class="sub-value" id="transcode_container-${sk}">
|
||||
% if data.get('stream_container_decision') == 'transcode':
|
||||
Transcode (${data['container'].upper()} → ${data['stream_container'].upper()})
|
||||
% if data['stream_container_decision'] == 'transcode':
|
||||
Transcode (${data['container'].upper()} <i class="fa fa-long-arrow-right"></i> ${data['stream_container'].upper()})
|
||||
% else:
|
||||
Direct Play (${data['container'].upper()})
|
||||
% endif
|
||||
@@ -212,19 +213,16 @@ DOCUMENTATION :: END
|
||||
<div class="sub-heading">Video</div>
|
||||
<div class="sub-value" id="video_decision-${sk}">
|
||||
% if data['media_type'] in ('movie', 'episode', 'clip'):
|
||||
% if data.get('stream_video_decision') == 'transcode':
|
||||
% if data['stream_video_decision'] == 'transcode':
|
||||
<%
|
||||
hw_d = hw_e = ''
|
||||
if data['transcode_hw_requested'] == 1 and data['transcode_hw_full_pipeline'] == 0:
|
||||
hw_d = ' (HW)'
|
||||
elif data['transcode_hw_requested'] == 1 and data['transcode_hw_full_pipeline'] == 1:
|
||||
hw_d = hw_e = ' (HW)'
|
||||
hw_d = ' (HW)' if data['transcode_hw_decoding'] else ''
|
||||
hw_e = ' (HW)' if data['transcode_hw_encoding'] else ''
|
||||
%>
|
||||
Transcode (${data['video_codec'].upper()}${hw_d} ${plexpy.common.VIDEO_RESOLUTION_OVERRIDES.get(data['video_resolution'], data['video_resolution'])} → ${data['stream_video_codec'].upper()}${hw_e} ${plexpy.common.VIDEO_RESOLUTION_OVERRIDES.get(data['stream_video_resolution'], data['stream_video_resolution'])})
|
||||
% elif data.get('stream_video_decision') == 'copy':
|
||||
Direct Stream (${data['stream_video_codec'].upper()} ${plexpy.common.VIDEO_RESOLUTION_OVERRIDES.get(data['stream_video_resolution'], data['stream_video_resolution'])})
|
||||
Transcode (${data['video_codec'].upper()}${hw_d} ${VIDEO_RESOLUTION_OVERRIDES.get(data['video_resolution'], data['video_resolution'])} <i class="fa fa-long-arrow-right"></i> ${data['stream_video_codec'].upper()}${hw_e} ${VIDEO_RESOLUTION_OVERRIDES.get(data['stream_video_resolution'], data['stream_video_resolution'])})
|
||||
% elif data['stream_video_decision'] == 'copy':
|
||||
Direct Stream (${data['stream_video_codec'].upper()} ${VIDEO_RESOLUTION_OVERRIDES.get(data['stream_video_resolution'], data['stream_video_resolution'])})
|
||||
% else:
|
||||
Direct Play (${data['video_codec'].upper()} ${plexpy.common.VIDEO_RESOLUTION_OVERRIDES.get(data['video_resolution'], data['video_resolution'])})
|
||||
Direct Play (${data['video_codec'].upper()} ${VIDEO_RESOLUTION_OVERRIDES.get(data['video_resolution'], data['video_resolution'])})
|
||||
% endif
|
||||
% elif data['media_type'] == 'photo':
|
||||
Direct Play (${data['width']}x${data['height']})
|
||||
@@ -236,12 +234,12 @@ DOCUMENTATION :: END
|
||||
<li class="dashboard-activity-info-item">
|
||||
<div class="sub-heading">Audio</div>
|
||||
<div class="sub-value" id="audio_decision-${sk}">
|
||||
% if data.get('stream_audio_decision') == 'transcode':
|
||||
Transcode (${plexpy.common.AUDIO_CODEC_OVERRIDES.get(data['audio_codec'], data['audio_codec'].upper())} ${data['audio_channel_layout'].split('(')[0].capitalize()} → ${plexpy.common.AUDIO_CODEC_OVERRIDES.get(data['stream_audio_codec'], data['stream_audio_codec'].upper())} ${data['stream_audio_channel_layout'].split('(')[0].capitalize()})
|
||||
% elif data.get('stream_audio_decision') == 'copy':
|
||||
Direct Stream (${plexpy.common.AUDIO_CODEC_OVERRIDES.get(data['stream_audio_codec'], data['stream_audio_codec'].upper())} ${data['stream_audio_channel_layout'].split('(')[0].capitalize()})
|
||||
% if data['stream_audio_decision'] == 'transcode':
|
||||
Transcode (${AUDIO_CODEC_OVERRIDES.get(data['audio_codec'], data['audio_codec'].upper())} ${data['audio_channel_layout'].split('(')[0].capitalize()} <i class="fa fa-long-arrow-right"></i> ${AUDIO_CODEC_OVERRIDES.get(data['stream_audio_codec'], data['stream_audio_codec'].upper())} ${data['stream_audio_channel_layout'].split('(')[0].capitalize()})
|
||||
% elif data['stream_audio_decision'] == 'copy':
|
||||
Direct Stream (${AUDIO_CODEC_OVERRIDES.get(data['stream_audio_codec'], data['stream_audio_codec'].upper())} ${data['stream_audio_channel_layout'].split('(')[0].capitalize()})
|
||||
% else:
|
||||
Direct Play (${plexpy.common.AUDIO_CODEC_OVERRIDES.get(data['audio_codec'], data['audio_codec'].upper())} ${data['audio_channel_layout'].split('(')[0].capitalize()})
|
||||
Direct Play (${AUDIO_CODEC_OVERRIDES.get(data['audio_codec'], data['audio_codec'].upper())} ${data['audio_channel_layout'].split('(')[0].capitalize()})
|
||||
% endif
|
||||
</div>
|
||||
</li>
|
||||
@@ -252,7 +250,7 @@ DOCUMENTATION :: END
|
||||
<div class="sub-value" id="subtitle_decision-${sk}">
|
||||
% if data['subtitles'] == 1:
|
||||
% if data['stream_subtitle_decision'] == 'transcode':
|
||||
Transcode (${data['subtitle_codec'].upper()} → ${data['stream_subtitle_codec'].upper()})
|
||||
Transcode (${data['subtitle_codec'].upper()} <i class="fa fa-long-arrow-right"></i> ${data['stream_subtitle_codec'].upper()})
|
||||
% elif data['stream_subtitle_decision'] == 'copy':
|
||||
Direct Stream (${data['subtitle_codec'].upper()})
|
||||
% elif data['stream_subtitle_decision'] == 'burn':
|
||||
@@ -270,9 +268,9 @@ DOCUMENTATION :: END
|
||||
<ul class="list-unstyled dashboard-activity-info-list">
|
||||
<li class="dashboard-activity-info-item">
|
||||
<div class="sub-heading">Location</div>
|
||||
<div class="sub-value">
|
||||
<div class="sub-value time-right">
|
||||
% if data['ip_address'] != 'N/A':
|
||||
${'LAN' if data['local'] == 1 else 'WAN'}: <span class="ip-container"><span class="ip-address">${data['ip_address']}</span></span>
|
||||
<span id="location-${sk}">${data['location'].upper()}</span>: <span class="ip-container"><span class="ip-address">${data['ip_address']}</span></span>
|
||||
<a href="#" class="external_ip-modal" data-toggle="modal" data-target="#ip-info-modal" data-ip="${data['ip_address']}">
|
||||
<span id="external_ip-${sk}" class="external-ip-tooltip" data-toggle="tooltip" title="Lookup External IP" style="display: none;"><i class="fa fa-map-marker"></i></span>
|
||||
</a>
|
||||
@@ -290,7 +288,7 @@ DOCUMENTATION :: END
|
||||
</li>
|
||||
<li class="dashboard-activity-info-item">
|
||||
<div class="sub-heading">Bandwidth</div>
|
||||
<div class="sub-value">
|
||||
<div class="sub-value time-right">
|
||||
% if data['media_type'] != 'photo' and helpers.cast_to_int(data['bandwidth']):
|
||||
<%
|
||||
bw = helpers.cast_to_int(data['bandwidth'])
|
||||
@@ -314,7 +312,9 @@ DOCUMENTATION :: END
|
||||
</div>
|
||||
% if data['media_type'] != 'photo':
|
||||
<div class="dashboard-activity-info-time">
|
||||
% if data['view_offset']:
|
||||
% if data['live'] == 1:
|
||||
<br />Live
|
||||
% elif data['view_offset']:
|
||||
ETA:
|
||||
<span id="stream-eta-${sk}">
|
||||
<script>
|
||||
@@ -342,8 +342,12 @@ DOCUMENTATION :: END
|
||||
</div>
|
||||
<div class="dashboard-activity-progress">
|
||||
<div class="dashboard-activity-progress-bar">
|
||||
% if data['live'] == 1:
|
||||
<div id="progress-bar-${sk}" class="progress-bar" style="width: 100%" data-toggle="tooltip" title="Stream Progress Live">Live</div>
|
||||
% else:
|
||||
<div id="buffer-bar-${sk}" class="buffer-bar" style="width: ${data['transcode_progress']}%" data-toggle="tooltip" title="Transcoder Progress ${data['transcode_progress']}%">${data['transcode_progress']}%</div>
|
||||
<div id="progress-bar-${sk}" class="progress-bar" style="width: ${data['progress_percent']}%" data-last_view_offset="${data['view_offset']}" data-view_offset="${data['view_offset']}" data-stream_duration="${data['stream_duration']}" data-state="${data['state']}" data-toggle="tooltip" title="Stream Progress ${data['progress_percent']}%">${data['progress_percent']}%</div>
|
||||
% endif
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
@@ -391,7 +395,11 @@ DOCUMENTATION :: END
|
||||
</div>
|
||||
</div>
|
||||
<div class="dashboard-activity-metadata-subtitle-container">
|
||||
% if data['channel_stream'] == 0:
|
||||
% if data['live'] == 1:
|
||||
<div id="media-type-${sk}" class="dashboard-activity-metadata-media_type-icon" title="Plex Live TV">
|
||||
<i class="fa fa-fw fa-television"></i>
|
||||
</div>
|
||||
% elif data['channel_stream'] == 0:
|
||||
<div id="media-type-${sk}" class="dashboard-activity-metadata-media_type-icon" title="${data['media_type'].capitalize()}">
|
||||
% if data['media_type'] == 'movie':
|
||||
<i class="fa fa-fw fa-film"></i>
|
||||
@@ -406,12 +414,14 @@ DOCUMENTATION :: END
|
||||
% endif
|
||||
</div>
|
||||
% else:
|
||||
<div id="media-type-${sk}" title="Channel">
|
||||
<div id="media-type-${sk}" class="dashboard-activity-metadata-media_type-icon" title="Channel">
|
||||
<i class="fa fa-fw fa-cloud"></i>
|
||||
</div>
|
||||
% endif
|
||||
<div class="dashboard-activity-metadata-subtitle">
|
||||
% if data['channel_stream'] == 0:
|
||||
% if data['live'] == 1:
|
||||
<span title="Plex Live TV" class="sub-heading">Plex Live TV</span>
|
||||
% elif data['channel_stream'] == 0:
|
||||
% if data['media_type'] == 'movie':
|
||||
<span title="${data['year']}" class="sub-heading">${data['year']}</span>
|
||||
% elif data['media_type'] == 'episode':
|
||||
|
@@ -47,24 +47,12 @@ DOCUMENTATION :: END
|
||||
</div>
|
||||
<p class="help-block">Change the library's picture in Tautulli. To reset to default, leave this field empty and save.</p>
|
||||
</div>
|
||||
<div class="checkbox">
|
||||
<label>
|
||||
<input type="checkbox" id="do_notify" name="do_notify" value="1" ${helpers.checked(data['do_notify'])}> Enable notifications
|
||||
</label>
|
||||
<p class="help-block">Uncheck this if you do not want to receive notifications for this library's activity.</p>
|
||||
</div>
|
||||
<div class="checkbox">
|
||||
<label>
|
||||
<input type="checkbox" id="keep_history" name="keep_history" value="1" ${helpers.checked(data['keep_history'])}> Keep history
|
||||
</label>
|
||||
<p class="help-block">Uncheck this if you do not want to keep any history on this library's activity.</p>
|
||||
</div>
|
||||
<div class="checkbox">
|
||||
<label>
|
||||
<input type="checkbox" id="do_notify_created" name="do_notify_created" value="1" ${helpers.checked(data['do_notify_created'])}> Enable recently added notifications
|
||||
</label>
|
||||
<p class="help-block">Uncheck this if you do not want to receive recently added notifications for this library.</p>
|
||||
</div>
|
||||
% if data['section_id']:
|
||||
<div class="form-group">
|
||||
<button class="btn btn-danger" id="delete-all-history">Purge</button>
|
||||
@@ -85,15 +73,7 @@ DOCUMENTATION :: END
|
||||
// Save library options
|
||||
$("#save_library").on('click', function () {
|
||||
var custom_thumb = $("#custom_thumb_url").val();
|
||||
var do_notify = 0;
|
||||
var do_notify_created = 0;
|
||||
var keep_history = 0;
|
||||
if ($("#do_notify").is(":checked")) {
|
||||
do_notify = 1;
|
||||
}
|
||||
if ($("#do_notify_created").is(":checked")) {
|
||||
do_notify_created = 1;
|
||||
}
|
||||
if ($("#keep_history").is(":checked")) {
|
||||
keep_history = 1;
|
||||
}
|
||||
@@ -103,8 +83,6 @@ DOCUMENTATION :: END
|
||||
data: {
|
||||
section_id: '${data["section_id"]}',
|
||||
custom_thumb: custom_thumb,
|
||||
do_notify: do_notify,
|
||||
do_notify_created: do_notify_created,
|
||||
keep_history: keep_history
|
||||
},
|
||||
cache: false,
|
||||
|
@@ -56,12 +56,6 @@ DOCUMENTATION :: END
|
||||
</div>
|
||||
<p class="help-block">Change the users profile picture in Tautulli. To reset to default, leave this field empty and save.</p>
|
||||
</div>
|
||||
<div class="checkbox">
|
||||
<label>
|
||||
<input type="checkbox" id="do_notify" name="do_notify" value="1" ${helpers.checked(data['do_notify'])}> Enable notifications
|
||||
</label>
|
||||
<p class="help-block">Uncheck this if you do not want to receive notifications for this user's activity.</p>
|
||||
</div>
|
||||
<div class="checkbox">
|
||||
<label>
|
||||
<input type="checkbox" id="keep_history" name="keep_history" value="1" ${helpers.checked(data['keep_history'])}> Keep history
|
||||
@@ -95,12 +89,8 @@ DOCUMENTATION :: END
|
||||
$("#save_user").on('click', function () {
|
||||
var friendly_name = $("input#friendly_name").val();
|
||||
var custom_thumb = $("#custom_avatar_url").val();
|
||||
var do_notify = 0;
|
||||
var keep_history = 0;
|
||||
var allow_guest = 0;
|
||||
if ($("#do_notify").is(":checked")) {
|
||||
do_notify = 1;
|
||||
}
|
||||
if ($("#keep_history").is(":checked")) {
|
||||
keep_history = 1;
|
||||
}
|
||||
@@ -114,7 +104,6 @@ DOCUMENTATION :: END
|
||||
user_id: '${data["user_id"]}',
|
||||
friendly_name: friendly_name,
|
||||
custom_thumb: custom_thumb,
|
||||
do_notify: do_notify,
|
||||
keep_history: keep_history,
|
||||
allow_guest: allow_guest
|
||||
},
|
||||
|
@@ -114,7 +114,7 @@
|
||||
$.ajax({
|
||||
url: 'get_user_names',
|
||||
type: 'get',
|
||||
dataType: "json",
|
||||
dataType: 'json',
|
||||
success: function (data) {
|
||||
var select = $('#history-user');
|
||||
data.sort(function (a, b) {
|
||||
@@ -130,7 +130,6 @@
|
||||
function loadHistoryTable(media_type, selected_user_id) {
|
||||
history_table_options.ajax = {
|
||||
url: 'get_history',
|
||||
type: 'post',
|
||||
data: function (d) {
|
||||
return {
|
||||
json_data: JSON.stringify(d),
|
||||
@@ -138,9 +137,13 @@
|
||||
user_id: selected_user_id
|
||||
};
|
||||
}
|
||||
}
|
||||
};
|
||||
history_table = $('#history_table').DataTable(history_table_options);
|
||||
var colvis = new $.fn.dataTable.ColVis(history_table, { buttonText: '<i class="fa fa-columns"></i> Select columns', buttonClass: 'btn btn-dark', exclude: [0, 11] });
|
||||
var colvis = new $.fn.dataTable.ColVis(history_table, {
|
||||
buttonText: '<i class="fa fa-columns"></i> Select columns',
|
||||
buttonClass: 'btn btn-dark',
|
||||
exclude: [0, 11]
|
||||
});
|
||||
$(colvis.button()).appendTo('div.colvis-button-bar');
|
||||
|
||||
clearSearchButton('history_table', history_table);
|
||||
@@ -160,7 +163,7 @@
|
||||
}
|
||||
|
||||
var media_type = null;
|
||||
var selected_user_id = "${_session['user_id']}" == "None" ? null : "${_session['user_id']}"
|
||||
var selected_user_id = "${_session['user_id']}" == "None" ? null : "${_session['user_id']}";
|
||||
loadHistoryTable(media_type, selected_user_id);
|
||||
|
||||
% if _session['user_group'] == 'admin':
|
||||
|
Binary file not shown.
Before Width: | Height: | Size: 23 KiB After Width: | Height: | Size: 28 KiB |
@@ -131,12 +131,13 @@
|
||||
<%def name="modalIncludes()">
|
||||
|
||||
% if _session['user_group'] == 'admin' and config['update_show_changelog']:
|
||||
<% from plexpy.common import VERSION_NUMBER %>
|
||||
<div id="changelog-modal" class="modal fade wide" tabindex="-1" role="dialog" aria-labelledby="changelog-modal">
|
||||
<div class="modal-dialog" role="document">
|
||||
<div class="modal-content">
|
||||
<div class="modal-header">
|
||||
<button type="button" class="close" data-dismiss="modal" aria-hidden="true"><i class="fa fa-remove"></i></button>
|
||||
<h4 class="modal-title">Tautulli Updated</h4>
|
||||
<h4 class="modal-title">Tautulli Updated to <strong>${VERSION_NUMBER}</strong></h4>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
</div>
|
||||
@@ -292,7 +293,9 @@
|
||||
var sc_dp = current_activity.stream_count_direct_play,
|
||||
sc_ds = current_activity.stream_count_direct_stream,
|
||||
sc_tc = current_activity.stream_count_transcode,
|
||||
total_bw = current_activity.total_bandwidth;
|
||||
total_bw = current_activity.total_bandwidth,
|
||||
lan_bw = current_activity.lan_bandwidth,
|
||||
wan_bw = current_activity.wan_bandwidth;
|
||||
var streams_header = stream_count + ' stream' + (stream_count > 1 ? 's' : '') + ' (';
|
||||
if (sc_dp) {
|
||||
streams_header += sc_dp + ' direct play' + (sc_dp > 1 ? 's' : '') + ', ';
|
||||
@@ -306,13 +309,23 @@
|
||||
streams_header = streams_header.replace(/, $/, '') + ')';
|
||||
$('#currentActivityHeader-streams').text(streams_header);
|
||||
|
||||
var bandwidth_header = (total_bw > 1000) ? ((total_bw / 1000).toFixed(1) + ' Mbps') : (total_bw + ' kbps');
|
||||
var bandwidth_header = ((total_bw > 1000) ? ((total_bw / 1000).toFixed(1) + ' Mbps') : (total_bw + ' kbps'));
|
||||
var lan_wan_bandwidth_header = '';
|
||||
if (lan_bw) {
|
||||
lan_wan_bandwidth_header += 'LAN: ' + ((lan_bw > 1000) ? ((lan_bw / 1000).toFixed(1) + ' Mbps') : (lan_bw + ' kbps')) + ', ';
|
||||
}
|
||||
if (wan_bw) {
|
||||
lan_wan_bandwidth_header += 'WAN: ' + ((wan_bw > 1000) ? ((wan_bw / 1000).toFixed(1) + ' Mbps') : (wan_bw + ' kbps')) + ', ';
|
||||
}
|
||||
if (lan_wan_bandwidth_header) {
|
||||
bandwidth_header += ' (' + lan_wan_bandwidth_header.replace(/, $/, '') + ')';
|
||||
}
|
||||
$('#currentActivityHeader-bandwidth').text(bandwidth_header);
|
||||
|
||||
$('#currentActivityHeader').show();
|
||||
|
||||
sessions.forEach(function (session) {
|
||||
var s = new Proxy(session, defaultHandler);
|
||||
var s = (typeof Proxy === "function") ? new Proxy(session, defaultHandler) : session;
|
||||
var key = s.session_key;
|
||||
var session_id = s.session_id;
|
||||
var instance = $('#activity-instance-' + key);
|
||||
@@ -325,25 +338,26 @@
|
||||
}
|
||||
|
||||
// Update play state icon
|
||||
var state_icon = '';
|
||||
switch (s.state) {
|
||||
case 'playing':
|
||||
var state_icon = '<i class="fa fa-fw fa-play"></i> ';
|
||||
state_icon = '<i class="fa fa-fw fa-play"></i> ';
|
||||
break;
|
||||
case 'paused':
|
||||
var state_icon = '<i class="fa fa-fw fa-pause"></i> ';
|
||||
state_icon = '<i class="fa fa-fw fa-pause"></i> ';
|
||||
break;
|
||||
case 'buffering':
|
||||
var state_icon = '<i class="fa fa-fw fa-spinner"></i> ';
|
||||
state_icon = '<i class="fa fa-fw fa-spinner"></i> ';
|
||||
break;
|
||||
default:
|
||||
var state_icon = '<i class="fa fa-fw fa-question-circle"></i> ';
|
||||
state_icon = '<i class="fa fa-fw fa-question-circle"></i> ';
|
||||
}
|
||||
$('#play-state-' + key).html(state_icon).attr('title', capitalizeFirstLetter(s.state));
|
||||
|
||||
// Switching tracks can be under the same session key, so need to update the info.
|
||||
if (s.media_type === 'track') {
|
||||
// Update if artist changed
|
||||
if (s.grandparent_rating_key != instance.data('grandparent_rating_key')) {
|
||||
if (s.grandparent_rating_key !== instance.data('grandparent_rating_key')) {
|
||||
$('#background-' + key).css('background-image', 'url(pms_image_proxy?img=' + s.art + '&width=500&height=280&fallback=art&refresh=true)');
|
||||
$('#metadata-grandparent_title-' + key)
|
||||
.attr('href', 'info?rating_key=' + s.grandparent_rating_key)
|
||||
@@ -351,7 +365,7 @@
|
||||
.text(s.grandparent_title);
|
||||
}
|
||||
// Update cover if album changed
|
||||
if (s.parent_rating_key != instance.data('parent_rating_key')) {
|
||||
if (s.parent_rating_key !== instance.data('parent_rating_key')) {
|
||||
$('#poster-' + key).css('background-image', 'url(pms_image_proxy?img=' + s.parent_thumb + '&width=300&height=300&fallback=poster&refresh=true)');
|
||||
$('#poster-' + key + '-bg').css('background-image', 'url(pms_image_proxy?img=' + s.parent_thumb + '&width=300&height=300&fallback=poster&refresh=true)');
|
||||
$('#poster-url-' + key)
|
||||
@@ -363,7 +377,7 @@
|
||||
.text(s.parent_title);
|
||||
}
|
||||
// Update cover if track changed
|
||||
if (s.parent_rating_key != instance.data('parent_rating_key')) {
|
||||
if (s.parent_rating_key !== instance.data('parent_rating_key')) {
|
||||
$('#metadata-title-' + key)
|
||||
.attr('href', 'info?rating_key=' + s.rating_key)
|
||||
.attr('title', s.title)
|
||||
@@ -374,7 +388,7 @@
|
||||
// Update the transcode state
|
||||
var transcode_decision = '';
|
||||
if (s.transcode_decision === 'transcode') {
|
||||
var throttled = (s.transcode_throttled == 1) ? ' (Throttled)' : ' (Speed: ' + s.transcode_speed + ')';
|
||||
var throttled = (s.transcode_throttled === 1) ? ' (Throttled)' : ' (Speed: ' + s.transcode_speed + ')';
|
||||
transcode_decision = 'Transcode' + throttled;
|
||||
} else if (s.transcode_decision === 'copy') {
|
||||
transcode_decision = 'Direct Stream';
|
||||
@@ -385,44 +399,40 @@
|
||||
|
||||
var transcode_container = '';
|
||||
if (s.stream_container_decision === 'transcode') {
|
||||
transcode_container = 'Transcode (' + s.container.toUpperCase() + ' → ' + s.stream_container.toUpperCase() + ')';
|
||||
transcode_container = 'Transcode (' + s.container.toUpperCase() + ' <i class="fa fa-long-arrow-right"></i> ' + s.stream_container.toUpperCase() + ')';
|
||||
} else {
|
||||
transcode_container = 'Direct Play (' + s.container.toUpperCase() + ')';
|
||||
}
|
||||
$('#transcode_container-' + key).html(transcode_container);
|
||||
|
||||
var video_decision = '';
|
||||
if (['movie', 'episode', 'clip'].indexOf(s.media_type) > -1 && s.video_decision != '') {
|
||||
if (['movie', 'episode', 'clip'].indexOf(s.media_type) > -1 && s.video_decision !== '') {
|
||||
var v_res= '';
|
||||
switch (s.video_resolution.toLowerCase()) {
|
||||
case 'sd':
|
||||
var v_res = 'SD';
|
||||
v_res = 'SD';
|
||||
break;
|
||||
case '4k':
|
||||
var v_res = '4k';
|
||||
v_res = '4k';
|
||||
break;
|
||||
default:
|
||||
var v_res = s.video_resolution + 'p'
|
||||
v_res = s.video_resolution + 'p'
|
||||
}
|
||||
var sv_res = '';
|
||||
switch (s.stream_video_resolution.toLowerCase()) {
|
||||
case 'sd':
|
||||
var sv_res = 'SD';
|
||||
sv_res = 'SD';
|
||||
break;
|
||||
case '4k':
|
||||
var sv_res = '4k';
|
||||
sv_res = '4k';
|
||||
break;
|
||||
default:
|
||||
var sv_res = s.stream_video_resolution + 'p'
|
||||
sv_res = s.stream_video_resolution + 'p'
|
||||
}
|
||||
if (s.stream_video_decision === 'transcode') {
|
||||
var hw_d = '';
|
||||
var hw_e = '';
|
||||
if (s.transcode_hw_requested === 1 && s.transcode_hw_full_pipeline === 0) {
|
||||
hw_d = ' (HW)';
|
||||
} else if (s.transcode_hw_requested === 1 && s.transcode_hw_full_pipeline === 1) {
|
||||
hw_d = ' (HW)';
|
||||
hw_e = ' (HW)';
|
||||
}
|
||||
video_decision = 'Transcode (' + s.video_codec.toUpperCase() + hw_d + ' ' + v_res + ' → ' + s.stream_video_codec.toUpperCase() + hw_e + ' ' + sv_res + ')';
|
||||
var hw_d = (s.transcode_hw_decoding === 1) ? ' (HW)' : '';
|
||||
var hw_e = (s.transcode_hw_encoding === 1) ? ' (HW)' : '';
|
||||
video_decision = 'Transcode (' + s.video_codec.toUpperCase() + hw_d + ' ' + v_res + ' <i class="fa fa-long-arrow-right"></i> ' + s.stream_video_codec.toUpperCase() + hw_e + ' ' + sv_res + ')';
|
||||
} else if (s.stream_video_decision === 'copy') {
|
||||
video_decision = 'Direct Stream (' + s.stream_video_codec.toUpperCase() + ' ' + sv_res + ')';
|
||||
} else {
|
||||
@@ -434,11 +444,11 @@
|
||||
$('#video_decision-' + key).html(video_decision);
|
||||
|
||||
var audio_decision = '';
|
||||
if (['movie', 'episode', 'clip', 'track'].indexOf(s.media_type) > -1 && s.audio_codec) {
|
||||
if (['movie', 'episode', 'clip', 'track'].indexOf(s.media_type) > -1 && s.audio_decision) {
|
||||
var a_codec = (s.audio_codec === 'truehd') ? 'TrueHD' : s.audio_codec.toUpperCase();
|
||||
var sa_codec = (s.stream_audio_codec === 'truehd') ? 'TrueHD' : s.stream_audio_codec.toUpperCase();
|
||||
if (s.stream_audio_decision === 'transcode') {
|
||||
audio_decision = 'Transcode (' + a_codec + ' ' + capitalizeFirstLetter(s.audio_channel_layout.split('(')[0]) + ' → ' + sa_codec + ' ' + capitalizeFirstLetter(s.stream_audio_channel_layout.split('(')[0]) + ')';
|
||||
audio_decision = 'Transcode (' + a_codec + ' ' + capitalizeFirstLetter(s.audio_channel_layout.split('(')[0]) + ' <i class="fa fa-long-arrow-right"></i> ' + sa_codec + ' ' + capitalizeFirstLetter(s.stream_audio_channel_layout.split('(')[0]) + ')';
|
||||
} else if (s.stream_audio_decision === 'copy') {
|
||||
audio_decision = 'Direct Stream (' + sa_codec + ' ' + capitalizeFirstLetter(s.stream_audio_channel_layout.split('(')[0]) + ')';
|
||||
} else {
|
||||
@@ -450,19 +460,19 @@
|
||||
var subtitle_decision = 'None';
|
||||
if (['movie', 'episode', 'clip'].indexOf(s.media_type) > -1 && s.subtitles === 1) {
|
||||
if (s.stream_subtitle_decision === 'transcode') {
|
||||
subtitle_decision = 'Transcode (' + s.subtitle_codec.toUpperCase() + ' → ' + s.stream_subtitle_codec.toUpperCase() + ')';
|
||||
subtitle_decision = 'Transcode (' + s.subtitle_codec.toUpperCase() + ' <i class="fa fa-long-arrow-right"></i> ' + s.stream_subtitle_codec.toUpperCase() + ')';
|
||||
} else if (s.stream_subtitle_decision === 'copy') {
|
||||
subtitle_decision = 'Direct Stream (' + s.subtitle_codec.toUpperCase() + ')';
|
||||
} else if (s.stream_subtitle_decision === 'burn') {
|
||||
subtitle_decision = 'Burn (' + s.subtitle_codec.toUpperCase() + ')';
|
||||
} else {
|
||||
subtitle_decision = 'Direct Play (' + ((s.synced_version == '1') ? s.stream_subtitle_codec.toUpperCase() : s.subtitle_codec.toUpperCase()) + ')';
|
||||
subtitle_decision = 'Direct Play (' + ((s.synced_version === '1') ? s.stream_subtitle_codec.toUpperCase() : s.subtitle_codec.toUpperCase()) + ')';
|
||||
}
|
||||
}
|
||||
$('#subtitle_decision-' + key).html(subtitle_decision);
|
||||
|
||||
// Update the stream quality profile and bandwidth
|
||||
if (s.media_type != 'photo' && s.quality_profile != 'Unknown') {
|
||||
if (s.media_type !== 'photo' && s.quality_profile !== 'Unknown') {
|
||||
var br = parseInt(s.stream_bitrate) || '';
|
||||
if (br) {
|
||||
if (br > 1000) {
|
||||
@@ -478,9 +488,11 @@
|
||||
$('#optimized_version-' + key).html(s.optimized_version_profile + ' (' + s.optimized_version_title + ')');
|
||||
$('#synced_quality_profile-' + key).html(s.synced_quality_profile);
|
||||
|
||||
if (s.media_type != 'photo' && parseInt(s.bandwidth)) {
|
||||
$('#location-' + key).html(s.location.toUpperCase());
|
||||
|
||||
if (s.media_type !== 'photo' && parseInt(s.bandwidth)) {
|
||||
var bw = parseInt(s.bandwidth);
|
||||
if (bw != "Unknown") {
|
||||
if (bw !== "Unknown") {
|
||||
if (bw > 1000) {
|
||||
bw = (bw / 1000).toFixed(1) + ' Mbps';
|
||||
} else {
|
||||
@@ -492,17 +504,19 @@
|
||||
|
||||
// Update the stream progress times
|
||||
$('#stream-eta-' + key).html(moment().add(parseInt(s.duration) - parseInt(s.view_offset), 'milliseconds').format(time_format));
|
||||
$('#stream-view-offset-' + key).data('state', s.state);
|
||||
if ($('#stream-view-offset-' + key).data('last_view_offset') != s.view_offset) {
|
||||
$('#stream-view-offset-' + key).data('last_view_offset', s.view_offset).data('view_offset', s.view_offset);
|
||||
var stream_view_offset = $('#stream-view-offset-' + key);
|
||||
stream_view_offset.data('state', s.state);
|
||||
if (stream_view_offset.data('last_view_offset') !== s.view_offset) {
|
||||
stream_view_offset.data('last_view_offset', s.view_offset).data('view_offset', s.view_offset);
|
||||
}
|
||||
|
||||
// Update the progress bars, percent - 3 because of 3px padding-right
|
||||
$('#buffer-bar-' + key).width(parseInt(s.transcode_progress) - 3 + '%').html(s.transcode_progress + '%')
|
||||
.attr('data-original-title', 'Transcoder Progress ' + s.transcode_progress + '%');
|
||||
$('#progress-bar-' + key).data('state', s.state);
|
||||
if ($('#progress-bar-' + key).data('last_view_offset') != s.view_offset) {
|
||||
$('#progress-bar-' + key).data('last_view_offset', s.view_offset).data('view_offset', s.view_offset);
|
||||
var progress_bar = $('#progress-bar-' + key);
|
||||
progress_bar.data('state', s.state);
|
||||
if (progress_bar.data('last_view_offset') && progress_bar.data('last_view_offset') !== s.view_offset) {
|
||||
progress_bar.data('last_view_offset', s.view_offset).data('view_offset', s.view_offset);
|
||||
}
|
||||
|
||||
// Add temporary class so we know which instances are still active
|
||||
@@ -771,13 +785,13 @@
|
||||
leftTotal = Math.max(Math.min(leftTotal + scrollAmount, 0), leftMax);
|
||||
scroller.animate({ left: leftTotal }, 250);
|
||||
|
||||
if (leftTotal == 0) {
|
||||
if (leftTotal === 0) {
|
||||
$("#recently-added-page-left").addClass("disabled").blur();
|
||||
} else {
|
||||
$("#recently-added-page-left").removeClass("disabled");
|
||||
}
|
||||
|
||||
if (leftTotal == leftMax) {
|
||||
if (leftTotal === leftMax) {
|
||||
$("#recently-added-page-right").addClass("disabled").blur();
|
||||
} else {
|
||||
$("#recently-added-page-right").removeClass("disabled");
|
||||
@@ -809,7 +823,7 @@
|
||||
$.ajax({
|
||||
url: 'get_changelog',
|
||||
data: {
|
||||
latest_only: true,
|
||||
since_prev_release: true,
|
||||
update_shown: true
|
||||
},
|
||||
cache: false,
|
||||
|
@@ -38,20 +38,21 @@ DOCUMENTATION :: END
|
||||
<%!
|
||||
import re
|
||||
|
||||
from plexpy import common, notifiers
|
||||
from plexpy import notifiers
|
||||
from plexpy.common import MEDIA_TYPE_HEADERS, MEDIA_FLAGS_AUDIO, MEDIA_FLAGS_VIDEO
|
||||
|
||||
# Get audio codec file
|
||||
def af(codec):
|
||||
for pattern, file in common.MEDIA_FLAGS_AUDIO.iteritems():
|
||||
for pattern, file_type in MEDIA_FLAGS_AUDIO.iteritems():
|
||||
if re.match(pattern, codec):
|
||||
return file
|
||||
return file_type
|
||||
return codec
|
||||
|
||||
# Get audio codec file
|
||||
def vf(codec):
|
||||
for pattern, file in common.MEDIA_FLAGS_VIDEO.iteritems():
|
||||
for pattern, file_type in MEDIA_FLAGS_VIDEO.iteritems():
|
||||
if re.match(pattern, codec):
|
||||
return file
|
||||
return file_type
|
||||
return codec
|
||||
|
||||
def br(text):
|
||||
@@ -116,9 +117,9 @@ DOCUMENTATION :: END
|
||||
<div class="col-md-9">
|
||||
<div class="summary-content-poster hidden-xs hidden-sm">
|
||||
% if data['media_type'] == 'track':
|
||||
<a href="${config['pms_web_url']}#!/server/${config['pms_identifier']}/details?key=%2Flibrary%2Fmetadata%2F${data['parent_rating_key']}" target="_blank" title="View in Plex Web">
|
||||
<a href="${config['pms_web_url']}#!/server/${config['pms_identifier']}/details?key=%2Flibrary%2Fmetadata%2F${data['parent_rating_key']}" target="_blank" title="View on Plex Web">
|
||||
% else:
|
||||
<a href="${config['pms_web_url']}#!/server/${config['pms_identifier']}/details?key=%2Flibrary%2Fmetadata%2F${data['rating_key']}" target="_blank" title="View in Plex Web">
|
||||
<a href="${config['pms_web_url']}#!/server/${config['pms_identifier']}/details?key=%2Flibrary%2Fmetadata%2F${data['rating_key']}" target="_blank" title="View on Plex Web">
|
||||
% endif
|
||||
% if data['media_type'] == 'episode':
|
||||
<div class="summary-poster-face-episode" style="background-image: url(pms_image_proxy?img=${data['thumb']}&width=500&height=280&fallback=art);">
|
||||
@@ -356,7 +357,7 @@ DOCUMENTATION :: END
|
||||
<div class="col-md-12">
|
||||
<div class="table-card-header">
|
||||
<div class="header-bar">
|
||||
<span>Movies in <strong>${data['title']}</strong> collection</span>
|
||||
<span>${MEDIA_TYPE_HEADERS[data['sub_media_type']]} in <strong>${data['title']}</strong> collection</span>
|
||||
</div>
|
||||
</div>
|
||||
<div class="table-card-back">
|
||||
|
@@ -28,22 +28,15 @@ DOCUMENTATION :: END
|
||||
|
||||
% if data != None:
|
||||
<%
|
||||
from plexpy.common import MEDIA_TYPE_HEADERS
|
||||
types = ('movie', 'show', 'artist', 'album')
|
||||
headers = {'movie': 'Movies',
|
||||
'show': 'TV Shows',
|
||||
'season': 'Seasons',
|
||||
'episode': 'Episodes',
|
||||
'artist': 'Artists',
|
||||
'album': 'Albums',
|
||||
'track': 'Tracks',
|
||||
}
|
||||
%>
|
||||
% for media_type in types:
|
||||
% if data['results_list'][media_type]:
|
||||
<div class="col-md-12">
|
||||
<div class="table-card-header">
|
||||
<div class="header-bar">
|
||||
<span>${headers[media_type]} in <strong>${title}</strong> collection</span>
|
||||
<span>${MEDIA_TYPE_HEADERS[media_type]} in <strong>${title}</strong> collection</span>
|
||||
</div>
|
||||
</div>
|
||||
<div class="table-card-back">
|
||||
|
@@ -2,7 +2,7 @@
|
||||
|
||||
PNotify.prototype.options.addclass = "stack-bottomright";
|
||||
PNotify.prototype.options.buttons.closer_hover = false;
|
||||
PNotify.prototype.options.desktop = { desktop: true, icon: 'images/logo.png' }
|
||||
PNotify.prototype.options.desktop = { desktop: true, icon: 'images/logo-circle.png' };
|
||||
PNotify.prototype.options.history = false;
|
||||
PNotify.prototype.options.shadow = false;
|
||||
PNotify.prototype.options.stack = { dir1: 'up', dir2: 'left', firstpos1: 25, firstpos2: 25 };
|
||||
@@ -21,7 +21,7 @@ function check_notifications() {
|
||||
$.getJSON('get_browser_notifications', function (data) {
|
||||
if (data) {
|
||||
$.each(data, function (i, notification) {
|
||||
if (notification.delay == 0) {
|
||||
if (notification.delay === 0) {
|
||||
PNotify.prototype.options.hide = false;
|
||||
} else {
|
||||
PNotify.prototype.options.hide = true;
|
||||
@@ -34,7 +34,7 @@ function check_notifications() {
|
||||
setTimeout(function () {
|
||||
"use strict";
|
||||
check_notifications();
|
||||
}, 3000);
|
||||
}, 5000);
|
||||
}
|
||||
|
||||
$(document).ready(function () {
|
||||
|
@@ -290,19 +290,9 @@ String.prototype.toProperCase = function () {
|
||||
|
||||
function millisecondsToMinutes(ms, roundToMinute) {
|
||||
if (ms > 0) {
|
||||
seconds = ms / 1000;
|
||||
minutes = seconds / 60;
|
||||
if (roundToMinute) {
|
||||
output = Math.round(minutes, 0)
|
||||
} else {
|
||||
minutesFloor = Math.floor(minutes);
|
||||
secondsReal = Math.round((seconds - (minutesFloor * 60)), 0);
|
||||
if (secondsReal < 10) {
|
||||
secondsReal = '0' + secondsReal;
|
||||
}
|
||||
output = minutesFloor + ':' + secondsReal;
|
||||
}
|
||||
return output;
|
||||
var minutes = Math.floor(ms / 60000);
|
||||
var seconds = ((ms % 60000) / 1000).toFixed(0);
|
||||
return (seconds == 60 ? (minutes+1) + ":00" : minutes + ":" + (seconds < 10 ? "0" : "") + seconds);
|
||||
} else {
|
||||
if (roundToMinute) {
|
||||
return '0';
|
||||
|
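The rewritten millisecondsToMinutes helper above rounds the seconds first and rolls over to the next minute when rounding reaches 60, so a value such as 3599500 ms no longer renders as "59:60". A rough Python equivalent of the same formatting logic (the zero/negative branch is partly truncated in the hunk, so its fallback value is an assumption):

# Rough Python equivalent of the new helper shown above.
def milliseconds_to_minutes(ms, round_to_minute=False):
    if ms <= 0:
        return '0' if round_to_minute else '0:00'
    if round_to_minute:
        return str(int(round(ms / 60000.0)))
    minutes = ms // 60000
    seconds = int(round((ms % 60000) / 1000.0))
    if seconds == 60:
        # Rounding pushed the seconds to a full minute, e.g. 3599500 ms.
        minutes += 1
        seconds = 0
    return '{0}:{1:02d}'.format(minutes, seconds)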
@@ -21,7 +21,7 @@ history_table_options = {
|
||||
"infoFiltered": "<span class='hidden-md hidden-sm hidden-xs'>(filtered from _MAX_ total entries)</span>",
|
||||
"emptyTable": "No data in table",
|
||||
"loadingRecords": '<i class="fa fa-refresh fa-spin"></i> Loading items...</div>'
|
||||
},
|
||||
},
|
||||
"pagingType": "full_numbers",
|
||||
"stateSave": true,
|
||||
"processing": false,
|
||||
@@ -172,7 +172,7 @@ history_table_options = {
|
||||
},
|
||||
"width": "33%",
|
||||
"className": "datatable-wrap"
|
||||
},
|
||||
},
|
||||
{
|
||||
"targets": [7],
|
||||
"data":"started",
|
||||
@@ -322,7 +322,7 @@ history_table_options = {
|
||||
$(row).addClass('current-activity-row');
|
||||
}
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
// Parent table platform modal
|
||||
$('.history_table').on('click', '> tbody > tr > td.modal-control', function () {
|
||||
|
@@ -28,9 +28,7 @@ libraries_list_table_options = {
|
||||
$(td).html('<div class="edit-library-toggles">' +
|
||||
'<button class="btn btn-xs btn-warning delete-library" data-id="' + rowData['section_id'] + '" data-toggle="button"><i class="fa fa-trash-o fa-fw"></i> Delete</button> ' +
|
||||
'<button class="btn btn-xs btn-warning purge-library" data-id="' + rowData['section_id'] + '" data-toggle="button"><i class="fa fa-eraser fa-fw"></i> Purge</button>   ' +
|
||||
'<input type="checkbox" id="do_notify-' + rowData['section_id'] + '" name="do_notify" value="1" ' + rowData['do_notify'] + '><label class="edit-tooltip" for="do_notify-' + rowData['section_id'] + '" data-toggle="tooltip" title="Toggle Notifications"><i class="fa fa-bell fa-lg fa-fw"></i></label> ' +
|
||||
'<input type="checkbox" id="keep_history-' + rowData['section_id'] + '" name="keep_history" value="1" ' + rowData['keep_history'] + '><label class="edit-tooltip" for="keep_history-' + rowData['section_id'] + '" data-toggle="tooltip" title="Toggle History"><i class="fa fa-history fa-lg fa-fw"></i></label> ' +
|
||||
'<input type="checkbox" id="do_notify_created-' + rowData['section_id'] + '" name="do_notify_created" value="1" ' + rowData['do_notify_created'] + '><label class="edit-tooltip" for="do_notify_created-' + rowData['section_id'] + '" data-toggle="tooltip" title="Toggle Recently Added"><i class="fa fa-download fa-lg fa-fw"></i></label> ' +
|
||||
'</div>');
|
||||
},
|
||||
"width": "7%",
|
||||
@@ -258,15 +256,7 @@ $('#libraries_list_table').on('change', 'td.edit-control > .edit-library-toggles
|
||||
var row = libraries_list_table.row(tr);
|
||||
var rowData = row.data();
|
||||
|
||||
var do_notify = 0;
|
||||
var do_notify_created = 0;
|
||||
var keep_history = 0;
|
||||
if ($('#do_notify-' + rowData['section_id']).is(':checked')) {
|
||||
do_notify = 1;
|
||||
}
|
||||
if ($('#do_notify_created-' + rowData['section_id']).is(':checked')) {
|
||||
do_notify_created = 1;
|
||||
}
|
||||
if ($('#keep_history-' + rowData['section_id']).is(':checked')) {
|
||||
keep_history = 1;
|
||||
}
|
||||
@@ -280,8 +270,6 @@ $('#libraries_list_table').on('change', 'td.edit-control > .edit-library-toggles
|
||||
url: 'edit_library',
|
||||
data: {
|
||||
section_id: rowData['section_id'],
|
||||
do_notify: do_notify,
|
||||
do_notify_created: do_notify_created,
|
||||
keep_history: keep_history,
|
||||
custom_thumb: custom_thumb
|
||||
},
|
||||
|
@@ -98,7 +98,7 @@ sync_table_options = {
|
||||
"data": "total_size",
|
||||
"createdCell": function (td, cellData, rowData, row, col) {
|
||||
if (cellData > 0 ) {
|
||||
megabytes = Math.round((cellData/1024)/1024, 0)
|
||||
megabytes = Math.round((cellData/1024)/1024, 0);
|
||||
$(td).html(megabytes + 'MB');
|
||||
} else {
|
||||
$(td).html('0MB');
|
||||
@@ -144,14 +144,16 @@ sync_table_options = {
|
||||
var msg = "<i class='fa fa-refresh fa-spin'></i> Fetching rows...";
|
||||
showMsg(msg, false, false, 0)
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
$('#sync_table').on('click', 'td.delete-control > .edit-sync-toggles > button.delete-sync', function () {
|
||||
var tr = $(this).parents('tr');
|
||||
var row = sync_table.row(tr);
|
||||
var rowData = row.data();
|
||||
|
||||
var index_delete = syncs_to_delete.findIndex(x => x.client_id == rowData['client_id'] && x.sync_id == rowData['sync_id']);
|
||||
var index_delete = syncs_to_delete.findIndex(function (x) {
|
||||
return x.client_id === rowData['client_id'] && x.sync_id === rowData['sync_id'];
|
||||
});
|
||||
|
||||
if (index_delete === -1) {
|
||||
syncs_to_delete.push({ client_id: rowData['client_id'], sync_id: rowData['sync_id'] });
|
||||
|
@@ -45,7 +45,6 @@ users_list_table_options = {
|
||||
$(td).html('<div class="edit-user-toggles">' +
|
||||
'<button class="btn btn-xs btn-warning delete-user" data-id="' + rowData['user_id'] + '" data-toggle="button"><i class="fa fa-trash-o fa-fw"></i> Delete</button> ' +
|
||||
'<button class="btn btn-xs btn-warning purge-user" data-id="' + rowData['user_id'] + '" data-toggle="button"><i class="fa fa-eraser fa-fw"></i> Purge</button>   ' +
|
||||
'<input type="checkbox" id="do_notify-' + rowData['user_id'] + '" name="do_notify" value="1" ' + rowData['do_notify'] + '><label class="edit-tooltip" for="do_notify-' + rowData['user_id'] + '" data-toggle="tooltip" title="Toggle Notifications"><i class="fa fa-bell fa-lg fa-fw"></i></label> ' +
|
||||
'<input type="checkbox" id="keep_history-' + rowData['user_id'] + '" name="keep_history" value="1" ' + rowData['keep_history'] + '><label class="edit-tooltip" for="keep_history-' + rowData['user_id'] + '" data-toggle="tooltip" title="Toggle History"><i class="fa fa-history fa-lg fa-fw"></i></label> ' +
|
||||
'<input type="checkbox" id="allow_guest-' + rowData['user_id'] + '" name="allow_guest" value="1" ' + rowData['allow_guest'] + '><label class="edit-tooltip" for="allow_guest-' + rowData['user_id'] + '" data-toggle="tooltip" title="Toggle Guest Access"><i class="fa fa-unlock-alt fa-lg fa-fw"></i></label> ' +
|
||||
'</div>');
|
||||
@@ -284,12 +283,8 @@ $('#users_list_table').on('change', 'td.edit-control > .edit-user-toggles > inpu
|
||||
var row = users_list_table.row(tr);
|
||||
var rowData = row.data();
|
||||
|
||||
var do_notify = 0;
|
||||
var keep_history = 0;
|
||||
var allow_guest = 0;
|
||||
if ($('#do_notify-' + rowData['user_id']).is(':checked')) {
|
||||
do_notify = 1;
|
||||
}
|
||||
if ($('#keep_history-' + rowData['user_id']).is(':checked')) {
|
||||
keep_history = 1;
|
||||
}
|
||||
@@ -304,7 +299,6 @@ $('#users_list_table').on('change', 'td.edit-control > .edit-user-toggles > inpu
|
||||
data: {
|
||||
user_id: rowData['user_id'],
|
||||
friendly_name: friendly_name,
|
||||
do_notify: do_notify,
|
||||
keep_history: keep_history,
|
||||
allow_guest: allow_guest,
|
||||
thumb: rowData['user_thumb']
|
||||
|
@@ -41,17 +41,15 @@
|
||||
</div>
|
||||
<div class="row">
|
||||
<div class="col-sm-6 col-sm-offset-3">
|
||||
<form action="${http_root}auth/login" method="post">
|
||||
% if msg:
|
||||
<div class="alert alert-danger" style="text-align: center; padding: 8px;">
|
||||
${msg}
|
||||
<form id="login-form">
|
||||
<div id="incorrect-login" class="alert alert-danger" style="text-align: center; padding: 8px; display: none;">
|
||||
Incorrect username or password.
|
||||
</div>
|
||||
% endif
|
||||
<div class="form-group">
|
||||
<label for="username" class="control-label">
|
||||
Username
|
||||
</label>
|
||||
<input type="text" id="username" name="username" class="form-control" autocorrect="off" autocapitalize="off" value="${username}" autofocus>
|
||||
<input type="text" id="username" name="username" class="form-control" autocorrect="off" autocapitalize="off" autofocus>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label for="password" class="control-label">
|
||||
@@ -65,7 +63,7 @@
|
||||
<input type="checkbox" id="remember_me" name="remember_me" title="for 30 days" value="1" checked="checked" /> Remember me
|
||||
</label>
|
||||
</div>
|
||||
<button type="submit" class="btn btn-bright login-button"><i class="fa fa-sign-in"></i> Sign In</button>
|
||||
<button id="sign-in" type="submit" class="btn btn-bright login-button"><i class="fa fa-sign-in"></i> Sign In</button>
|
||||
</div>
|
||||
</form>
|
||||
</div>
|
||||
@@ -75,5 +73,30 @@
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<script src="${http_root}js/jquery-2.1.4.min.js"></script>
|
||||
<script>
|
||||
$('#login-form').submit(function(event) {
|
||||
event.preventDefault();
|
||||
$('#sign-in').prop('disabled', true).html('<i class="fa fa-refresh fa-spin"></i> Sign In');
|
||||
$.ajax({
|
||||
url: '${http_root}auth/signin',
|
||||
type: 'POST',
|
||||
data: $(this).serialize(),
|
||||
dataType: 'json',
|
||||
statusCode: {
|
||||
200: function() {
|
||||
window.location = "${http_root}";
|
||||
},
|
||||
401: function() {
|
||||
$('#incorrect-login').show();
|
||||
$('#username').focus();
|
||||
}
|
||||
},
|
||||
complete: function() {
|
||||
$('#sign-in').prop('disabled', false).html('<i class="fa fa-sign-in"></i> Sign In');
|
||||
}
|
||||
});
|
||||
});
|
||||
</script>
|
||||
</body>
|
||||
</html>
|
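The login form above now posts to auth/signin via AJAX and branches on the 200/401 status codes instead of reloading the page. A hedged sketch of exercising that contract from Python (base URL and credentials are placeholders; only the field names and status codes visible in the template are taken as given):

# Sketch only: the 200/401 contract comes from the statusCode handlers above.
import requests

session = requests.Session()
resp = session.post(
    'http://localhost:8181/auth/signin',
    data={'username': 'admin', 'password': 'secret', 'remember_me': '1'},
)
if resp.status_code == 200:
    print('signed in, session cookie stored')
elif resp.status_code == 401:
    print('incorrect username or password')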
@@ -385,8 +385,9 @@
|
||||
|
||||
$("#clear-logs").click(function () {
|
||||
var logfile = $(".tab-pane.active").data('logfile')
|
||||
var title = $("#log_tabs li.active a").text()
|
||||
|
||||
$("#confirm-message").text("Are you sure you want to clear the Tautulli logs?");
|
||||
$("#confirm-message").text("Are you sure you want to clear the " + title + "?");
|
||||
$('#confirm-modal').modal();
|
||||
$('#confirm-modal').one('click', '#confirm-button', function () {
|
||||
$.ajax({
|
||||
@@ -421,7 +422,7 @@
|
||||
});
|
||||
|
||||
$("#clear-notify-logs").click(function () {
|
||||
$("#confirm-message").text("Are you sure you want to clear the Tautulli notification logs?");
|
||||
$("#confirm-message").text("Are you sure you want to clear the Tautulli Notification Logs?");
|
||||
$('#confirm-modal').modal();
|
||||
$('#confirm-modal').one('click', '#confirm-button', function () {
|
||||
$.ajax({
|
||||
@@ -442,7 +443,7 @@
|
||||
});
|
||||
|
||||
$("#clear-login-logs").click(function () {
|
||||
$("#confirm-message").text("Are you sure you want to clear the Tautulli login logs?");
|
||||
$("#confirm-message").text("Are you sure you want to clear the Tautulli Login Logs?");
|
||||
$('#confirm-modal').modal();
|
||||
$('#confirm-modal').one('click', '#confirm-button', function () {
|
||||
$.ajax({
|
||||
|
@@ -1,6 +1,10 @@
|
||||
<%!
|
||||
from plexpy import helpers, notifiers
|
||||
import json
|
||||
from plexpy import helpers, notifiers, users
|
||||
available_notification_actions = notifiers.available_notification_actions()
|
||||
|
||||
user_emails = [{'user': u['friendly_name'] or u['username'], 'email': u['email']} for u in users.Users().get_users() if u['email']]
|
||||
sorted(user_emails, key=lambda u: u['user'])
|
||||
%>
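One note on the new module-level block: sorted() returns a new list and leaves user_emails itself in its original order, so the bare sorted(...) call has no effect on the options rendered later. An in-place sort would look like this (illustrative only, not what the template currently does):

# In-place alternative to the discarded sorted(...) call above.
user_emails.sort(key=lambda u: u['user'])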
|
||||
% if notifier:
|
||||
<link href="${http_root}css/selectize.bootstrap3.css" rel="stylesheet" />
|
||||
@@ -39,7 +43,7 @@
|
||||
<div class="form-group">
|
||||
<label for="${item['name']}">${item['label']}</label>
|
||||
<div class="row">
|
||||
<div class="col-md-8">
|
||||
<div class="col-md-12">
|
||||
<input type="${item['input_type']}" class="form-control" id="${item['name']}" name="${item['name']}" value="${item['value']}" size="30" ${'readonly' if item.get('readonly') else ''}>
|
||||
% if item['name'] == 'osx_notify_app':
|
||||
<a href="javascript:void(0)" id="osxnotifyregister">Register</a>
|
||||
@@ -62,7 +66,7 @@
|
||||
<div class="form-group">
|
||||
<label for="${item['name']}">${item['label']}</label>
|
||||
<div class="row">
|
||||
<div class="col-md-8">
|
||||
<div class="col-md-12">
|
||||
<input type="button" class="btn btn-bright" id="${item['name']}" name="${item['name']}" value="${item['value']}">
|
||||
</div>
|
||||
</div>
|
||||
@@ -80,7 +84,7 @@
|
||||
<div class="form-group">
|
||||
<label for="${item['name']}">${item['label']}</label>
|
||||
<div class="row">
|
||||
<div class="col-md-8">
|
||||
<div class="col-md-12">
|
||||
<select class="form-control" id="${item['name']}" name="${item['name']}">
|
||||
% for key, value in sorted(item['select_options'].iteritems()):
|
||||
% if key == item['value']:
|
||||
@@ -94,6 +98,33 @@
|
||||
</div>
|
||||
<p class="help-block">${item['description'] | n}</p>
|
||||
</div>
|
||||
% elif item['input_type'] == 'selectize':
|
||||
<div class="form-group">
|
||||
<label for="${item['name']}">${item['label']}</label>
|
||||
<div class="row">
|
||||
<div class="col-md-12">
|
||||
<select class="form-control" id="${item['name']}" name="${item['name']}">
|
||||
<option value="select-all">Select All</option>
|
||||
<option value="remove-all">Remove All</option>
|
||||
% if isinstance(item['select_options'], dict):
|
||||
% for section, options in item['select_options'].iteritems():
|
||||
<optgroup label="${section}">
|
||||
% for option in sorted(options, key=lambda x: x['text'].lower()):
|
||||
<option value="${option['value']}">${option['text']}</option>
|
||||
% endfor
|
||||
</optgroup>
|
||||
% endfor
|
||||
% else:
|
||||
<option value="border-all"></option>
|
||||
% for option in sorted(item['select_options'], key=lambda x: x['text'].lower()):
|
||||
<option value="${option['value']}">${option['text']}</option>
|
||||
% endfor
|
||||
% endif
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<p class="help-block">${item['description'] | n}</p>
|
||||
</div>
|
||||
% endif
|
||||
% endfor
|
||||
</div>
|
||||
@@ -101,7 +132,7 @@
|
||||
<div class="form-group">
|
||||
<label for="friendly_name">Description</label>
|
||||
<div class="row">
|
||||
<div class="col-md-8">
|
||||
<div class="col-md-12">
|
||||
<input type="text" class="form-control" id="friendly_name" name="friendly_name" value="${notifier['friendly_name']}" size="30">
|
||||
</div>
|
||||
</div>
|
||||
@@ -132,12 +163,9 @@
|
||||
<div role="tabpanel" class="tab-pane" id="tabs-notify_conditions">
|
||||
<label>Notification Conditions</label>
|
||||
<p class="help-block">
|
||||
Add custom notification conditions.
|
||||
Add custom conditions to only <strong>allow certain notifications</strong>. By default, all notifications will be sent if there are no conditions.
|
||||
<a href="#notify-text-sub-modal" data-toggle="modal">Click here</a> for a description of all the parameters.
|
||||
</p>
|
||||
<p class="help-block">
|
||||
Note: Conditions are checked after the notification trigger and the notification will only be sent if the condition logic is satisfied.
|
||||
</p>
|
||||
<div id="condition-widget"></div>
|
||||
<input type="hidden" name="custom_conditions" id="custom_conditions" />
|
||||
|
||||
@@ -146,7 +174,8 @@
|
||||
<input type="text" class="form-control" name="custom_conditions_logic" id="custom_conditions_logic" value="${notifier['custom_conditions_logic']}" required />
|
||||
<div id="custom_conditions_logic_error" class="alert alert-danger" role="alert" style="padding-top: 5px; padding-bottom: 5px; margin: 0; display: none;"><i class="fa fa-exclamation-triangle" style="color: #a94442;"></i> <span></span></div>
|
||||
<p class="help-block">
|
||||
Enter the logic to use when evaluating the conditions (e.g. <span class="inline-pre">{1} and ({2} or {3})</span>).
|
||||
Optional: Enter custom logic to use when evaluating the conditions (e.g. <span class="inline-pre">{1} and ({2} or {3})</span>).
|
||||
Leave blank for implicit <span class="inline-pre">and</span> between all conditions.
|
||||
</p>
|
||||
<p class="help-block">
|
||||
Note: Only the keywords <span class="inline-pre">and</span>/<span class="inline-pre">or</span> and brackets <span class="inline-pre">()</span> are supported.
|
||||
@@ -187,7 +216,7 @@
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<div class="row">
|
||||
<div class="col-md-8">
|
||||
<div class="col-md-12">
|
||||
<input type="button" class="btn btn-bright notifier-text-preview" data-action="${action['name']}" value="Preview Arguments">
|
||||
</div>
|
||||
</div>
|
||||
@@ -214,7 +243,7 @@
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<div class="row">
|
||||
<div class="col-md-8">
|
||||
<div class="col-md-12">
|
||||
<input type="button" class="btn btn-bright notifier-text-preview" data-action="${action['name']}" value="Preview Text">
|
||||
</div>
|
||||
</div>
|
||||
@@ -280,7 +309,7 @@
|
||||
% endif
|
||||
<div class="form-group">
|
||||
<div class="row">
|
||||
<div class="col-md-8">
|
||||
<div class="col-md-12">
|
||||
<input type="button" class="btn btn-bright" id="test_notifier" name="test_notifier" value="Test ${notifier['agent_label']}">
|
||||
</div>
|
||||
</div>
|
||||
@@ -304,7 +333,7 @@
|
||||
$('#notifier-config-modal').unbind('hidden.bs.modal');
|
||||
|
||||
// Need this for setting conditions since conditions contain the character "
|
||||
$('#custom_conditions').val('${notifier['custom_conditions'] | n}')
|
||||
$('#custom_conditions').val(${json.dumps(notifier["custom_conditions"]) | n});
|
||||
|
||||
$('#condition-widget').filterer({
|
||||
parameters: ${parameters | n},
|
||||
@@ -312,7 +341,7 @@
|
||||
updateConditions: function(newConditions){
|
||||
$('#custom_conditions').val(JSON.stringify(newConditions));
|
||||
}
|
||||
})
|
||||
});
|
||||
|
||||
function reloadModal() {
|
||||
$.ajax({
|
||||
@@ -330,7 +359,7 @@
|
||||
if (jqXHR) {
|
||||
var result = $.parseJSON(jqXHR.responseText);
|
||||
var msg = result.message;
|
||||
if (result.result == 'success') {
|
||||
if (result.result === 'success') {
|
||||
showMsg('<i class="fa fa-check"></i> ' + msg, false, true, 5000)
|
||||
} else {
|
||||
showMsg('<i class="fa fa-times"></i> ' + msg, false, true, 5000, true)
|
||||
@@ -390,7 +419,7 @@
|
||||
|
||||
% if notifier['agent_name'] == 'facebook':
|
||||
function disableFacebookRequest() {
|
||||
if ($('#facebook_app_id').val() != '' && $('#facebook_app_secret').val() != '') { $('#facebook_facebookStep1').prop('disabled', false); }
|
||||
if ($('#facebook_app_id').val() !== '' && $('#facebook_app_secret').val() !== '') { $('#facebook_facebookStep1').prop('disabled', false); }
|
||||
else { $('#facebook_facebookStep1').prop('disabled', true); }
|
||||
}
|
||||
disableFacebookRequest();
|
||||
@@ -404,19 +433,20 @@
|
||||
$('#facebook_redirect_uri').val($('#facebook_redirect_uri').val().slice(0, -1));
|
||||
}
|
||||
|
||||
var facebook_token;
|
||||
$.ajax({
|
||||
url: 'facebookStep1',
|
||||
data: {
|
||||
app_id: $('#facebook_app_id').val(),
|
||||
app_secret: $('#facebook_app_secret').val(),
|
||||
redirect_uri: $('#facebook_redirect_uri').val(),
|
||||
redirect_uri: $('#facebook_redirect_uri').val()
|
||||
},
|
||||
cache: false,
|
||||
async: true,
|
||||
complete: function (xhr, status) {
|
||||
var result = $.parseJSON(xhr.responseText);
|
||||
var msg = result.msg;
|
||||
if (result.result == 'success') {
|
||||
if (result.result === 'success') {
|
||||
showMsg('<i class="fa fa-check"></i> ' + msg, false, true, 5000);
|
||||
window.open(result.url);
|
||||
|
||||
@@ -455,18 +485,101 @@
|
||||
|
||||
$('#notifier-config-modal').on('hidden.bs.modal', function () {
|
||||
facebook_token = false;
|
||||
})
|
||||
});
|
||||
|
||||
% elif notifier['agent_name'] == 'browser':
|
||||
$('#browser_allow_browser').click(function () {
|
||||
PNotify.desktop.permission();
|
||||
})
|
||||
});
|
||||
|
||||
% elif notifier['agent_name'] == 'osx':
|
||||
$('#osxnotifyregister').click(function () {
|
||||
var osx_notify_app = $('#osx_notify_app').val();
|
||||
$.get('osxnotifyregister', { 'app': osx_notify_app }, function (data) { showMsg('<i class="fa fa-check"></i> ' + data, false, true, 3000); });
|
||||
})
|
||||
});
|
||||
|
||||
% elif notifier['agent_name'] == 'email':
|
||||
var REGEX_EMAIL = '([a-z0-9!#$%&\'*+/=?^_`{|}~-]+(?:\.[a-z0-9!#$%&\'*+/=?^_`{|}~-]+)*@' +
|
||||
'(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\.)+[a-z0-9](?:[a-z0-9-]*[a-z0-9])?)';
|
||||
var $email_selectors = $('#email_to, #email_cc, #email_bcc').selectize({
|
||||
plugins: ['remove_button'],
|
||||
persist: false,
|
||||
maxItems: null,
|
||||
render: {
|
||||
item: function(item, escape) {
|
||||
return '<div>' +
|
||||
(item.text ? '<span class="item-text">' + escape(item.text) + '</span>' : '') +
|
||||
(item.value ? '<span class="item-value">' + escape(item.value) + '</span>' : '') +
|
||||
'</div>';
|
||||
},
|
||||
option: function(item, escape) {
|
||||
var label = item.text || item.value;
|
||||
var caption = item.text ? item.value : null;
|
||||
if (item.value.endsWith('-all')) {
|
||||
return '<div class="' + item.value + '">' + escape(label) + '</div>'
|
||||
}
|
||||
return '<div>' +
|
||||
escape(label) +
|
||||
(caption ? '<span class="caption">' + escape(caption) + '</span>' : '') +
|
||||
'</div>';
|
||||
}
|
||||
},
|
||||
onItemAdd: function(value) {
|
||||
if (value === 'select-all') {
|
||||
var all_keys = $.map(this.options, function(option){
|
||||
return option.value.endsWith('-all') ? null : option.value;
|
||||
});
|
||||
this.setValue(all_keys);
|
||||
} else if (value === 'remove-all') {
|
||||
this.clear();
|
||||
this.refreshOptions();
|
||||
this.positionDropdown();
|
||||
}
|
||||
},
|
||||
createFilter: function(input) {
|
||||
var match, regex;
|
||||
|
||||
// email@address.com
|
||||
regex = new RegExp('^' + REGEX_EMAIL + '$', 'i');
|
||||
match = input.match(regex);
|
||||
if (match) return !this.options.hasOwnProperty(match[0]);
|
||||
|
||||
// user <email@address.com>
|
||||
regex = new RegExp('^([^<]*)\<' + REGEX_EMAIL + '\>$', 'i');
|
||||
match = input.match(regex);
|
||||
if (match) return !this.options.hasOwnProperty(match[2]);
|
||||
|
||||
return false;
|
||||
},
|
||||
create: function(input) {
|
||||
if ((new RegExp('^' + REGEX_EMAIL + '$', 'i')).test(input)) {
|
||||
return {value: input};
|
||||
}
|
||||
var match = input.match(new RegExp('^([^<]*)\<' + REGEX_EMAIL + '\>$', 'i'));
|
||||
if (match) {
|
||||
return {
|
||||
value : match[2],
|
||||
text : $.trim(match[1])
|
||||
};
|
||||
}
|
||||
return false;
|
||||
}
|
||||
});
|
||||
var email_to = $email_selectors[0].selectize;
|
||||
var email_cc = $email_selectors[1].selectize;
|
||||
var email_bcc = $email_selectors[2].selectize;
|
||||
email_to.setValue(${json.dumps(next((c['value'] for c in notifier['config_options'] if c['name'] == 'email_to'), [])) | n});
|
||||
email_cc.setValue(${json.dumps(next((c['value'] for c in notifier['config_options'] if c['name'] == 'email_cc'), [])) | n});
|
||||
email_bcc.setValue(${json.dumps(next((c['value'] for c in notifier['config_options'] if c['name'] == 'email_bcc'), [])) | n});
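The createFilter/create callbacks above accept either a bare address or a "Name <email@address>" form. A rough Python version of the same parsing (the regex below is deliberately simplified and is not the REGEX_EMAIL pattern from the script):

# Simplified re-implementation for illustration only.
import re

EMAIL = r"[^@<>\s]+@[^@<>\s]+\.[^@<>\s]+"

def parse_recipient(value):
    value = value.strip()
    match = re.match(r"^([^<]*)<(" + EMAIL + r")>$", value)
    if match:
        return {'value': match.group(2), 'text': match.group(1).strip()}
    if re.match("^" + EMAIL + "$", value):
        return {'value': value}
    return None

print(parse_recipient('User Name <user@example.com>'))
# {'value': 'user@example.com', 'text': 'User Name'}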
|
||||
|
||||
% elif notifier['agent_name'] == 'join':
|
||||
var $join_device_names = $('#join_device_names').selectize({
|
||||
plugins: ['remove_button'],
|
||||
maxItems: null,
|
||||
create: true
|
||||
});
|
||||
var join_device_names = $join_device_names[0].selectize;
|
||||
join_device_names.setValue(${json.dumps(next((c['value'] for c in notifier['config_options'] if c['name'] == 'join_device_names'), [])) | n});
|
||||
% endif
|
||||
|
||||
function validateLogic() {
|
||||
@@ -597,7 +710,7 @@
|
||||
});
|
||||
|
||||
function sendTestNotification() {
|
||||
if ('${notifier["agent_name"]}' != 'browser') {
|
||||
if ('${notifier["agent_name"]}' !== 'browser') {
|
||||
$.ajax({
|
||||
url: 'send_notification',
|
||||
data: {
|
||||
@@ -621,7 +734,7 @@
|
||||
}
|
||||
});
|
||||
} else {
|
||||
if ($('#browser_auto_hide_delay').val() == "0") {
|
||||
if ($('#browser_auto_hide_delay').val() === "0") {
|
||||
PNotify.prototype.options.hide = false;
|
||||
} else {
|
||||
PNotify.prototype.options.hide = true;
|
||||
|
@@ -63,7 +63,7 @@ DOCUMENTATION :: END
|
||||
<h3 class="text-muted"> </h3>
|
||||
</div>
|
||||
% elif item['media_type'] == 'show':
|
||||
<a href="info?rating_key=${item['rating_key']}" title="${item['parent_title']}">
|
||||
<a href="info?rating_key=${item['rating_key']}" title="${item['title']}">
|
||||
<div class="dashboard-recent-media-poster">
|
||||
<div class="dashboard-recent-media-poster-face" style="background-image: url(pms_image_proxy?img=${item['thumb']}&width=300&height=450&fallback=poster);">
|
||||
<div class="dashboard-recent-media-overlay">
|
||||
|
@@ -42,7 +42,7 @@ DOCUMENTATION :: END
|
||||
<td>${arrow.get(next_run_interval).format('HH:mm:ss')}</td>
|
||||
<td>${arrow.get(sched_job.next_run_time).format('YYYY-MM-DD HH:mm:ss')}</td>
|
||||
</tr>
|
||||
% elif job in ('Check for active sessions', 'Check for recently added items') and plexpy.WS_CONNECTED:
|
||||
% elif job in ('Check for server response', 'Check for active sessions', 'Check for recently added items') and plexpy.WS_CONNECTED:
|
||||
<tr>
|
||||
<td>${job}</td>
|
||||
<td><i class="fa fa-sm fa-fw fa-check"></i> Websocket</td>
|
||||
|
@@ -13,6 +13,8 @@
|
||||
</%def>
|
||||
|
||||
<%def name="headerIncludes()">
|
||||
<link href="${http_root}css/selectize.bootstrap3.css" rel="stylesheet" />
|
||||
<link href="${http_root}css/selectize.min.css" rel="stylesheet" />
|
||||
</%def>
|
||||
|
||||
<%def name="body()">
|
||||
@@ -470,6 +472,13 @@
|
||||
</div>
|
||||
<input type="text" id="http_hashed_password" name="http_hashed_password" value="${config['http_hashed_password']}" style="display: none;" data-parsley-trigger="change" data-parsley-type="integer" data-parsley-range="[0, 1]"
|
||||
data-parsley-errors-container="#http_hash_password_error" data-parsley-error-message="Cannot un-hash password, please set a new password." data-parsley-no-focus required>
|
||||
<div class="checkbox">
|
||||
<label>
|
||||
<input type="checkbox" class="auth-settings" name="http_plex_admin" id="http_plex_admin" value="1" ${config['http_plex_admin']} data-parsley-trigger="change"> Allow Plex Admin
|
||||
</label>
|
||||
<span id="allowPlexCheck" style="color: #eb8600; padding-left: 10px;"></span>
|
||||
<p class="help-block">Allow the Plex server admin to login as a Tautulli admin using their Plex.tv account.</p>
|
||||
</div>
|
||||
<div class="checkbox">
|
||||
<label>
|
||||
<input type="checkbox" class="auth-settings" name="http_basic_auth" id="http_basic_auth" value="1" ${config['http_basic_auth']} data-parsley-trigger="change"> Use Basic Authentication
|
||||
@@ -477,6 +486,7 @@
|
||||
<p class="help-block">Use basic HTTP authentication instead of the HTML login form.</p>
|
||||
</div>
|
||||
|
||||
<input type="checkbox" name="auth_changed" id="auth_changed" value="1" style="display: none;">
|
||||
|
||||
<div class="padded-header">
|
||||
<h3>Guest Access</h3>
|
||||
@@ -543,9 +553,10 @@
|
||||
<div id="pms_update_options">
|
||||
<div class="form-group">
|
||||
<div class="row">
|
||||
<div class="col-md-2">
|
||||
<div class="col-md-3">
|
||||
<label for="pms_update_channel">Update Channel</label>
|
||||
<select class="form-control" id="pms_update_channel" name="pms_update_channel">
|
||||
<option value="plex">Use Server Setting</option>
|
||||
<option value="public">Public</option>
|
||||
</select>
|
||||
</div>
|
||||
@@ -918,7 +929,7 @@
|
||||
|
||||
<div class="checkbox">
|
||||
<label>
|
||||
<input type="checkbox" id="get_file_sizes" name="get_file_sizes" value="1" ${config['get_file_sizes']}> Calculate Total File Sizes <span style="color: #eb8600; padding-left: 10px;">[experimental]</span>
|
||||
<input type="checkbox" id="get_file_sizes" name="get_file_sizes" value="1" ${config['get_file_sizes']}> Calculate Total File Sizes
|
||||
</label>
|
||||
<p class="help-block">Enable if you want Tautulli to calculate the total file size for TV Shows/Seasons and Artists/Albums on the media info tables.</p>
|
||||
</div>
|
||||
@@ -1500,6 +1511,7 @@
|
||||
<%def name="javascriptIncludes()">
|
||||
<script src="${http_root}js/parsley.min.js"></script>
|
||||
<script src="${http_root}js/Sortable.min.js"></script>
|
||||
<script src="${http_root}js/selectize.min.js"></script>
|
||||
<script src="${http_root}js/moment-with-locale.js"></script>
|
||||
<script src="${http_root}js/jquery.qrcode.min.js"></script>
|
||||
<script>
|
||||
@@ -1774,6 +1786,7 @@ $(document).ready(function() {
|
||||
|
||||
$( ".auth-settings" ).change(function() {
|
||||
authChanged = true;
|
||||
$("#auth_changed").prop('checked', true);
|
||||
});
|
||||
|
||||
$( ".directory-settings" ).change(function() {
|
||||
@@ -2013,6 +2026,26 @@ $(document).ready(function() {
|
||||
}
|
||||
});
|
||||
|
||||
function allowPlexAdminCheck () {
|
||||
if ($("#http_basic_auth").is(":checked")) {
|
||||
$("#http_plex_admin").attr("disabled", true);
|
||||
$("#http_plex_admin").attr("checked", false);
|
||||
$("#allowPlexCheck").html("Plex admin login cannot be enabled with basic authentication.");
|
||||
} else if ($('#http_username').val() == '' || $('#http_password').val() == '') {
|
||||
$("#http_plex_admin").attr("disabled", true);
|
||||
$("#http_plex_admin").attr("checked", false);
|
||||
$("#allowPlexCheck").html("You must set an admin username and password above to allow Plex admin login.");
|
||||
} else {
|
||||
$("#http_plex_admin").attr("disabled", false);
|
||||
$("#allowPlexCheck").html("");
|
||||
}
|
||||
}
|
||||
allowPlexAdminCheck();
|
||||
|
||||
$('#http_username, #http_password, #http_basic_auth').change(function () {
|
||||
allowPlexAdminCheck();
|
||||
});
|
||||
|
||||
function allowGuestAccessCheck () {
|
||||
if ($("#http_basic_auth").is(":checked")) {
|
||||
$("#allow_guest_access").attr("disabled", true);
|
||||
@@ -2021,7 +2054,7 @@ $(document).ready(function() {
|
||||
} else if ($('#http_username').val() == '' || $('#http_password').val() == '') {
|
||||
$("#allow_guest_access").attr("disabled", true);
|
||||
$("#allow_guest_access").attr("checked", false);
|
||||
$("#allowGuestCheck").html("You must set an admin password above to allow guest access.");
|
||||
$("#allowGuestCheck").html("You must set an admin username and password above to allow guest access.");
|
||||
} else {
|
||||
$("#allow_guest_access").attr("disabled", false);
|
||||
$("#allowGuestCheck").html("");
|
||||
@@ -2072,32 +2105,41 @@ $(document).ready(function() {
|
||||
var update_channel = update_params.pms_update_channel;
|
||||
var update_distro = update_params.pms_update_distro;
|
||||
var update_distro_build = update_params.pms_update_distro_build;
|
||||
var plex_update_channel = update_params.plex_update_channel;
|
||||
|
||||
$("#pms_update_channel option[value='plexpass']").remove();
|
||||
$('#pms_update_channel option[value=beta]').remove();
|
||||
if (plexpass) {
|
||||
var selected = (update_channel == 'plexpass') ? true : false;
|
||||
var selected = (update_channel == 'beta') ? true : false;
|
||||
$('#pms_update_channel')
|
||||
.append($('<option></option>')
|
||||
.text('Plex Pass')
|
||||
.val('plexpass')
|
||||
.text('Beta')
|
||||
.val('beta')
|
||||
.prop('selected', selected));
|
||||
}
|
||||
|
||||
$.getJSON('https://plex.tv/api/downloads/1.json?channel=' + update_channel, function (downloads) {
|
||||
platform_downloads = downloads.computer[platform] || downloads.nas[platform];
|
||||
$.ajax({
|
||||
url: 'https://plex.tv/api/downloads/1.json?channel=' + plex_update_channel,
|
||||
type: 'GET',
|
||||
dataType: 'json',
|
||||
beforeSend: function (xhr) {
|
||||
xhr.setRequestHeader('X-Plex-Token', $('#pms_token').val());
|
||||
},
|
||||
success: function (downloads) {
|
||||
var platform_downloads = downloads.computer[platform] || downloads.nas[platform];
|
||||
if (platform_downloads) {
|
||||
$("#pms_update_distro_build option").remove();
|
||||
$.each(platform_downloads.releases, function (index, item) {
|
||||
var label = (platform_downloads.releases.length == 1) ? platform_downloads.name : platform_downloads.name + ' - ' + item.label;
|
||||
var selected = (item.distro == update_distro && item.build == update_distro_build) ? true : false;
|
||||
var label = (platform_downloads.releases.length === 1) ? platform_downloads.name : platform_downloads.name + ' - ' + item.label;
|
||||
var selected = (item.distro === update_distro && item.build === update_distro_build) ? true : false;
|
||||
$('#pms_update_distro_build')
|
||||
.append($('<option></option>')
|
||||
.text(label)
|
||||
.val(item.build)
|
||||
.attr('data-distro', item.distro)
|
||||
.prop('selected', selected));
|
||||
})
|
||||
$('#pms_update_distro').val($("#pms_update_distro_build option:selected").data('distro'))
|
||||
});
|
||||
$('#pms_update_distro').val($('#pms_update_distro_build option:selected').data('distro'))
|
||||
}
|
||||
}
|
||||
});
|
||||
});
|
||||
|
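The settings page above now queries the plex.tv downloads feed itself, passing the selected update channel and the server's token in the X-Plex-Token header. The equivalent request from Python would look roughly like this (URL, channel parameter, and header are taken from the script; the response layout beyond the computer/nas keys used above is not shown in the diff):

# Sketch of the same lookup performed client-side above.
import requests

def get_plex_downloads(channel, plex_token):
    resp = requests.get(
        'https://plex.tv/api/downloads/1.json',
        params={'channel': channel},
        headers={'X-Plex-Token': plex_token},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

downloads = get_plex_downloads('public', 'xxxxxxxxxxxx')  # token is a placeholder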
@@ -39,7 +39,7 @@ DOCUMENTATION :: END
|
||||
|
||||
% if data:
|
||||
<%
|
||||
import plexpy
|
||||
from plexpy.common import VIDEO_RESOLUTION_OVERRIDES, AUDIO_CODEC_OVERRIDES
|
||||
%>
|
||||
<div class="modal-dialog" role="document">
|
||||
<div class="modal-content">
|
||||
@@ -54,6 +54,11 @@ DOCUMENTATION :: END
|
||||
</h4>
|
||||
</div>
|
||||
<div class="modal-body">
|
||||
% if data['current_session']:
|
||||
<div class="col-sm-12 text-muted stream-info-current">
|
||||
<i class="fa fa-exclamation-circle"></i> Current session. Updated stream details below may be delayed.
|
||||
</div>
|
||||
% endif
|
||||
<table class="stream-info" style="margin-top: 0;">
|
||||
<thead>
|
||||
<tr>
|
||||
@@ -85,8 +90,8 @@ DOCUMENTATION :: END
|
||||
% if data['media_type'] != 'track':
|
||||
<tr>
|
||||
<td>Resolution</td>
|
||||
<td>${plexpy.common.VIDEO_RESOLUTION_OVERRIDES.get(data['stream_video_resolution'], data['stream_video_resolution'])}</td>
|
||||
<td>${plexpy.common.VIDEO_RESOLUTION_OVERRIDES.get(data['video_resolution'], data['video_resolution'])}</td>
|
||||
<td>${VIDEO_RESOLUTION_OVERRIDES.get(data['stream_video_resolution'], data['stream_video_resolution'])}</td>
|
||||
<td>${VIDEO_RESOLUTION_OVERRIDES.get(data['video_resolution'], data['video_resolution'])}</td>
|
||||
</tr>
|
||||
% endif
|
||||
<tr>
|
||||
@@ -124,8 +129,8 @@ DOCUMENTATION :: END
|
||||
<tbody>
|
||||
<tr>
|
||||
<td>Container</td>
|
||||
<td>${data['stream_container']}</td>
|
||||
<td>${data['container']}</td>
|
||||
<td>${data['stream_container'].upper()}</td>
|
||||
<td>${data['container'].upper()}</td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
@@ -144,8 +149,8 @@ DOCUMENTATION :: END
|
||||
<tbody>
|
||||
<tr>
|
||||
<td>Codec</td>
|
||||
<td>${data['stream_video_codec']}</td>
|
||||
<td>${data['video_codec']}</td>
|
||||
<td>${data['stream_video_codec'].upper()} ${'(HW)' if data['transcode_hw_encoding'] else ''}</td>
|
||||
<td>${data['video_codec'].upper()} ${'(HW)' if data['transcode_hw_decoding'] else ''}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Bitrate</td>
|
||||
@@ -189,8 +194,8 @@ DOCUMENTATION :: END
|
||||
<tbody>
|
||||
<tr>
|
||||
<td>Codec</td>
|
||||
<td>${data['stream_audio_codec']}</td>
|
||||
<td>${data['audio_codec']}</td>
|
||||
<td>${AUDIO_CODEC_OVERRIDES.get(data['stream_audio_codec'], data['stream_audio_codec'].upper())}</td>
|
||||
<td>${AUDIO_CODEC_OVERRIDES.get(data['audio_codec'], data['audio_codec'].upper())}</td>
|
||||
</tr>
|
||||
<tr>
|
||||
<td>Bitrate</td>
|
||||
@@ -219,8 +224,8 @@ DOCUMENTATION :: END
|
||||
<tbody>
|
||||
<tr>
|
||||
<td>Codec</td>
|
||||
<td>${data['stream_subtitle_codec']}</td>
|
||||
<td>${data['subtitle_codec']}</td>
|
||||
<td>${data['stream_subtitle_codec'].upper()}</td>
|
||||
<td>${data['subtitle_codec'].upper()}</td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
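The stream info hunks above swap the fully qualified plexpy.common lookups for direct imports and add friendlier formatting (upper-casing plus override maps). The underlying pattern is a plain dict lookup with a fall-through default; a small illustrative sketch (the map entries are invented, the real maps are not part of this diff):

# Entries below are made up for the example; the real override maps live in
# plexpy.common and are not shown here.
VIDEO_RESOLUTION_OVERRIDES = {'sd': 'SD', '4k': '4k'}

def display_value(value, overrides):
    # Use the friendly override when one exists, otherwise show the raw value.
    return overrides.get(value, value)

print(display_value('sd', VIDEO_RESOLUTION_OVERRIDES))   # 'SD'
print(display_value('720', VIDEO_RESOLUTION_OVERRIDES))  # '720' (no override)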
|
||||
|
@@ -27,6 +27,16 @@
|
||||
</button> 
|
||||
</div>
|
||||
% endif
|
||||
% if _session['user_group'] == 'admin':
|
||||
<div class="btn-group" id="user-selection">
|
||||
<label>
|
||||
<select name="sync-user" id="sync-user" class="btn" style="color: inherit;">
|
||||
<option value="">All Users</option>
|
||||
<option disabled>────────────</option>
|
||||
</select>
|
||||
</label>
|
||||
</div>
|
||||
% endif
|
||||
<div class="btn-group">
|
||||
<button class="btn btn-dark refresh-syncs-button" id="refresh-syncs-list"><i class="fa fa-refresh"></i> Refresh synced items</button>
|
||||
</div>
|
||||
@@ -87,18 +97,46 @@
|
||||
<script src="${http_root}js/tables/sync_table.js${cache_param}"></script>
|
||||
<script>
|
||||
$(document).ready(function() {
|
||||
// Load user ids and names (for the selector)
|
||||
$.ajax({
|
||||
url: 'get_user_names',
|
||||
type: 'get',
|
||||
dataType: 'json',
|
||||
success: function (data) {
|
||||
var select = $('#sync-user');
|
||||
data.sort(function (a, b) {
|
||||
return a.friendly_name.localeCompare(b.friendly_name);
|
||||
});
|
||||
data.forEach(function (item) {
|
||||
select.append('<option value="' + item.user_id + '">' +
|
||||
item.friendly_name + '</option>');
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
function loadSyncTable(selected_user_id) {
|
||||
sync_table_options.ajax = {
|
||||
url: 'get_sync',
|
||||
data: function (d) {
|
||||
d.user_id = "${_session['user_id']}" == "None" ? null : "${_session['user_id']}"
|
||||
}
|
||||
}
|
||||
url: 'get_sync?user_id=' + selected_user_id
|
||||
};
|
||||
sync_table = $('#sync_table').DataTable(sync_table_options);
|
||||
var colvis = new $.fn.dataTable.ColVis( sync_table, { buttonText: '<i class="fa fa-columns"></i> Select columns', buttonClass: 'btn btn-dark', exclude: [0] } );
|
||||
$( colvis.button() ).appendTo('div.colvis-button-bar');
|
||||
var colvis = new $.fn.dataTable.ColVis(sync_table, {
|
||||
buttonText: '<i class="fa fa-columns"></i> Select columns',
|
||||
buttonClass: 'btn btn-dark',
|
||||
exclude: [0]
|
||||
});
|
||||
$(colvis.button()).appendTo('div.colvis-button-bar');
|
||||
|
||||
clearSearchButton('sync_table', sync_table);
|
||||
|
||||
$('#sync-user').on('change', function () {
|
||||
selected_user_id = $(this).val() || null;
|
||||
sync_table.ajax.url('get_sync?user_id=' + selected_user_id).load();
|
||||
});
|
||||
}
|
||||
|
||||
var selected_user_id = "${_session['user_id']}" == "None" ? null : "${_session['user_id']}";
|
||||
loadSyncTable(selected_user_id);
|
||||
|
||||
% if _session['user_group'] == 'admin':
|
||||
$('#row-edit-mode').on('click', function() {
|
||||
$('#row-edit-mode-alert').fadeIn(200);
|
||||
|
lib/idna/__init__.py (new file, 2 lines)
@@ -0,0 +1,2 @@
from .package_data import __version__
from .core import *
lib/idna/codec.py (new file, 118 lines)
@@ -0,0 +1,118 @@
|
||||
from .core import encode, decode, alabel, ulabel, IDNAError
|
||||
import codecs
|
||||
import re
|
||||
|
||||
_unicode_dots_re = re.compile(u'[\u002e\u3002\uff0e\uff61]')
|
||||
|
||||
class Codec(codecs.Codec):
|
||||
|
||||
def encode(self, data, errors='strict'):
|
||||
|
||||
if errors != 'strict':
|
||||
raise IDNAError("Unsupported error handling \"{0}\"".format(errors))
|
||||
|
||||
if not data:
|
||||
return "", 0
|
||||
|
||||
return encode(data), len(data)
|
||||
|
||||
def decode(self, data, errors='strict'):
|
||||
|
||||
if errors != 'strict':
|
||||
raise IDNAError("Unsupported error handling \"{0}\"".format(errors))
|
||||
|
||||
if not data:
|
||||
return u"", 0
|
||||
|
||||
return decode(data), len(data)
|
||||
|
||||
class IncrementalEncoder(codecs.BufferedIncrementalEncoder):
|
||||
def _buffer_encode(self, data, errors, final):
|
||||
if errors != 'strict':
|
||||
raise IDNAError("Unsupported error handling \"{0}\"".format(errors))
|
||||
|
||||
if not data:
|
||||
return ("", 0)
|
||||
|
||||
labels = _unicode_dots_re.split(data)
|
||||
trailing_dot = u''
|
||||
if labels:
|
||||
if not labels[-1]:
|
||||
trailing_dot = '.'
|
||||
del labels[-1]
|
||||
elif not final:
|
||||
# Keep potentially unfinished label until the next call
|
||||
del labels[-1]
|
||||
if labels:
|
||||
trailing_dot = '.'
|
||||
|
||||
result = []
|
||||
size = 0
|
||||
for label in labels:
|
||||
result.append(alabel(label))
|
||||
if size:
|
||||
size += 1
|
||||
size += len(label)
|
||||
|
||||
# Join with U+002E
|
||||
result = ".".join(result) + trailing_dot
|
||||
size += len(trailing_dot)
|
||||
return (result, size)
|
||||
|
||||
class IncrementalDecoder(codecs.BufferedIncrementalDecoder):
|
||||
def _buffer_decode(self, data, errors, final):
|
||||
if errors != 'strict':
|
||||
raise IDNAError("Unsupported error handling \"{0}\"".format(errors))
|
||||
|
||||
if not data:
|
||||
return (u"", 0)
|
||||
|
||||
# IDNA allows decoding to operate on Unicode strings, too.
|
||||
if isinstance(data, unicode):
|
||||
labels = _unicode_dots_re.split(data)
|
||||
else:
|
||||
# Must be ASCII string
|
||||
data = str(data)
|
||||
unicode(data, "ascii")
|
||||
labels = data.split(".")
|
||||
|
||||
trailing_dot = u''
|
||||
if labels:
|
||||
if not labels[-1]:
|
||||
trailing_dot = u'.'
|
||||
del labels[-1]
|
||||
elif not final:
|
||||
# Keep potentially unfinished label until the next call
|
||||
del labels[-1]
|
||||
if labels:
|
||||
trailing_dot = u'.'
|
||||
|
||||
result = []
|
||||
size = 0
|
||||
for label in labels:
|
||||
result.append(ulabel(label))
|
||||
if size:
|
||||
size += 1
|
||||
size += len(label)
|
||||
|
||||
result = u".".join(result) + trailing_dot
|
||||
size += len(trailing_dot)
|
||||
return (result, size)
|
||||
|
||||
|
||||
class StreamWriter(Codec, codecs.StreamWriter):
|
||||
pass
|
||||
|
||||
class StreamReader(Codec, codecs.StreamReader):
|
||||
pass
|
||||
|
||||
def getregentry():
|
||||
return codecs.CodecInfo(
|
||||
name='idna',
|
||||
encode=Codec().encode,
|
||||
decode=Codec().decode,
|
||||
incrementalencoder=IncrementalEncoder,
|
||||
incrementaldecoder=IncrementalDecoder,
|
||||
streamwriter=StreamWriter,
|
||||
streamreader=StreamReader,
|
||||
)
|
lib/idna/compat.py (new file, 12 lines)
@@ -0,0 +1,12 @@
from .core import *
from .codec import *

def ToASCII(label):
    return encode(label)

def ToUnicode(label):
    return decode(label)

def nameprep(s):
    raise NotImplementedError("IDNA 2008 does not utilise nameprep protocol")
|
lib/idna/core.py (new file, 387 lines)
@@ -0,0 +1,387 @@
|
||||
from . import idnadata
|
||||
import bisect
|
||||
import unicodedata
|
||||
import re
|
||||
import sys
|
||||
from .intranges import intranges_contain
|
||||
|
||||
_virama_combining_class = 9
|
||||
_alabel_prefix = b'xn--'
|
||||
_unicode_dots_re = re.compile(u'[\u002e\u3002\uff0e\uff61]')
|
||||
|
||||
if sys.version_info[0] == 3:
|
||||
unicode = str
|
||||
unichr = chr
|
||||
|
||||
class IDNAError(UnicodeError):
|
||||
""" Base exception for all IDNA-encoding related problems """
|
||||
pass
|
||||
|
||||
|
||||
class IDNABidiError(IDNAError):
|
||||
""" Exception when bidirectional requirements are not satisfied """
|
||||
pass
|
||||
|
||||
|
||||
class InvalidCodepoint(IDNAError):
|
||||
""" Exception when a disallowed or unallocated codepoint is used """
|
||||
pass
|
||||
|
||||
|
||||
class InvalidCodepointContext(IDNAError):
|
||||
""" Exception when the codepoint is not valid in the context it is used """
|
||||
pass
|
||||
|
||||
|
||||
def _combining_class(cp):
|
||||
return unicodedata.combining(unichr(cp))
|
||||
|
||||
def _is_script(cp, script):
|
||||
return intranges_contain(ord(cp), idnadata.scripts[script])
|
||||
|
||||
def _punycode(s):
|
||||
return s.encode('punycode')
|
||||
|
||||
def _unot(s):
|
||||
return 'U+{0:04X}'.format(s)
|
||||
|
||||
|
||||
def valid_label_length(label):
|
||||
|
||||
if len(label) > 63:
|
||||
return False
|
||||
return True
|
||||
|
||||
|
||||
def valid_string_length(label, trailing_dot):
|
||||
|
||||
if len(label) > (254 if trailing_dot else 253):
|
||||
return False
|
||||
return True
|
||||
|
||||
|
||||
def check_bidi(label, check_ltr=False):
|
||||
|
||||
# Bidi rules should only be applied if string contains RTL characters
|
||||
bidi_label = False
|
||||
for (idx, cp) in enumerate(label, 1):
|
||||
direction = unicodedata.bidirectional(cp)
|
||||
if direction == '':
|
||||
# String likely comes from a newer version of Unicode
|
||||
raise IDNABidiError('Unknown directionality in label {0} at position {1}'.format(repr(label), idx))
|
||||
if direction in ['R', 'AL', 'AN']:
|
||||
bidi_label = True
|
||||
break
|
||||
if not bidi_label and not check_ltr:
|
||||
return True
|
||||
|
||||
# Bidi rule 1
|
||||
direction = unicodedata.bidirectional(label[0])
|
||||
if direction in ['R', 'AL']:
|
||||
rtl = True
|
||||
elif direction == 'L':
|
||||
rtl = False
|
||||
else:
|
||||
raise IDNABidiError('First codepoint in label {0} must be directionality L, R or AL'.format(repr(label)))
|
||||
|
||||
valid_ending = False
|
||||
number_type = False
|
||||
for (idx, cp) in enumerate(label, 1):
|
||||
direction = unicodedata.bidirectional(cp)
|
||||
|
||||
if rtl:
|
||||
# Bidi rule 2
|
||||
if not direction in ['R', 'AL', 'AN', 'EN', 'ES', 'CS', 'ET', 'ON', 'BN', 'NSM']:
|
||||
raise IDNABidiError('Invalid direction for codepoint at position {0} in a right-to-left label'.format(idx))
|
||||
# Bidi rule 3
|
||||
if direction in ['R', 'AL', 'EN', 'AN']:
|
||||
valid_ending = True
|
||||
elif direction != 'NSM':
|
||||
valid_ending = False
|
||||
# Bidi rule 4
|
||||
if direction in ['AN', 'EN']:
|
||||
if not number_type:
|
||||
number_type = direction
|
||||
else:
|
||||
if number_type != direction:
|
||||
raise IDNABidiError('Can not mix numeral types in a right-to-left label')
|
||||
else:
|
||||
# Bidi rule 5
|
||||
if not direction in ['L', 'EN', 'ES', 'CS', 'ET', 'ON', 'BN', 'NSM']:
|
||||
raise IDNABidiError('Invalid direction for codepoint at position {0} in a left-to-right label'.format(idx))
|
||||
# Bidi rule 6
|
||||
if direction in ['L', 'EN']:
|
||||
valid_ending = True
|
||||
elif direction != 'NSM':
|
||||
valid_ending = False
|
||||
|
||||
if not valid_ending:
|
||||
raise IDNABidiError('Label ends with illegal codepoint directionality')
|
||||
|
||||
return True
|
||||
|
||||
|
||||
def check_initial_combiner(label):
|
||||
|
||||
if unicodedata.category(label[0])[0] == 'M':
|
||||
raise IDNAError('Label begins with an illegal combining character')
|
||||
return True
|
||||
|
||||
|
||||
def check_hyphen_ok(label):
|
||||
|
||||
if label[2:4] == '--':
|
||||
raise IDNAError('Label has disallowed hyphens in 3rd and 4th position')
|
||||
if label[0] == '-' or label[-1] == '-':
|
||||
raise IDNAError('Label must not start or end with a hyphen')
|
||||
return True
|
||||
|
||||
|
||||
def check_nfc(label):
|
||||
|
||||
if unicodedata.normalize('NFC', label) != label:
|
||||
raise IDNAError('Label must be in Normalization Form C')
|
||||
|
||||
|
||||
def valid_contextj(label, pos):
|
||||
|
||||
cp_value = ord(label[pos])
|
||||
|
||||
if cp_value == 0x200c:
|
||||
|
||||
if pos > 0:
|
||||
if _combining_class(ord(label[pos - 1])) == _virama_combining_class:
|
||||
return True
|
||||
|
||||
ok = False
|
||||
for i in range(pos-1, -1, -1):
|
||||
joining_type = idnadata.joining_types.get(ord(label[i]))
|
||||
if joining_type == ord('T'):
|
||||
continue
|
||||
if joining_type in [ord('L'), ord('D')]:
|
||||
ok = True
|
||||
break
|
||||
|
||||
if not ok:
|
||||
return False
|
||||
|
||||
ok = False
|
||||
for i in range(pos+1, len(label)):
|
||||
joining_type = idnadata.joining_types.get(ord(label[i]))
|
||||
if joining_type == ord('T'):
|
||||
continue
|
||||
if joining_type in [ord('R'), ord('D')]:
|
||||
ok = True
|
||||
break
|
||||
return ok
|
||||
|
||||
if cp_value == 0x200d:
|
||||
|
||||
if pos > 0:
|
||||
if _combining_class(ord(label[pos - 1])) == _virama_combining_class:
|
||||
return True
|
||||
return False
|
||||
|
||||
else:
|
||||
|
||||
return False
|
||||
|
||||
|
||||
def valid_contexto(label, pos, exception=False):
|
||||
|
||||
cp_value = ord(label[pos])
|
||||
|
||||
if cp_value == 0x00b7:
|
||||
if 0 < pos < len(label)-1:
|
||||
if ord(label[pos - 1]) == 0x006c and ord(label[pos + 1]) == 0x006c:
|
||||
return True
|
||||
return False
|
||||
|
||||
elif cp_value == 0x0375:
|
||||
if pos < len(label)-1 and len(label) > 1:
|
||||
return _is_script(label[pos + 1], 'Greek')
|
||||
return False
|
||||
|
||||
elif cp_value == 0x05f3 or cp_value == 0x05f4:
|
||||
if pos > 0:
|
||||
return _is_script(label[pos - 1], 'Hebrew')
|
||||
return False
|
||||
|
||||
elif cp_value == 0x30fb:
|
||||
for cp in label:
|
||||
if cp == u'\u30fb':
|
||||
continue
|
||||
if _is_script(cp, 'Hiragana') or _is_script(cp, 'Katakana') or _is_script(cp, 'Han'):
|
||||
return True
|
||||
return False
|
||||
|
||||
elif 0x660 <= cp_value <= 0x669:
|
||||
for cp in label:
|
||||
if 0x6f0 <= ord(cp) <= 0x06f9:
|
||||
return False
|
||||
return True
|
||||
|
||||
elif 0x6f0 <= cp_value <= 0x6f9:
|
||||
for cp in label:
|
||||
if 0x660 <= ord(cp) <= 0x0669:
|
||||
return False
|
||||
return True
|
||||
|
||||
|
||||
def check_label(label):
|
||||
|
||||
if isinstance(label, (bytes, bytearray)):
|
||||
label = label.decode('utf-8')
|
||||
if len(label) == 0:
|
||||
raise IDNAError('Empty Label')
|
||||
|
||||
check_nfc(label)
|
||||
check_hyphen_ok(label)
|
||||
check_initial_combiner(label)
|
||||
|
||||
for (pos, cp) in enumerate(label):
|
||||
cp_value = ord(cp)
|
||||
if intranges_contain(cp_value, idnadata.codepoint_classes['PVALID']):
|
||||
continue
|
||||
elif intranges_contain(cp_value, idnadata.codepoint_classes['CONTEXTJ']):
|
||||
if not valid_contextj(label, pos):
|
||||
raise InvalidCodepointContext('Joiner {0} not allowed at position {1} in {2}'.format(_unot(cp_value), pos+1, repr(label)))
|
||||
elif intranges_contain(cp_value, idnadata.codepoint_classes['CONTEXTO']):
|
||||
if not valid_contexto(label, pos):
|
||||
raise InvalidCodepointContext('Codepoint {0} not allowed at position {1} in {2}'.format(_unot(cp_value), pos+1, repr(label)))
|
||||
else:
|
||||
raise InvalidCodepoint('Codepoint {0} at position {1} of {2} not allowed'.format(_unot(cp_value), pos+1, repr(label)))
|
||||
|
||||
check_bidi(label)
|
||||
|
||||
|
||||
def alabel(label):
|
||||
|
||||
try:
|
||||
label = label.encode('ascii')
|
||||
try:
|
||||
ulabel(label)
|
||||
except IDNAError:
|
||||
raise IDNAError('The label {0} is not a valid A-label'.format(label))
|
||||
if not valid_label_length(label):
|
||||
raise IDNAError('Label too long')
|
||||
return label
|
||||
except UnicodeEncodeError:
|
||||
pass
|
||||
|
||||
if not label:
|
||||
raise IDNAError('No Input')
|
||||
|
||||
label = unicode(label)
|
||||
check_label(label)
|
||||
label = _punycode(label)
|
||||
label = _alabel_prefix + label
|
||||
|
||||
if not valid_label_length(label):
|
||||
raise IDNAError('Label too long')
|
||||
|
||||
return label
|
||||
|
||||
|
||||
def ulabel(label):
|
||||
|
||||
if not isinstance(label, (bytes, bytearray)):
|
||||
try:
|
||||
label = label.encode('ascii')
|
||||
except UnicodeEncodeError:
|
||||
check_label(label)
|
||||
return label
|
||||
|
||||
label = label.lower()
|
||||
if label.startswith(_alabel_prefix):
|
||||
label = label[len(_alabel_prefix):]
|
||||
else:
|
||||
check_label(label)
|
||||
return label.decode('ascii')
|
||||
|
||||
label = label.decode('punycode')
|
||||
check_label(label)
|
||||
return label
|
||||
|
||||
|
||||
def uts46_remap(domain, std3_rules=True, transitional=False):
|
||||
"""Re-map the characters in the string according to UTS46 processing."""
|
||||
from .uts46data import uts46data
|
||||
output = u""
|
||||
try:
|
||||
for pos, char in enumerate(domain):
|
||||
code_point = ord(char)
|
||||
uts46row = uts46data[code_point if code_point < 256 else
|
||||
bisect.bisect_left(uts46data, (code_point, "Z")) - 1]
|
||||
status = uts46row[1]
|
||||
replacement = uts46row[2] if len(uts46row) == 3 else None
|
||||
if (status == "V" or
|
||||
(status == "D" and not transitional) or
|
||||
(status == "3" and std3_rules and replacement is None)):
|
||||
output += char
|
||||
elif replacement is not None and (status == "M" or
|
||||
(status == "3" and std3_rules) or
|
||||
(status == "D" and transitional)):
|
||||
output += replacement
|
||||
elif status != "I":
|
||||
raise IndexError()
|
||||
return unicodedata.normalize("NFC", output)
|
||||
except IndexError:
|
||||
raise InvalidCodepoint(
|
||||
"Codepoint {0} not allowed at position {1} in {2}".format(
|
||||
_unot(code_point), pos + 1, repr(domain)))
|
||||
|
||||
|
||||
def encode(s, strict=False, uts46=False, std3_rules=False, transitional=False):
|
||||
|
||||
if isinstance(s, (bytes, bytearray)):
|
||||
s = s.decode("ascii")
|
||||
if uts46:
|
||||
s = uts46_remap(s, std3_rules, transitional)
|
||||
trailing_dot = False
|
||||
result = []
|
||||
if strict:
|
||||
labels = s.split('.')
|
||||
else:
|
||||
labels = _unicode_dots_re.split(s)
|
||||
while labels and not labels[0]:
|
||||
del labels[0]
|
||||
if not labels:
|
||||
raise IDNAError('Empty domain')
|
||||
if labels[-1] == '':
|
||||
del labels[-1]
|
||||
trailing_dot = True
|
||||
for label in labels:
|
||||
result.append(alabel(label))
|
||||
if trailing_dot:
|
||||
result.append(b'')
|
||||
s = b'.'.join(result)
|
||||
if not valid_string_length(s, trailing_dot):
|
||||
raise IDNAError('Domain too long')
|
||||
return s
|
||||
|
||||
|
||||
def decode(s, strict=False, uts46=False, std3_rules=False):
|
||||
|
||||
if isinstance(s, (bytes, bytearray)):
|
||||
s = s.decode("ascii")
|
||||
if uts46:
|
||||
s = uts46_remap(s, std3_rules, False)
|
||||
trailing_dot = False
|
||||
result = []
|
||||
if not strict:
|
||||
labels = _unicode_dots_re.split(s)
|
||||
else:
|
||||
labels = s.split(u'.')
|
||||
while labels and not labels[0]:
|
||||
del labels[0]
|
||||
if not labels:
|
||||
raise IDNAError('Empty domain')
|
||||
if not labels[-1]:
|
||||
del labels[-1]
|
||||
trailing_dot = True
|
||||
for label in labels:
|
||||
result.append(ulabel(label))
|
||||
if trailing_dot:
|
||||
result.append(u'')
|
||||
return u'.'.join(result)
|
lib/idna/idnadata.py (new file, 1585 lines)
File diff suppressed because it is too large
lib/idna/intranges.py (new file, 53 lines)
@@ -0,0 +1,53 @@
|
||||
"""
|
||||
Given a list of integers, made up of (hopefully) a small number of long runs
|
||||
of consecutive integers, compute a representation of the form
|
||||
((start1, end1), (start2, end2) ...). Then answer the question "was x present
|
||||
in the original list?" in time O(log(# runs)).
|
||||
"""
|
||||
|
||||
import bisect
|
||||
|
||||
def intranges_from_list(list_):
|
||||
"""Represent a list of integers as a sequence of ranges:
|
||||
((start_0, end_0), (start_1, end_1), ...), such that the original
|
||||
integers are exactly those x such that start_i <= x < end_i for some i.
|
||||
|
||||
Ranges are encoded as single integers (start << 32 | end), not as tuples.
|
||||
"""
|
||||
|
||||
sorted_list = sorted(list_)
|
||||
ranges = []
|
||||
last_write = -1
|
||||
for i in range(len(sorted_list)):
|
||||
if i+1 < len(sorted_list):
|
||||
if sorted_list[i] == sorted_list[i+1]-1:
|
||||
continue
|
||||
current_range = sorted_list[last_write+1:i+1]
|
||||
ranges.append(_encode_range(current_range[0], current_range[-1] + 1))
|
||||
last_write = i
|
||||
|
||||
return tuple(ranges)
|
||||
|
||||
def _encode_range(start, end):
|
||||
return (start << 32) | end
|
||||
|
||||
def _decode_range(r):
|
||||
return (r >> 32), (r & ((1 << 32) - 1))
|
||||
|
||||
|
||||
def intranges_contain(int_, ranges):
|
||||
"""Determine if `int_` falls into one of the ranges in `ranges`."""
|
||||
tuple_ = _encode_range(int_, 0)
|
||||
pos = bisect.bisect_left(ranges, tuple_)
|
||||
# we could be immediately ahead of a tuple (start, end)
|
||||
# with start < int_ <= end
|
||||
if pos > 0:
|
||||
left, right = _decode_range(ranges[pos-1])
|
||||
if left <= int_ < right:
|
||||
return True
|
||||
# or we could be immediately behind a tuple (int_, end)
|
||||
if pos < len(ranges):
|
||||
left, _ = _decode_range(ranges[pos])
|
||||
if left == int_:
|
||||
return True
|
||||
return False
|
lib/idna/package_data.py (new file, 2 lines)
@@ -0,0 +1,2 @@
__version__ = '2.6'
|
lib/idna/uts46data.py (new file, 7634 lines)
File diff suppressed because it is too large
lib/jwt/__init__.py (new file, 29 lines)
@@ -0,0 +1,29 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
# flake8: noqa
|
||||
|
||||
"""
|
||||
JSON Web Token implementation
|
||||
|
||||
Minimum implementation based on this spec:
|
||||
http://self-issued.info/docs/draft-jones-json-web-token-01.html
|
||||
"""
|
||||
|
||||
|
||||
__title__ = 'pyjwt'
|
||||
__version__ = '1.4.0'
|
||||
__author__ = 'José Padilla'
|
||||
__license__ = 'MIT'
|
||||
__copyright__ = 'Copyright 2015 José Padilla'
|
||||
|
||||
|
||||
from .api_jwt import (
|
||||
encode, decode, register_algorithm, unregister_algorithm,
|
||||
get_unverified_header, PyJWT
|
||||
)
|
||||
from .api_jws import PyJWS
|
||||
from .exceptions import (
|
||||
InvalidTokenError, DecodeError, InvalidAudienceError,
|
||||
ExpiredSignatureError, ImmatureSignatureError, InvalidIssuedAtError,
|
||||
InvalidIssuerError, ExpiredSignature, InvalidAudience, InvalidIssuer,
|
||||
MissingRequiredClaimError
|
||||
)
|
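The module-level encode/decode exported here are bound to a shared PyJWT instance (defined in api_jwt.py further down). A minimal round trip, assuming the vendored package is importable as jwt:

```python
import jwt

token = jwt.encode({'some': 'payload'}, 'secret', algorithm='HS256')
claims = jwt.decode(token, 'secret', algorithms=['HS256'])   # {u'some': u'payload'}
```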
135 lib/jwt/__main__.py (new file)
@@ -0,0 +1,135 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
from __future__ import absolute_import, print_function
|
||||
|
||||
import json
|
||||
import optparse
|
||||
import sys
|
||||
import time
|
||||
|
||||
from . import DecodeError, __package__, __version__, decode, encode
|
||||
|
||||
|
||||
def main():
|
||||
|
||||
usage = '''Encodes or decodes JSON Web Tokens based on input.
|
||||
|
||||
%prog [options] input
|
||||
|
||||
Decoding examples:
|
||||
|
||||
%prog --key=secret json.web.token
|
||||
%prog --no-verify json.web.token
|
||||
|
||||
Encoding requires the key option and takes space separated key/value pairs
|
||||
separated by equals (=) as input. Examples:
|
||||
|
||||
%prog --key=secret iss=me exp=1302049071
|
||||
%prog --key=secret foo=bar exp=+10
|
||||
|
||||
The exp key is special and can take an offset to current Unix time.\
|
||||
'''
|
||||
p = optparse.OptionParser(
|
||||
usage=usage,
|
||||
prog=__package__,
|
||||
version='%s %s' % (__package__, __version__),
|
||||
)
|
||||
|
||||
p.add_option(
|
||||
'-n', '--no-verify',
|
||||
action='store_false',
|
||||
dest='verify',
|
||||
default=True,
|
||||
help='ignore signature verification on decode'
|
||||
)
|
||||
|
||||
p.add_option(
|
||||
'--key',
|
||||
dest='key',
|
||||
metavar='KEY',
|
||||
default=None,
|
||||
help='set the secret key to sign with'
|
||||
)
|
||||
|
||||
p.add_option(
|
||||
'--alg',
|
||||
dest='algorithm',
|
||||
metavar='ALG',
|
||||
default='HS256',
|
||||
help='set crypto algorithm to sign with. default=HS256'
|
||||
)
|
||||
|
||||
options, arguments = p.parse_args()
|
||||
|
||||
if len(arguments) > 0 or not sys.stdin.isatty():
|
||||
if len(arguments) == 1 and (not options.verify or options.key):
|
||||
# Try to decode
|
||||
try:
|
||||
if not sys.stdin.isatty():
|
||||
token = sys.stdin.read()
|
||||
else:
|
||||
token = arguments[0]
|
||||
|
||||
token = token.encode('utf-8')
|
||||
data = decode(token, key=options.key, verify=options.verify)
|
||||
|
||||
print(json.dumps(data))
|
||||
sys.exit(0)
|
||||
except DecodeError as e:
|
||||
print(e)
|
||||
sys.exit(1)
|
||||
|
||||
# Try to encode
|
||||
if options.key is None:
|
||||
print('Key is required when encoding. See --help for usage.')
|
||||
sys.exit(1)
|
||||
|
||||
# Build payload object to encode
|
||||
payload = {}
|
||||
|
||||
for arg in arguments:
|
||||
try:
|
||||
k, v = arg.split('=', 1)
|
||||
|
||||
# exp +offset special case?
|
||||
if k == 'exp' and v[0] == '+' and len(v) > 1:
|
||||
v = str(int(time.time()+int(v[1:])))
|
||||
|
||||
# Cast to integer?
|
||||
if v.isdigit():
|
||||
v = int(v)
|
||||
else:
|
||||
# Cast to float?
|
||||
try:
|
||||
v = float(v)
|
||||
except ValueError:
|
||||
pass
|
||||
|
||||
# Cast to true, false, or null?
|
||||
constants = {'true': True, 'false': False, 'null': None}
|
||||
|
||||
if v in constants:
|
||||
v = constants[v]
|
||||
|
||||
payload[k] = v
|
||||
except ValueError:
|
||||
print('Invalid encoding input at {}'.format(arg))
|
||||
sys.exit(1)
|
||||
|
||||
try:
|
||||
token = encode(
|
||||
payload,
|
||||
key=options.key,
|
||||
algorithm=options.algorithm
|
||||
)
|
||||
|
||||
print(token)
|
||||
sys.exit(0)
|
||||
except Exception as e:
|
||||
print(e)
|
||||
sys.exit(1)
|
||||
else:
|
||||
p.print_help()
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
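The command-line entry point above is roughly equivalent to the following library calls. This is only a sketch: the '+' prefix on exp is a CLI-only shortcut for "now plus N seconds", and the payload values below are illustrative.

```python
import time

import jwt

# Equivalent of: python -m jwt --key=secret iss=me exp=+300
payload = {'iss': 'me', 'exp': int(time.time()) + 300}
token = jwt.encode(payload, key='secret', algorithm='HS256')

# Equivalent of: python -m jwt --key=secret <token>
print(jwt.decode(token, key='secret', algorithms=['HS256']))
```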
290 lib/jwt/algorithms.py (new file)
@@ -0,0 +1,290 @@
|
||||
import hashlib
|
||||
import hmac
|
||||
|
||||
from .compat import constant_time_compare, string_types, text_type
|
||||
from .exceptions import InvalidKeyError
|
||||
from .utils import der_to_raw_signature, raw_to_der_signature
|
||||
|
||||
try:
|
||||
from cryptography.hazmat.primitives import hashes
|
||||
from cryptography.hazmat.primitives.serialization import (
|
||||
load_pem_private_key, load_pem_public_key, load_ssh_public_key
|
||||
)
|
||||
from cryptography.hazmat.primitives.asymmetric.rsa import (
|
||||
RSAPrivateKey, RSAPublicKey
|
||||
)
|
||||
from cryptography.hazmat.primitives.asymmetric.ec import (
|
||||
EllipticCurvePrivateKey, EllipticCurvePublicKey
|
||||
)
|
||||
from cryptography.hazmat.primitives.asymmetric import ec, padding
|
||||
from cryptography.hazmat.backends import default_backend
|
||||
from cryptography.exceptions import InvalidSignature
|
||||
|
||||
has_crypto = True
|
||||
except ImportError:
|
||||
has_crypto = False
|
||||
|
||||
|
||||
def get_default_algorithms():
|
||||
"""
|
||||
Returns the algorithms that are implemented by the library.
|
||||
"""
|
||||
default_algorithms = {
|
||||
'none': NoneAlgorithm(),
|
||||
'HS256': HMACAlgorithm(HMACAlgorithm.SHA256),
|
||||
'HS384': HMACAlgorithm(HMACAlgorithm.SHA384),
|
||||
'HS512': HMACAlgorithm(HMACAlgorithm.SHA512)
|
||||
}
|
||||
|
||||
if has_crypto:
|
||||
default_algorithms.update({
|
||||
'RS256': RSAAlgorithm(RSAAlgorithm.SHA256),
|
||||
'RS384': RSAAlgorithm(RSAAlgorithm.SHA384),
|
||||
'RS512': RSAAlgorithm(RSAAlgorithm.SHA512),
|
||||
'ES256': ECAlgorithm(ECAlgorithm.SHA256),
|
||||
'ES384': ECAlgorithm(ECAlgorithm.SHA384),
|
||||
'ES512': ECAlgorithm(ECAlgorithm.SHA512),
|
||||
'PS256': RSAPSSAlgorithm(RSAPSSAlgorithm.SHA256),
|
||||
'PS384': RSAPSSAlgorithm(RSAPSSAlgorithm.SHA384),
|
||||
'PS512': RSAPSSAlgorithm(RSAPSSAlgorithm.SHA512)
|
||||
})
|
||||
|
||||
return default_algorithms
|
||||
|
||||
|
||||
class Algorithm(object):
|
||||
"""
|
||||
The interface for an algorithm used to sign and verify tokens.
|
||||
"""
|
||||
def prepare_key(self, key):
|
||||
"""
|
||||
Performs necessary validation and conversions on the key and returns
|
||||
the key value in the proper format for sign() and verify().
|
||||
"""
|
||||
raise NotImplementedError
|
||||
|
||||
def sign(self, msg, key):
|
||||
"""
|
||||
Returns a digital signature for the specified message
|
||||
using the specified key value.
|
||||
"""
|
||||
raise NotImplementedError
|
||||
|
||||
def verify(self, msg, key, sig):
|
||||
"""
|
||||
Verifies that the specified digital signature is valid
|
||||
for the specified message and key values.
|
||||
"""
|
||||
raise NotImplementedError
|
||||
|
||||
|
||||
class NoneAlgorithm(Algorithm):
|
||||
"""
|
||||
Placeholder for use when no signing or verification
|
||||
operations are required.
|
||||
"""
|
||||
def prepare_key(self, key):
|
||||
if key == '':
|
||||
key = None
|
||||
|
||||
if key is not None:
|
||||
raise InvalidKeyError('When alg = "none", key value must be None.')
|
||||
|
||||
return key
|
||||
|
||||
def sign(self, msg, key):
|
||||
return b''
|
||||
|
||||
def verify(self, msg, key, sig):
|
||||
return False
|
||||
|
||||
|
||||
class HMACAlgorithm(Algorithm):
|
||||
"""
|
||||
Performs signing and verification operations using HMAC
|
||||
and the specified hash function.
|
||||
"""
|
||||
SHA256 = hashlib.sha256
|
||||
SHA384 = hashlib.sha384
|
||||
SHA512 = hashlib.sha512
|
||||
|
||||
def __init__(self, hash_alg):
|
||||
self.hash_alg = hash_alg
|
||||
|
||||
def prepare_key(self, key):
|
||||
if not isinstance(key, string_types) and not isinstance(key, bytes):
|
||||
raise TypeError('Expecting a string- or bytes-formatted key.')
|
||||
|
||||
if isinstance(key, text_type):
|
||||
key = key.encode('utf-8')
|
||||
|
||||
invalid_strings = [
|
||||
b'-----BEGIN PUBLIC KEY-----',
|
||||
b'-----BEGIN CERTIFICATE-----',
|
||||
b'ssh-rsa'
|
||||
]
|
||||
|
||||
if any([string_value in key for string_value in invalid_strings]):
|
||||
raise InvalidKeyError(
|
||||
'The specified key is an asymmetric key or x509 certificate and'
|
||||
' should not be used as an HMAC secret.')
|
||||
|
||||
return key
|
||||
|
||||
def sign(self, msg, key):
|
||||
return hmac.new(key, msg, self.hash_alg).digest()
|
||||
|
||||
def verify(self, msg, key, sig):
|
||||
return constant_time_compare(sig, self.sign(msg, key))
|
||||
|
||||
if has_crypto:
|
||||
|
||||
class RSAAlgorithm(Algorithm):
|
||||
"""
|
||||
Performs signing and verification operations using
|
||||
RSASSA-PKCS-v1_5 and the specified hash function.
|
||||
"""
|
||||
SHA256 = hashes.SHA256
|
||||
SHA384 = hashes.SHA384
|
||||
SHA512 = hashes.SHA512
|
||||
|
||||
def __init__(self, hash_alg):
|
||||
self.hash_alg = hash_alg
|
||||
|
||||
def prepare_key(self, key):
|
||||
if isinstance(key, RSAPrivateKey) or \
|
||||
isinstance(key, RSAPublicKey):
|
||||
return key
|
||||
|
||||
if isinstance(key, string_types):
|
||||
if isinstance(key, text_type):
|
||||
key = key.encode('utf-8')
|
||||
|
||||
try:
|
||||
if key.startswith(b'ssh-rsa'):
|
||||
key = load_ssh_public_key(key, backend=default_backend())
|
||||
else:
|
||||
key = load_pem_private_key(key, password=None, backend=default_backend())
|
||||
except ValueError:
|
||||
key = load_pem_public_key(key, backend=default_backend())
|
||||
else:
|
||||
raise TypeError('Expecting a PEM-formatted key.')
|
||||
|
||||
return key
|
||||
|
||||
def sign(self, msg, key):
|
||||
signer = key.signer(
|
||||
padding.PKCS1v15(),
|
||||
self.hash_alg()
|
||||
)
|
||||
|
||||
signer.update(msg)
|
||||
return signer.finalize()
|
||||
|
||||
def verify(self, msg, key, sig):
|
||||
verifier = key.verifier(
|
||||
sig,
|
||||
padding.PKCS1v15(),
|
||||
self.hash_alg()
|
||||
)
|
||||
|
||||
verifier.update(msg)
|
||||
|
||||
try:
|
||||
verifier.verify()
|
||||
return True
|
||||
except InvalidSignature:
|
||||
return False
|
||||
|
||||
class ECAlgorithm(Algorithm):
|
||||
"""
|
||||
Performs signing and verification operations using
|
||||
ECDSA and the specified hash function
|
||||
"""
|
||||
SHA256 = hashes.SHA256
|
||||
SHA384 = hashes.SHA384
|
||||
SHA512 = hashes.SHA512
|
||||
|
||||
def __init__(self, hash_alg):
|
||||
self.hash_alg = hash_alg
|
||||
|
||||
def prepare_key(self, key):
|
||||
if isinstance(key, EllipticCurvePrivateKey) or \
|
||||
isinstance(key, EllipticCurvePublicKey):
|
||||
return key
|
||||
|
||||
if isinstance(key, string_types):
|
||||
if isinstance(key, text_type):
|
||||
key = key.encode('utf-8')
|
||||
|
||||
# Attempt to load key. We don't know if it's
|
||||
# a Signing Key or a Verifying Key, so we try
|
||||
# the Verifying Key first.
|
||||
try:
|
||||
key = load_pem_public_key(key, backend=default_backend())
|
||||
except ValueError:
|
||||
key = load_pem_private_key(key, password=None, backend=default_backend())
|
||||
|
||||
else:
|
||||
raise TypeError('Expecting a PEM-formatted key.')
|
||||
|
||||
return key
|
||||
|
||||
def sign(self, msg, key):
|
||||
signer = key.signer(ec.ECDSA(self.hash_alg()))
|
||||
|
||||
signer.update(msg)
|
||||
der_sig = signer.finalize()
|
||||
|
||||
return der_to_raw_signature(der_sig, key.curve)
|
||||
|
||||
def verify(self, msg, key, sig):
|
||||
try:
|
||||
der_sig = raw_to_der_signature(sig, key.curve)
|
||||
except ValueError:
|
||||
return False
|
||||
|
||||
verifier = key.verifier(der_sig, ec.ECDSA(self.hash_alg()))
|
||||
|
||||
verifier.update(msg)
|
||||
|
||||
try:
|
||||
verifier.verify()
|
||||
return True
|
||||
except InvalidSignature:
|
||||
return False
|
||||
|
||||
class RSAPSSAlgorithm(RSAAlgorithm):
|
||||
"""
|
||||
Performs a signature using RSASSA-PSS with MGF1
|
||||
"""
|
||||
|
||||
def sign(self, msg, key):
|
||||
signer = key.signer(
|
||||
padding.PSS(
|
||||
mgf=padding.MGF1(self.hash_alg()),
|
||||
salt_length=self.hash_alg.digest_size
|
||||
),
|
||||
self.hash_alg()
|
||||
)
|
||||
|
||||
signer.update(msg)
|
||||
return signer.finalize()
|
||||
|
||||
def verify(self, msg, key, sig):
|
||||
verifier = key.verifier(
|
||||
sig,
|
||||
padding.PSS(
|
||||
mgf=padding.MGF1(self.hash_alg()),
|
||||
salt_length=self.hash_alg.digest_size
|
||||
),
|
||||
self.hash_alg()
|
||||
)
|
||||
|
||||
verifier.update(msg)
|
||||
|
||||
try:
|
||||
verifier.verify()
|
||||
return True
|
||||
except InvalidSignature:
|
||||
return False
|
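A sketch of using one of these Algorithm classes directly, outside the JWS/JWT wrappers. Note that the RS*/ES*/PS* entries are only registered when the optional cryptography package imports successfully (the has_crypto flag above); the message bytes here are arbitrary.

```python
from jwt.algorithms import HMACAlgorithm, get_default_algorithms

alg = HMACAlgorithm(HMACAlgorithm.SHA256)
key = alg.prepare_key('secret')              # refuses PEM/certificate material as HMAC secrets
sig = alg.sign(b'header.payload', key)
assert alg.verify(b'header.payload', key, sig)

print(sorted(get_default_algorithms()))      # at least the HS* family plus 'none'
```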
189 lib/jwt/api_jws.py (new file)
@@ -0,0 +1,189 @@
|
||||
import binascii
|
||||
import json
|
||||
import warnings
|
||||
|
||||
from collections import Mapping
|
||||
|
||||
from .algorithms import Algorithm, get_default_algorithms # NOQA
|
||||
from .compat import text_type
|
||||
from .exceptions import DecodeError, InvalidAlgorithmError
|
||||
from .utils import base64url_decode, base64url_encode, merge_dict
|
||||
|
||||
|
||||
class PyJWS(object):
|
||||
header_typ = 'JWT'
|
||||
|
||||
def __init__(self, algorithms=None, options=None):
|
||||
self._algorithms = get_default_algorithms()
|
||||
self._valid_algs = (set(algorithms) if algorithms is not None
|
||||
else set(self._algorithms))
|
||||
|
||||
# Remove algorithms that aren't on the whitelist
|
||||
for key in list(self._algorithms.keys()):
|
||||
if key not in self._valid_algs:
|
||||
del self._algorithms[key]
|
||||
|
||||
if not options:
|
||||
options = {}
|
||||
|
||||
self.options = merge_dict(self._get_default_options(), options)
|
||||
|
||||
@staticmethod
|
||||
def _get_default_options():
|
||||
return {
|
||||
'verify_signature': True
|
||||
}
|
||||
|
||||
def register_algorithm(self, alg_id, alg_obj):
|
||||
"""
|
||||
Registers a new Algorithm for use when creating and verifying tokens.
|
||||
"""
|
||||
if alg_id in self._algorithms:
|
||||
raise ValueError('Algorithm already has a handler.')
|
||||
|
||||
if not isinstance(alg_obj, Algorithm):
|
||||
raise TypeError('Object is not of type `Algorithm`')
|
||||
|
||||
self._algorithms[alg_id] = alg_obj
|
||||
self._valid_algs.add(alg_id)
|
||||
|
||||
def unregister_algorithm(self, alg_id):
|
||||
"""
|
||||
Unregisters an Algorithm for use when creating and verifying tokens
|
||||
Throws KeyError if algorithm is not registered.
|
||||
"""
|
||||
if alg_id not in self._algorithms:
|
||||
raise KeyError('The specified algorithm could not be removed'
|
||||
' because it is not registered.')
|
||||
|
||||
del self._algorithms[alg_id]
|
||||
self._valid_algs.remove(alg_id)
|
||||
|
||||
def get_algorithms(self):
|
||||
"""
|
||||
Returns a list of supported values for the 'alg' parameter.
|
||||
"""
|
||||
return list(self._valid_algs)
|
||||
|
||||
def encode(self, payload, key, algorithm='HS256', headers=None,
|
||||
json_encoder=None):
|
||||
segments = []
|
||||
|
||||
if algorithm is None:
|
||||
algorithm = 'none'
|
||||
|
||||
if algorithm not in self._valid_algs:
|
||||
pass
|
||||
|
||||
# Header
|
||||
header = {'typ': self.header_typ, 'alg': algorithm}
|
||||
|
||||
if headers:
|
||||
header.update(headers)
|
||||
|
||||
json_header = json.dumps(
|
||||
header,
|
||||
separators=(',', ':'),
|
||||
cls=json_encoder
|
||||
).encode('utf-8')
|
||||
|
||||
segments.append(base64url_encode(json_header))
|
||||
segments.append(base64url_encode(payload))
|
||||
|
||||
# Segments
|
||||
signing_input = b'.'.join(segments)
|
||||
try:
|
||||
alg_obj = self._algorithms[algorithm]
|
||||
key = alg_obj.prepare_key(key)
|
||||
signature = alg_obj.sign(signing_input, key)
|
||||
|
||||
except KeyError:
|
||||
raise NotImplementedError('Algorithm not supported')
|
||||
|
||||
segments.append(base64url_encode(signature))
|
||||
|
||||
return b'.'.join(segments)
|
||||
|
||||
def decode(self, jws, key='', verify=True, algorithms=None, options=None,
|
||||
**kwargs):
|
||||
payload, signing_input, header, signature = self._load(jws)
|
||||
|
||||
if verify:
|
||||
merged_options = merge_dict(self.options, options)
|
||||
if merged_options.get('verify_signature'):
|
||||
self._verify_signature(payload, signing_input, header, signature,
|
||||
key, algorithms)
|
||||
else:
|
||||
warnings.warn('The verify parameter is deprecated. '
|
||||
'Please use options instead.', DeprecationWarning)
|
||||
|
||||
return payload
|
||||
|
||||
def get_unverified_header(self, jwt):
|
||||
"""Returns back the JWT header parameters as a dict()
|
||||
|
||||
Note: The signature is not verified so the header parameters
|
||||
should not be fully trusted until signature verification is complete
|
||||
"""
|
||||
return self._load(jwt)[2]
|
||||
|
||||
def _load(self, jwt):
|
||||
if isinstance(jwt, text_type):
|
||||
jwt = jwt.encode('utf-8')
|
||||
|
||||
try:
|
||||
signing_input, crypto_segment = jwt.rsplit(b'.', 1)
|
||||
header_segment, payload_segment = signing_input.split(b'.', 1)
|
||||
except ValueError:
|
||||
raise DecodeError('Not enough segments')
|
||||
|
||||
try:
|
||||
header_data = base64url_decode(header_segment)
|
||||
except (TypeError, binascii.Error):
|
||||
raise DecodeError('Invalid header padding')
|
||||
|
||||
try:
|
||||
header = json.loads(header_data.decode('utf-8'))
|
||||
except ValueError as e:
|
||||
raise DecodeError('Invalid header string: %s' % e)
|
||||
|
||||
if not isinstance(header, Mapping):
|
||||
raise DecodeError('Invalid header string: must be a json object')
|
||||
|
||||
try:
|
||||
payload = base64url_decode(payload_segment)
|
||||
except (TypeError, binascii.Error):
|
||||
raise DecodeError('Invalid payload padding')
|
||||
|
||||
try:
|
||||
signature = base64url_decode(crypto_segment)
|
||||
except (TypeError, binascii.Error):
|
||||
raise DecodeError('Invalid crypto padding')
|
||||
|
||||
return (payload, signing_input, header, signature)
|
||||
|
||||
def _verify_signature(self, payload, signing_input, header, signature,
|
||||
key='', algorithms=None):
|
||||
|
||||
alg = header.get('alg')
|
||||
|
||||
if algorithms is not None and alg not in algorithms:
|
||||
raise InvalidAlgorithmError('The specified alg value is not allowed')
|
||||
|
||||
try:
|
||||
alg_obj = self._algorithms[alg]
|
||||
key = alg_obj.prepare_key(key)
|
||||
|
||||
if not alg_obj.verify(signing_input, key, signature):
|
||||
raise DecodeError('Signature verification failed')
|
||||
|
||||
except KeyError:
|
||||
raise InvalidAlgorithmError('Algorithm not supported')
|
||||
|
||||
|
||||
_jws_global_obj = PyJWS()
|
||||
encode = _jws_global_obj.encode
|
||||
decode = _jws_global_obj.decode
|
||||
register_algorithm = _jws_global_obj.register_algorithm
|
||||
unregister_algorithm = _jws_global_obj.unregister_algorithm
|
||||
get_unverified_header = _jws_global_obj.get_unverified_header
|
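PyJWS signs opaque byte payloads and does no claim handling; a minimal sketch of the class added above:

```python
from jwt.api_jws import PyJWS

jws = PyJWS()
token = jws.encode(b'{"hello": "world"}', 'secret', algorithm='HS256')
payload = jws.decode(token, 'secret', algorithms=['HS256'])   # b'{"hello": "world"}'
```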
187 lib/jwt/api_jwt.py (new file)
@@ -0,0 +1,187 @@
|
||||
import json
|
||||
import warnings
|
||||
|
||||
from calendar import timegm
|
||||
from collections import Mapping
|
||||
from datetime import datetime, timedelta
|
||||
|
||||
from .api_jws import PyJWS
|
||||
from .algorithms import Algorithm, get_default_algorithms # NOQA
|
||||
from .compat import string_types, timedelta_total_seconds
|
||||
from .exceptions import (
|
||||
DecodeError, ExpiredSignatureError, ImmatureSignatureError,
|
||||
InvalidAudienceError, InvalidIssuedAtError,
|
||||
InvalidIssuerError, MissingRequiredClaimError
|
||||
)
|
||||
from .utils import merge_dict
|
||||
|
||||
|
||||
class PyJWT(PyJWS):
|
||||
header_type = 'JWT'
|
||||
|
||||
@staticmethod
|
||||
def _get_default_options():
|
||||
return {
|
||||
'verify_signature': True,
|
||||
'verify_exp': True,
|
||||
'verify_nbf': True,
|
||||
'verify_iat': True,
|
||||
'verify_aud': True,
|
||||
'verify_iss': True,
|
||||
'require_exp': False,
|
||||
'require_iat': False,
|
||||
'require_nbf': False
|
||||
}
|
||||
|
||||
def encode(self, payload, key, algorithm='HS256', headers=None,
|
||||
json_encoder=None):
|
||||
# Check that we get a mapping
|
||||
if not isinstance(payload, Mapping):
|
||||
raise TypeError('Expecting a mapping object, as JWT only supports '
|
||||
'JSON objects as payloads.')
|
||||
|
||||
# Payload
|
||||
for time_claim in ['exp', 'iat', 'nbf']:
|
||||
# Convert datetime to a intDate value in known time-format claims
|
||||
if isinstance(payload.get(time_claim), datetime):
|
||||
payload[time_claim] = timegm(payload[time_claim].utctimetuple())
|
||||
|
||||
json_payload = json.dumps(
|
||||
payload,
|
||||
separators=(',', ':'),
|
||||
cls=json_encoder
|
||||
).encode('utf-8')
|
||||
|
||||
return super(PyJWT, self).encode(
|
||||
json_payload, key, algorithm, headers, json_encoder
|
||||
)
|
||||
|
||||
def decode(self, jwt, key='', verify=True, algorithms=None, options=None,
|
||||
**kwargs):
|
||||
payload, signing_input, header, signature = self._load(jwt)
|
||||
|
||||
decoded = super(PyJWT, self).decode(jwt, key, verify, algorithms,
|
||||
options, **kwargs)
|
||||
|
||||
try:
|
||||
payload = json.loads(decoded.decode('utf-8'))
|
||||
except ValueError as e:
|
||||
raise DecodeError('Invalid payload string: %s' % e)
|
||||
if not isinstance(payload, Mapping):
|
||||
raise DecodeError('Invalid payload string: must be a json object')
|
||||
|
||||
if verify:
|
||||
merged_options = merge_dict(self.options, options)
|
||||
self._validate_claims(payload, merged_options, **kwargs)
|
||||
|
||||
return payload
|
||||
|
||||
def _validate_claims(self, payload, options, audience=None, issuer=None,
|
||||
leeway=0, **kwargs):
|
||||
|
||||
if 'verify_expiration' in kwargs:
|
||||
options['verify_exp'] = kwargs.get('verify_expiration', True)
|
||||
warnings.warn('The verify_expiration parameter is deprecated. '
|
||||
'Please use options instead.', DeprecationWarning)
|
||||
|
||||
if isinstance(leeway, timedelta):
|
||||
leeway = timedelta_total_seconds(leeway)
|
||||
|
||||
if not isinstance(audience, (string_types, type(None))):
|
||||
raise TypeError('audience must be a string or None')
|
||||
|
||||
self._validate_required_claims(payload, options)
|
||||
|
||||
now = timegm(datetime.utcnow().utctimetuple())
|
||||
|
||||
if 'iat' in payload and options.get('verify_iat'):
|
||||
self._validate_iat(payload, now, leeway)
|
||||
|
||||
if 'nbf' in payload and options.get('verify_nbf'):
|
||||
self._validate_nbf(payload, now, leeway)
|
||||
|
||||
if 'exp' in payload and options.get('verify_exp'):
|
||||
self._validate_exp(payload, now, leeway)
|
||||
|
||||
if options.get('verify_iss'):
|
||||
self._validate_iss(payload, issuer)
|
||||
|
||||
if options.get('verify_aud'):
|
||||
self._validate_aud(payload, audience)
|
||||
|
||||
def _validate_required_claims(self, payload, options):
|
||||
if options.get('require_exp') and payload.get('exp') is None:
|
||||
raise MissingRequiredClaimError('exp')
|
||||
|
||||
if options.get('require_iat') and payload.get('iat') is None:
|
||||
raise MissingRequiredClaimError('iat')
|
||||
|
||||
if options.get('require_nbf') and payload.get('nbf') is None:
|
||||
raise MissingRequiredClaimError('nbf')
|
||||
|
||||
def _validate_iat(self, payload, now, leeway):
|
||||
try:
|
||||
iat = int(payload['iat'])
|
||||
except ValueError:
|
||||
raise DecodeError('Issued At claim (iat) must be an integer.')
|
||||
|
||||
if iat > (now + leeway):
|
||||
raise InvalidIssuedAtError('Issued At claim (iat) cannot be in'
|
||||
' the future.')
|
||||
|
||||
def _validate_nbf(self, payload, now, leeway):
|
||||
try:
|
||||
nbf = int(payload['nbf'])
|
||||
except ValueError:
|
||||
raise DecodeError('Not Before claim (nbf) must be an integer.')
|
||||
|
||||
if nbf > (now + leeway):
|
||||
raise ImmatureSignatureError('The token is not yet valid (nbf)')
|
||||
|
||||
def _validate_exp(self, payload, now, leeway):
|
||||
try:
|
||||
exp = int(payload['exp'])
|
||||
except ValueError:
|
||||
raise DecodeError('Expiration Time claim (exp) must be an'
|
||||
' integer.')
|
||||
|
||||
if exp < (now - leeway):
|
||||
raise ExpiredSignatureError('Signature has expired')
|
||||
|
||||
def _validate_aud(self, payload, audience):
|
||||
if audience is None and 'aud' not in payload:
|
||||
return
|
||||
|
||||
if audience is not None and 'aud' not in payload:
|
||||
# Application specified an audience, but it could not be
|
||||
# verified since the token does not contain a claim.
|
||||
raise MissingRequiredClaimError('aud')
|
||||
|
||||
audience_claims = payload['aud']
|
||||
|
||||
if isinstance(audience_claims, string_types):
|
||||
audience_claims = [audience_claims]
|
||||
if not isinstance(audience_claims, list):
|
||||
raise InvalidAudienceError('Invalid claim format in token')
|
||||
if any(not isinstance(c, string_types) for c in audience_claims):
|
||||
raise InvalidAudienceError('Invalid claim format in token')
|
||||
if audience not in audience_claims:
|
||||
raise InvalidAudienceError('Invalid audience')
|
||||
|
||||
def _validate_iss(self, payload, issuer):
|
||||
if issuer is None:
|
||||
return
|
||||
|
||||
if 'iss' not in payload:
|
||||
raise MissingRequiredClaimError('iss')
|
||||
|
||||
if payload['iss'] != issuer:
|
||||
raise InvalidIssuerError('Invalid issuer')
|
||||
|
||||
|
||||
_jwt_global_obj = PyJWT()
|
||||
encode = _jwt_global_obj.encode
|
||||
decode = _jwt_global_obj.decode
|
||||
register_algorithm = _jwt_global_obj.register_algorithm
|
||||
unregister_algorithm = _jwt_global_obj.unregister_algorithm
|
||||
get_unverified_header = _jwt_global_obj.get_unverified_header
|
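PyJWT layers claim validation on top of PyJWS. A sketch of the exp/leeway behaviour implemented by _validate_exp() above (the five-minute offset is arbitrary):

```python
from datetime import datetime, timedelta

import jwt
from jwt.exceptions import ExpiredSignatureError

token = jwt.encode({'exp': datetime.utcnow() - timedelta(minutes=5)}, 'secret')

try:
    jwt.decode(token, 'secret', algorithms=['HS256'])
except ExpiredSignatureError:
    pass  # expired tokens are rejected by default

# A leeway (seconds or a timedelta) relaxes the comparison
jwt.decode(token, 'secret', algorithms=['HS256'], leeway=600)
```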
52 lib/jwt/compat.py (new file)
@@ -0,0 +1,52 @@
"""
The `compat` module provides support for backwards compatibility with older
versions of python, and compatibility wrappers around optional packages.
"""
# flake8: noqa
import sys
import hmac


PY3 = sys.version_info[0] == 3


if PY3:
    string_types = str,
    text_type = str
else:
    string_types = basestring,
    text_type = unicode


def timedelta_total_seconds(delta):
    try:
        delta.total_seconds
    except AttributeError:
        # On Python 2.6, timedelta instances do not have
        # a .total_seconds() method.
        total_seconds = delta.days * 24 * 60 * 60 + delta.seconds
    else:
        total_seconds = delta.total_seconds()

    return total_seconds


try:
    constant_time_compare = hmac.compare_digest
except AttributeError:
    # Fallback for Python < 2.7
    def constant_time_compare(val1, val2):
        """
        Returns True if the two strings are equal, False otherwise.

        The time taken is independent of the number of characters that match.
        """
        if len(val1) != len(val2):
            return False

        result = 0

        for x, y in zip(val1, val2):
            result |= ord(x) ^ ord(y)

        return result == 0
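A quick sketch of the two helpers this shim provides; the sample values are arbitrary.

```python
from datetime import timedelta

from jwt.compat import constant_time_compare, timedelta_total_seconds

constant_time_compare(b'sig-a', b'sig-a')      # True
constant_time_compare(b'sig-a', b'sig-b')      # False, with no early exit on the first mismatch
timedelta_total_seconds(timedelta(minutes=2))  # 120.0
```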
0 lib/jwt/contrib/__init__.py (new file, empty)
0 lib/jwt/contrib/algorithms/__init__.py (new file, empty)

60 lib/jwt/contrib/algorithms/py_ecdsa.py (new file)
@@ -0,0 +1,60 @@
# Note: This file is named py_ecdsa.py because import behavior in Python 2
# would cause ecdsa.py to squash the ecdsa library that it depends upon.

import hashlib

import ecdsa

from jwt.algorithms import Algorithm
from jwt.compat import string_types, text_type


class ECAlgorithm(Algorithm):
    """
    Performs signing and verification operations using
    ECDSA and the specified hash function

    This class requires the ecdsa package to be installed.

    This is based off of the implementation in PyJWT 0.3.2
    """
    SHA256 = hashlib.sha256
    SHA384 = hashlib.sha384
    SHA512 = hashlib.sha512

    def __init__(self, hash_alg):
        self.hash_alg = hash_alg

    def prepare_key(self, key):

        if isinstance(key, ecdsa.SigningKey) or \
           isinstance(key, ecdsa.VerifyingKey):
            return key

        if isinstance(key, string_types):
            if isinstance(key, text_type):
                key = key.encode('utf-8')

            # Attempt to load key. We don't know if it's
            # a Signing Key or a Verifying Key, so we try
            # the Verifying Key first.
            try:
                key = ecdsa.VerifyingKey.from_pem(key)
            except ecdsa.der.UnexpectedDER:
                key = ecdsa.SigningKey.from_pem(key)

        else:
            raise TypeError('Expecting a PEM-formatted key.')

        return key

    def sign(self, msg, key):
        return key.sign(msg, hashfunc=self.hash_alg,
                        sigencode=ecdsa.util.sigencode_string)

    def verify(self, msg, key, sig):
        try:
            return key.verify(sig, msg, hashfunc=self.hash_alg,
                              sigdecode=ecdsa.util.sigdecode_string)
        except AssertionError:
            return False
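This contrib class is not registered by default. Swapping it in for the cryptography-based ES256 handler would look roughly like the sketch below; note that register_algorithm() raises ValueError if 'ES256' is already taken and unregister_algorithm() raises KeyError if it is not registered, which is why both paths are guarded.

```python
import jwt
from jwt.contrib.algorithms.py_ecdsa import ECAlgorithm

try:
    jwt.unregister_algorithm('ES256')   # present only when cryptography is installed
except KeyError:
    pass

jwt.register_algorithm('ES256', ECAlgorithm(ECAlgorithm.SHA256))
```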
47 lib/jwt/contrib/algorithms/pycrypto.py (new file)
@@ -0,0 +1,47 @@
import Crypto.Hash.SHA256
import Crypto.Hash.SHA384
import Crypto.Hash.SHA512

from Crypto.PublicKey import RSA
from Crypto.Signature import PKCS1_v1_5

from jwt.algorithms import Algorithm
from jwt.compat import string_types, text_type


class RSAAlgorithm(Algorithm):
    """
    Performs signing and verification operations using
    RSASSA-PKCS-v1_5 and the specified hash function.

    This class requires PyCrypto package to be installed.

    This is based off of the implementation in PyJWT 0.3.2
    """
    SHA256 = Crypto.Hash.SHA256
    SHA384 = Crypto.Hash.SHA384
    SHA512 = Crypto.Hash.SHA512

    def __init__(self, hash_alg):
        self.hash_alg = hash_alg

    def prepare_key(self, key):

        if isinstance(key, RSA._RSAobj):
            return key

        if isinstance(key, string_types):
            if isinstance(key, text_type):
                key = key.encode('utf-8')

            key = RSA.importKey(key)
        else:
            raise TypeError('Expecting a PEM- or RSA-formatted key.')

        return key

    def sign(self, msg, key):
        return PKCS1_v1_5.new(key).sign(self.hash_alg.new(msg))

    def verify(self, msg, key, sig):
        return PKCS1_v1_5.new(key).verify(self.hash_alg.new(msg), sig)
48 lib/jwt/exceptions.py (new file)
@@ -0,0 +1,48 @@
class InvalidTokenError(Exception):
    pass


class DecodeError(InvalidTokenError):
    pass


class ExpiredSignatureError(InvalidTokenError):
    pass


class InvalidAudienceError(InvalidTokenError):
    pass


class InvalidIssuerError(InvalidTokenError):
    pass


class InvalidIssuedAtError(InvalidTokenError):
    pass


class ImmatureSignatureError(InvalidTokenError):
    pass


class InvalidKeyError(Exception):
    pass


class InvalidAlgorithmError(InvalidTokenError):
    pass


class MissingRequiredClaimError(InvalidTokenError):
    def __init__(self, claim):
        self.claim = claim

    def __str__(self):
        return 'Token is missing the "%s" claim' % self.claim


# Compatibility aliases (deprecated)
ExpiredSignature = ExpiredSignatureError
InvalidAudience = InvalidAudienceError
InvalidIssuer = InvalidIssuerError
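All token-related failures above derive from InvalidTokenError (InvalidKeyError is the one exception, deriving from Exception), so callers can catch a single base class. A small sketch:

```python
import jwt
from jwt.exceptions import InvalidTokenError

try:
    jwt.decode('not.a.token', 'secret', algorithms=['HS256'])
except InvalidTokenError as e:    # covers DecodeError, ExpiredSignatureError, ...
    print('rejected: %s' % e)
```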
67 lib/jwt/utils.py (new file)
@@ -0,0 +1,67 @@
import base64
import binascii

try:
    from cryptography.hazmat.primitives.asymmetric.utils import (
        decode_rfc6979_signature, encode_rfc6979_signature
    )
except ImportError:
    pass


def base64url_decode(input):
    rem = len(input) % 4

    if rem > 0:
        input += b'=' * (4 - rem)

    return base64.urlsafe_b64decode(input)


def base64url_encode(input):
    return base64.urlsafe_b64encode(input).replace(b'=', b'')


def merge_dict(original, updates):
    if not updates:
        return original

    try:
        merged_options = original.copy()
        merged_options.update(updates)
    except (AttributeError, ValueError) as e:
        raise TypeError('original and updates must be a dictionary: %s' % e)

    return merged_options


def number_to_bytes(num, num_bytes):
    padded_hex = '%0*x' % (2 * num_bytes, num)
    big_endian = binascii.a2b_hex(padded_hex.encode('ascii'))
    return big_endian


def bytes_to_number(string):
    return int(binascii.b2a_hex(string), 16)


def der_to_raw_signature(der_sig, curve):
    num_bits = curve.key_size
    num_bytes = (num_bits + 7) // 8

    r, s = decode_rfc6979_signature(der_sig)

    return number_to_bytes(r, num_bytes) + number_to_bytes(s, num_bytes)


def raw_to_der_signature(raw_sig, curve):
    num_bits = curve.key_size
    num_bytes = (num_bits + 7) // 8

    if len(raw_sig) != 2 * num_bytes:
        raise ValueError('Invalid signature')

    r = bytes_to_number(raw_sig[:num_bytes])
    s = bytes_to_number(raw_sig[num_bytes:])

    return encode_rfc6979_signature(r, s)
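The base64url helpers strip and restore padding around the standard library's URL-safe codec. A small round-trip sketch with arbitrary bytes:

```python
from jwt.utils import base64url_decode, base64url_encode

encoded = base64url_encode(b'\xfb\xff')   # b'-_8': URL-safe alphabet, '=' padding stripped
base64url_decode(encoded)                 # b'\xfb\xff': padding is re-added before decoding
```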
@@ -34,7 +34,7 @@ from apscheduler.triggers.interval import IntervalTrigger
|
||||
|
||||
import activity_handler
|
||||
import activity_pinger
|
||||
import config
|
||||
import common
|
||||
import database
|
||||
import libraries
|
||||
import logger
|
||||
@@ -42,7 +42,6 @@ import mobile_app
|
||||
import notification_handler
|
||||
import notifiers
|
||||
import plextv
|
||||
import pmsconnect
|
||||
import users
|
||||
import versioncheck
|
||||
import plexpy.config
|
||||
@@ -83,6 +82,7 @@ INSTALL_TYPE = None
|
||||
CURRENT_VERSION = None
|
||||
LATEST_VERSION = None
|
||||
COMMITS_BEHIND = None
|
||||
PREV_RELEASE = None
|
||||
|
||||
UMASK = None
|
||||
|
||||
@@ -102,7 +102,9 @@ def initialize(config_file):
|
||||
global _INITIALIZED
|
||||
global CURRENT_VERSION
|
||||
global LATEST_VERSION
|
||||
global PREV_RELEASE
|
||||
global UMASK
|
||||
|
||||
CONFIG = plexpy.config.Config(config_file)
|
||||
CONFIG_FILE = config_file
|
||||
|
||||
@@ -175,17 +177,32 @@ def initialize(config_file):
|
||||
# Check if Tautulli has a uuid
|
||||
if CONFIG.PMS_UUID == '' or not CONFIG.PMS_UUID:
|
||||
logger.debug(u"Generating UUID...")
|
||||
my_uuid = generate_uuid()
|
||||
CONFIG.__setattr__('PMS_UUID', my_uuid)
|
||||
CONFIG.PMS_UUID = generate_uuid()
|
||||
CONFIG.write()
|
||||
|
||||
# Check if Tautulli has an API key
|
||||
if CONFIG.API_KEY == '':
|
||||
logger.debug(u"Generating API key...")
|
||||
api_key = generate_uuid()
|
||||
CONFIG.__setattr__('API_KEY', api_key)
|
||||
CONFIG.API_KEY = generate_uuid()
|
||||
CONFIG.write()
|
||||
|
||||
# Check if Tautulli has a jwt_secret
|
||||
if CONFIG.JWT_SECRET == '' or not CONFIG.JWT_SECRET:
|
||||
logger.debug(u"Generating JWT secret...")
|
||||
CONFIG.JWT_SECRET = generate_uuid()
|
||||
CONFIG.write()
|
||||
|
||||
# Get the previous version from the file
|
||||
version_lock_file = os.path.join(DATA_DIR, "version.lock")
|
||||
prev_version = None
|
||||
if os.path.isfile(version_lock_file):
|
||||
try:
|
||||
with open(version_lock_file, "r") as fp:
|
||||
prev_version = fp.read()
|
||||
except IOError as e:
|
||||
logger.error(u"Unable to read previous version from file '%s': %s" %
|
||||
(version_lock_file, e))
|
||||
|
||||
# Get the currently installed version. Returns None, 'win32' or the git
|
||||
# hash.
|
||||
CURRENT_VERSION, CONFIG.GIT_REMOTE, CONFIG.GIT_BRANCH = versioncheck.getVersion()
|
||||
@@ -194,8 +211,6 @@ def initialize(config_file):
|
||||
# This allowes one to restore to that version. The idea is that if we
|
||||
# arrive here, most parts of Tautulli seem to work.
|
||||
if CURRENT_VERSION:
|
||||
version_lock_file = os.path.join(DATA_DIR, "version.lock")
|
||||
|
||||
try:
|
||||
with open(version_lock_file, "w") as fp:
|
||||
fp.write(CURRENT_VERSION)
|
||||
@@ -213,6 +228,32 @@ def initialize(config_file):
|
||||
else:
|
||||
LATEST_VERSION = CURRENT_VERSION
|
||||
|
||||
# Get the previous release from the file
|
||||
release_file = os.path.join(DATA_DIR, "release.lock")
|
||||
PREV_RELEASE = common.VERSION_NUMBER
|
||||
if os.path.isfile(release_file):
|
||||
try:
|
||||
with open(release_file, "r") as fp:
|
||||
PREV_RELEASE = fp.read()
|
||||
except IOError as e:
|
||||
logger.error(u"Unable to read previous release from file '%s': %s" %
|
||||
(release_file, e))
|
||||
elif prev_version == 'cfd30996264b7e9fe4ef87f02d1cc52d1ae8bfca': # Commit hash for v1.4.25
|
||||
PREV_RELEASE = 'v1.4.25'
|
||||
|
||||
# Check if the release was updated
|
||||
if common.VERSION_NUMBER != PREV_RELEASE:
|
||||
CONFIG.UPDATE_SHOW_CHANGELOG = 1
|
||||
CONFIG.write()
|
||||
|
||||
# Write current release version to file for update checking
|
||||
try:
|
||||
with open(release_file, "w") as fp:
|
||||
fp.write(common.VERSION_NUMBER)
|
||||
except IOError as e:
|
||||
logger.error(u"Unable to write current release to file '%s': %s" %
|
||||
(release_file, e))
|
||||
|
||||
# Get the real PMS urls for SSL and remote access
|
||||
if CONFIG.PMS_TOKEN and CONFIG.PMS_IP and CONFIG.PMS_PORT:
|
||||
plextv.get_server_resources()
|
||||
@@ -341,7 +382,7 @@ def initialize_scheduler():
|
||||
schedule_job(libraries.refresh_libraries, 'Refresh libraries list',
|
||||
hours=library_hours, minutes=0, seconds=0)
|
||||
|
||||
schedule_job(activity_pinger.check_server_response, 'Check server response',
|
||||
schedule_job(activity_pinger.check_server_response, 'Check for server response',
|
||||
hours=0, minutes=0, seconds=0)
|
||||
|
||||
else:
|
||||
@@ -363,7 +404,7 @@ def initialize_scheduler():
|
||||
response_seconds = CONFIG.WEBSOCKET_CONNECTION_ATTEMPTS * CONFIG.WEBSOCKET_CONNECTION_TIMEOUT
|
||||
response_seconds = 60 if response_seconds < 60 else response_seconds
|
||||
|
||||
schedule_job(activity_pinger.check_server_response, 'Check server response',
|
||||
schedule_job(activity_pinger.check_server_response, 'Check for server response',
|
||||
hours=0, minutes=0, seconds=response_seconds)
|
||||
|
||||
# Start scheduler
|
||||
@@ -406,6 +447,7 @@ def start():
|
||||
|
||||
# Start background notification thread
|
||||
notification_handler.start_threads(num_threads=CONFIG.NOTIFICATION_THREADS)
|
||||
notifiers.check_browser_enabled()
|
||||
|
||||
_STARTED = True
|
||||
|
||||
@@ -443,6 +485,7 @@ def dbcheck():
|
||||
'transcode_protocol TEXT, transcode_container TEXT, '
|
||||
'transcode_video_codec TEXT, transcode_audio_codec TEXT, transcode_audio_channels INTEGER,'
|
||||
'transcode_width INTEGER, transcode_height INTEGER, '
|
||||
'transcode_hw_decoding INTEGER, transcode_hw_encoding INTEGER, '
|
||||
'optimized_version INTEGER, optimized_version_profile TEXT, optimized_version_title TEXT, '
|
||||
'synced_version INTEGER, synced_version_profile TEXT, '
|
||||
'buffer_count INTEGER DEFAULT 0, buffer_last_triggered INTEGER, last_paused INTEGER, write_attempts INTEGER DEFAULT 0, '
|
||||
@@ -468,8 +511,9 @@ def dbcheck():
|
||||
'audio_bitrate INTEGER, audio_codec TEXT, audio_channels INTEGER, transcode_protocol TEXT, '
|
||||
'transcode_container TEXT, transcode_video_codec TEXT, transcode_audio_codec TEXT, '
|
||||
'transcode_audio_channels INTEGER, transcode_width INTEGER, transcode_height INTEGER, '
|
||||
'transcode_hw_requested INTEGER, transcode_hw_full_pipeline INTEGER, transcode_hw_decode TEXT, '
|
||||
'transcode_hw_decode_title TEXT, transcode_hw_encode TEXT, transcode_hw_encode_title TEXT, '
|
||||
'transcode_hw_requested INTEGER, transcode_hw_full_pipeline INTEGER, '
|
||||
'transcode_hw_decode TEXT, transcode_hw_decode_title TEXT, transcode_hw_decoding INTEGER, '
|
||||
'transcode_hw_encode TEXT, transcode_hw_encode_title TEXT, transcode_hw_encoding INTEGER, '
|
||||
'stream_container TEXT, stream_container_decision TEXT, stream_bitrate INTEGER, '
|
||||
'stream_video_decision TEXT, stream_video_bitrate INTEGER, stream_video_codec TEXT, stream_video_codec_level TEXT, '
|
||||
'stream_video_bit_depth INTEGER, stream_video_height INTEGER, stream_video_width INTEGER, stream_video_resolution TEXT, '
|
||||
@@ -496,7 +540,7 @@ def dbcheck():
|
||||
c_db.execute(
|
||||
'CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY AUTOINCREMENT, '
|
||||
'user_id INTEGER DEFAULT NULL UNIQUE, username TEXT NOT NULL, friendly_name TEXT, '
|
||||
'thumb TEXT, custom_avatar_url TEXT, email TEXT, is_home_user INTEGER DEFAULT NULL, '
|
||||
'thumb TEXT, custom_avatar_url TEXT, email TEXT, is_admin INTEGER DEFAULT 0, is_home_user INTEGER DEFAULT NULL, '
|
||||
'is_allow_sync INTEGER DEFAULT NULL, is_restricted INTEGER DEFAULT NULL, do_notify INTEGER DEFAULT 1, '
|
||||
'keep_history INTEGER DEFAULT 1, deleted_user INTEGER DEFAULT 0, allow_guest INTEGER DEFAULT 0, '
|
||||
'user_token TEXT, server_token TEXT, shared_libraries TEXT, filter_all TEXT, filter_movies TEXT, filter_tv TEXT, '
|
||||
@@ -917,6 +961,18 @@ def dbcheck():
|
||||
'ALTER TABLE sessions ADD COLUMN optimized_version_title TEXT'
|
||||
)
|
||||
|
||||
# Upgrade sessions table from earlier versions
|
||||
try:
|
||||
c_db.execute('SELECT transcode_hw_decoding FROM sessions')
|
||||
except sqlite3.OperationalError:
|
||||
logger.debug(u"Altering database. Updating database table sessions.")
|
||||
c_db.execute(
|
||||
'ALTER TABLE sessions ADD COLUMN transcode_hw_decoding INTEGER'
|
||||
)
|
||||
c_db.execute(
|
||||
'ALTER TABLE sessions ADD COLUMN transcode_hw_encoding INTEGER'
|
||||
)
|
||||
|
||||
# Upgrade session_history table from earlier versions
|
||||
try:
|
||||
c_db.execute('SELECT reference_id FROM session_history')
|
||||
@@ -1159,6 +1215,43 @@ def dbcheck():
|
||||
'ALTER TABLE session_history_media_info ADD COLUMN optimized_version_title TEXT '
|
||||
)
|
||||
|
||||
# Upgrade session_history_media_info table from earlier versions
|
||||
try:
|
||||
c_db.execute('SELECT transcode_hw_decoding FROM session_history_media_info')
|
||||
except sqlite3.OperationalError:
|
||||
logger.debug(u"Altering database. Updating database table session_history_media_info.")
|
||||
c_db.execute(
|
||||
'ALTER TABLE session_history_media_info ADD COLUMN transcode_hw_decoding INTEGER '
|
||||
)
|
||||
c_db.execute(
|
||||
'ALTER TABLE session_history_media_info ADD COLUMN transcode_hw_encoding INTEGER '
|
||||
)
|
||||
c_db.execute(
|
||||
'UPDATE session_history_media_info SET subtitle_codec = "" WHERE subtitle_codec IS NULL '
|
||||
)
|
||||
|
||||
|
||||
# Upgrade session_history_media_info table from earlier versions
|
||||
try:
|
||||
result = c_db.execute('SELECT stream_container FROM session_history_media_info '
|
||||
'WHERE stream_container IS NULL').fetchall()
|
||||
if len(result) > 0:
|
||||
logger.debug(u"Altering database. Removing NULL values from session_history_media_info table.")
|
||||
c_db.execute(
|
||||
'UPDATE session_history_media_info SET stream_container = "" WHERE stream_container IS NULL '
|
||||
)
|
||||
c_db.execute(
|
||||
'UPDATE session_history_media_info SET stream_video_codec = "" WHERE stream_video_codec IS NULL '
|
||||
)
|
||||
c_db.execute(
|
||||
'UPDATE session_history_media_info SET stream_audio_codec = "" WHERE stream_audio_codec IS NULL '
|
||||
)
|
||||
c_db.execute(
|
||||
'UPDATE session_history_media_info SET stream_subtitle_codec = "" WHERE stream_subtitle_codec IS NULL '
|
||||
)
|
||||
except sqlite3.OperationalError:
|
||||
logger.warn(u"Unable to remove NULL values from session_history_media_info table.")
|
||||
|
||||
# Upgrade users table from earlier versions
|
||||
try:
|
||||
c_db.execute('SELECT do_notify FROM users')
|
||||
@@ -1234,6 +1327,15 @@ def dbcheck():
|
||||
'ALTER TABLE users ADD COLUMN filter_photos TEXT'
|
||||
)
|
||||
|
||||
# Upgrade users table from earlier versions
|
||||
try:
|
||||
c_db.execute('SELECT is_admin FROM users')
|
||||
except sqlite3.OperationalError:
|
||||
logger.debug(u"Altering database. Updating database table users.")
|
||||
c_db.execute(
|
||||
'ALTER TABLE users ADD COLUMN is_admin INTEGER DEFAULT 0'
|
||||
)
|
||||
|
||||
# Upgrade notify_log table from earlier versions
|
||||
try:
|
||||
c_db.execute('SELECT poster_url FROM notify_log')
|
||||
@@ -1340,8 +1442,8 @@ def dbcheck():
|
||||
|
||||
# Upgrade library_sections table from earlier versions (remove duplicated libraries)
|
||||
try:
|
||||
result = c_db.execute('SELECT * FROM library_sections WHERE server_id = ""')
|
||||
if result.rowcount > 0:
|
||||
result = c_db.execute('SELECT * FROM library_sections WHERE server_id = ""').fetchall()
|
||||
if len(result) > 0:
|
||||
logger.debug(u"Altering database. Removing duplicate libraries from library_sections table.")
|
||||
c_db.execute(
|
||||
'DELETE FROM library_sections WHERE server_id = ""'
|
||||
@@ -1485,6 +1587,7 @@ def upgrade():
|
||||
def shutdown(restart=False, update=False, checkout=False):
|
||||
cherrypy.engine.exit()
|
||||
SCHED.shutdown(wait=False)
|
||||
activity_handler.ACTIVITY_SCHED.shutdown(wait=False)
|
||||
|
||||
# Stop the notification threads
|
||||
for i in range(CONFIG.NOTIFICATION_THREADS):
|
||||
@@ -1515,23 +1618,35 @@ def shutdown(restart=False, update=False, checkout=False):
|
||||
|
||||
if restart:
|
||||
logger.info(u"Tautulli is restarting...")
|
||||
|
||||
exe = sys.executable
|
||||
args = [exe, FULL_PATH]
|
||||
args += ARGS
|
||||
if '--nolaunch' not in args:
|
||||
args += ['--nolaunch']
|
||||
|
||||
# os.execv fails with spaced names on Windows
|
||||
# https://bugs.python.org/issue19066
|
||||
# Separate out logger so we can shutdown logger after
|
||||
if NOFORK:
|
||||
logger.info('Running as service, not forking. Exiting...')
|
||||
elif os.name == 'nt':
|
||||
logger.info('Restarting Tautulli with %s', args)
|
||||
subprocess.Popen(args, cwd=os.getcwd())
|
||||
else:
|
||||
logger.info('Restarting Tautulli with %s', args)
|
||||
|
||||
logger.shutdown()
|
||||
|
||||
# os.execv fails with spaced names on Windows
|
||||
# https://bugs.python.org/issue19066
|
||||
if NOFORK:
|
||||
pass
|
||||
elif os.name == 'nt':
|
||||
subprocess.Popen(args, cwd=os.getcwd())
|
||||
else:
|
||||
os.execv(exe, args)
|
||||
|
||||
else:
|
||||
logger.shutdown()
|
||||
|
||||
os._exit(0)
|
||||
|
||||
|
||||
|
@@ -14,7 +14,7 @@
|
||||
# along with Tautulli. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
import datetime
|
||||
import threading
|
||||
import os
|
||||
import time
|
||||
|
||||
from apscheduler.schedulers.background import BackgroundScheduler
|
||||
@@ -26,7 +26,6 @@ import datafactory
|
||||
import helpers
|
||||
import logger
|
||||
import notification_handler
|
||||
import notifiers
|
||||
import pmsconnect
|
||||
|
||||
|
||||
@@ -34,6 +33,7 @@ ACTIVITY_SCHED = BackgroundScheduler()
|
||||
|
||||
RECENTLY_ADDED_QUEUE = {}
|
||||
|
||||
|
||||
class ActivityHandler(object):
|
||||
|
||||
def __init__(self, timeline):
|
||||
@@ -54,7 +54,7 @@ class ActivityHandler(object):
|
||||
|
||||
def get_rating_key(self):
|
||||
if self.is_valid_session():
|
||||
return int(self.timeline['ratingKey'])
|
||||
return self.timeline['ratingKey']
|
||||
|
||||
return None
|
||||
|
||||
@@ -65,6 +65,10 @@ class ActivityHandler(object):
|
||||
if session_list:
|
||||
for session in session_list['sessions']:
|
||||
if int(session['session_key']) == self.get_session_key():
|
||||
# Live sessions don't have rating keys in sessions
|
||||
# Get it from the websocket data
|
||||
if not session['rating_key']:
|
||||
session['rating_key'] = self.get_rating_key()
|
||||
return session
|
||||
|
||||
return None
|
||||
@@ -75,9 +79,12 @@ class ActivityHandler(object):
|
||||
monitor_proc.write_session(session=session, notify=False)
|
||||
|
||||
def on_start(self):
|
||||
if self.is_valid_session() and self.get_live_session():
|
||||
if self.is_valid_session():
|
||||
session = self.get_live_session()
|
||||
|
||||
if not session:
|
||||
return
|
||||
|
||||
# Some DLNA clients create a new session temporarily when browsing the library
|
||||
# Wait and get session again to make sure it is an actual session
|
||||
if session['platform'] == 'DLNA':
|
||||
@@ -90,14 +97,15 @@ class ActivityHandler(object):
|
||||
% (str(session['session_key']), str(session['user_id']), session['username'],
|
||||
str(session['rating_key']), session['full_title']))
|
||||
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': session, 'notify_action': 'on_play'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': session.copy(), 'notify_action': 'on_play'})
|
||||
|
||||
# Write the new session to our temp session table
|
||||
self.update_db_session(session=session)
|
||||
|
||||
def on_stop(self, force_stop=False):
|
||||
if self.is_valid_session():
|
||||
logger.debug(u"Tautulli ActivityHandler :: Session %s stopped." % str(self.get_session_key()))
|
||||
logger.debug(u"Tautulli ActivityHandler :: Session %s %sstopped."
|
||||
% (str(self.get_session_key()), 'force ' if force_stop else ''))
|
||||
|
||||
# Set the session last_paused timestamp
|
||||
ap = activity_processor.ActivityProcessor()
|
||||
@@ -114,16 +122,23 @@ class ActivityHandler(object):
|
||||
# Retrieve the session data from our temp table
|
||||
db_session = ap.get_session_by_key(session_key=self.get_session_key())
|
||||
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': db_session, 'notify_action': 'on_stop'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': db_session.copy(), 'notify_action': 'on_stop'})
|
||||
|
||||
# Write it to the history table
|
||||
monitor_proc = activity_processor.ActivityProcessor()
|
||||
monitor_proc.write_session_history(session=db_session)
|
||||
row_id = monitor_proc.write_session_history(session=db_session)
|
||||
|
||||
if row_id:
|
||||
schedule_callback('session_key-{}'.format(self.get_session_key()), remove_job=True)
|
||||
|
||||
# Remove the session from our temp session table
|
||||
logger.debug(u"Tautulli ActivityHandler :: Removing sessionKey %s ratingKey %s from session queue"
|
||||
% (str(self.get_session_key()), str(self.get_rating_key())))
|
||||
ap.delete_session(session_key=self.get_session_key())
|
||||
ap.delete_session(row_id=row_id)
|
||||
delete_metadata_cache(self.get_session_key())
|
||||
else:
|
||||
schedule_callback('session_key-{}'.format(self.get_session_key()), func=force_stop_stream,
|
||||
args=[self.get_session_key()], seconds=30)
|
||||
|
||||
def on_pause(self, still_paused=False):
|
||||
if self.is_valid_session():
|
||||
@@ -144,7 +159,7 @@ class ActivityHandler(object):
|
||||
db_session = ap.get_session_by_key(session_key=self.get_session_key())
|
||||
|
||||
if not still_paused:
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': db_session, 'notify_action': 'on_pause'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': db_session.copy(), 'notify_action': 'on_pause'})
|
||||
|
||||
def on_resume(self):
|
||||
if self.is_valid_session():
|
||||
@@ -163,7 +178,7 @@ class ActivityHandler(object):
|
||||
# Retrieve the session data from our temp table
|
||||
db_session = ap.get_session_by_key(session_key=self.get_session_key())
|
||||
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': db_session, 'notify_action': 'on_resume'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': db_session.copy(), 'notify_action': 'on_resume'})
|
||||
|
||||
def on_buffer(self):
|
||||
if self.is_valid_session():
|
||||
@@ -201,7 +216,7 @@ class ActivityHandler(object):
|
||||
# Retrieve the session data from our temp table
|
||||
db_session = ap.get_session_by_key(session_key=self.get_session_key())
|
||||
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': db_session, 'notify_action': 'on_buffer'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': db_session.copy(), 'notify_action': 'on_buffer'})
|
||||
|
||||
# This function receives events from our websocket connection
|
||||
def process(self):
|
||||
@@ -216,7 +231,7 @@ class ActivityHandler(object):
|
||||
if db_session:
|
||||
# Re-schedule the callback to reset the 5 minutes timer
|
||||
schedule_callback('session_key-{}'.format(self.get_session_key()),
|
||||
function=force_stop_stream, args=[self.get_session_key()], minutes=5)
|
||||
func=force_stop_stream, args=[self.get_session_key()], minutes=5)
|
||||
|
||||
last_state = db_session['state']
|
||||
last_key = str(db_session['rating_key'])
|
||||
@@ -226,6 +241,8 @@ class ActivityHandler(object):
|
||||
# Update the session state and viewOffset
|
||||
if this_state == 'playing':
|
||||
# Update the session in our temp session table
|
||||
# if the last set temporary stopped time exceeds 15 seconds
|
||||
if int(time.time()) - db_session['stopped'] > 60:
|
||||
session = self.get_live_session()
|
||||
if session:
|
||||
self.update_db_session(session=session)
|
||||
@@ -239,9 +256,6 @@ class ActivityHandler(object):
|
||||
elif this_state == 'stopped':
|
||||
self.on_stop()
|
||||
|
||||
# Remove the callback if the stream is stopped
|
||||
schedule_callback('session_key-{}'.format(self.get_session_key()), remove_job=True)
|
||||
|
||||
elif this_state == 'buffering':
|
||||
self.on_buffer()
|
||||
|
||||
@@ -265,7 +279,7 @@ class ActivityHandler(object):
|
||||
db_session['media_type'] == 'episode' and progress_percent >= plexpy.CONFIG.TV_WATCHED_PERCENT or
|
||||
db_session['media_type'] == 'track' and progress_percent >= plexpy.CONFIG.MUSIC_WATCHED_PERCENT) \
|
||||
and not any(d['notify_action'] == 'on_watched' for d in notify_states):
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': db_session, 'notify_action': 'on_watched'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': db_session.copy(), 'notify_action': 'on_watched'})
|
||||
|
||||
else:
|
||||
# We don't have this session in our table yet, start a new one.
|
||||
@@ -274,7 +288,7 @@ class ActivityHandler(object):
|
||||
|
||||
# Schedule a callback to force stop a stale stream 5 minutes later
|
||||
schedule_callback('session_key-{}'.format(self.get_session_key()),
|
||||
function=force_stop_stream, args=[self.get_session_key()], minutes=5)
|
||||
func=force_stop_stream, args=[self.get_session_key()], minutes=5)
|
||||
|
||||
|
||||
class TimelineHandler(object):
|
||||
@@ -318,6 +332,7 @@ class TimelineHandler(object):
|
||||
9: 'album',
|
||||
10: 'track'}
|
||||
|
||||
identifier = self.timeline.get('identifier')
|
||||
state_type = self.timeline.get('state')
|
||||
media_type = media_types.get(self.timeline.get('type'))
|
||||
section_id = self.timeline.get('sectionID', 0)
|
||||
@@ -326,6 +341,10 @@ class TimelineHandler(object):
|
||||
media_state = self.timeline.get('mediaState')
|
||||
queue_size = self.timeline.get('queueSize')
|
||||
|
||||
# Return if it is not a library event (i.e. DVR EPG event)
|
||||
if identifier != 'com.plexapp.plugins.library':
|
||||
return
|
||||
|
||||
# Add a new media item to the recently added queue
|
||||
if media_type and section_id > 0 and \
|
||||
((state_type == 0 and metadata_state == 'created')): # or \
|
||||
@@ -352,7 +371,7 @@ class TimelineHandler(object):
|
||||
% (title, str(rating_key), str(grandparent_rating_key)))
|
||||
|
||||
# Schedule a callback to clear the recently added queue
|
||||
schedule_callback('rating_key-{}'.format(grandparent_rating_key), function=clear_recently_added_queue,
|
||||
schedule_callback('rating_key-{}'.format(grandparent_rating_key), func=clear_recently_added_queue,
|
||||
args=[grandparent_rating_key], seconds=plexpy.CONFIG.NOTIFY_RECENTLY_ADDED_DELAY)
|
||||
|
||||
elif media_type in ('season', 'album'):
|
||||
@@ -368,7 +387,7 @@ class TimelineHandler(object):
|
||||
% (title, str(rating_key), str(parent_rating_key)))
|
||||
|
||||
# Schedule a callback to clear the recently added queue
|
||||
schedule_callback('rating_key-{}'.format(parent_rating_key), function=clear_recently_added_queue,
|
||||
schedule_callback('rating_key-{}'.format(parent_rating_key), func=clear_recently_added_queue,
|
||||
args=[parent_rating_key], seconds=plexpy.CONFIG.NOTIFY_RECENTLY_ADDED_DELAY)
|
||||
|
||||
else:
|
||||
@@ -379,7 +398,7 @@ class TimelineHandler(object):
|
||||
% (title, str(rating_key)))
|
||||
|
||||
# Schedule a callback to clear the recently added queue
|
||||
schedule_callback('rating_key-{}'.format(rating_key), function=clear_recently_added_queue,
|
||||
schedule_callback('rating_key-{}'.format(rating_key), func=clear_recently_added_queue,
|
||||
args=[rating_key], seconds=plexpy.CONFIG.NOTIFY_RECENTLY_ADDED_DELAY)
|
||||
|
||||
# A movie, show, or artist is done processing
|
||||
@@ -409,7 +428,7 @@ def del_keys(key):
        del_keys(RECENTLY_ADDED_QUEUE.pop(key))


-def schedule_callback(id, function=None, remove_job=False, args=None, **kwargs):
+def schedule_callback(id, func=None, remove_job=False, args=None, **kwargs):
    if ACTIVITY_SCHED.get_job(id):
        if remove_job:
            ACTIVITY_SCHED.remove_job(id)

@@ -419,7 +438,7 @@ def schedule_callback(id, function=None, remove_job=False, args=None, **kwargs):
                run_date=datetime.datetime.now() + datetime.timedelta(**kwargs)))
    elif not remove_job:
        ACTIVITY_SCHED.add_job(
-           function, args=args, id=id, trigger=DateTrigger(
+           func, args=args, id=id, trigger=DateTrigger(
                run_date=datetime.datetime.now() + datetime.timedelta(**kwargs)))

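For context, `schedule_callback()` is a thin wrapper around APScheduler: the renamed `func` argument is handed straight to `add_job()` with a one-shot `DateTrigger`. A minimal sketch of that pattern, assuming a plain `BackgroundScheduler` stands in for Tautulli's `ACTIVITY_SCHED`; the reschedule branch and the demo callback are illustrative, not copied from the source:

```
import datetime

from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.triggers.date import DateTrigger

ACTIVITY_SCHED = BackgroundScheduler()
ACTIVITY_SCHED.start()


def schedule_callback(id, func=None, remove_job=False, args=None, **kwargs):
    # One pending job per id: drop it when remove_job is set, otherwise
    # (re)schedule it to fire once at now + timedelta(**kwargs),
    # e.g. minutes=5 or seconds=30.
    if ACTIVITY_SCHED.get_job(id):
        if remove_job:
            ACTIVITY_SCHED.remove_job(id)
        else:
            ACTIVITY_SCHED.reschedule_job(
                id, trigger=DateTrigger(
                    run_date=datetime.datetime.now() + datetime.timedelta(**kwargs)))
    elif not remove_job:
        ACTIVITY_SCHED.add_job(
            func, args=args, id=id, trigger=DateTrigger(
                run_date=datetime.datetime.now() + datetime.timedelta(**kwargs)))


def force_stop_stream(session_key):
    print('force stopping session %s' % session_key)


# Mirror of the call in ActivityHandler: stop session 42 five minutes from now.
schedule_callback('session_key-42', func=force_stop_stream, args=[42], minutes=5)
```

Calling the wrapper again with `remove_job=True` cancels the pending force-stop job, which is what the `on_stop` hunk earlier in this file does.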
@@ -427,34 +446,36 @@ def force_stop_stream(session_key):
    ap = activity_processor.ActivityProcessor()
    session = ap.get_session_by_key(session_key=session_key)

-   success = ap.write_session_history(session=session)
+   row_id = ap.write_session_history(session=session)

-   if success:
-       # If session is written to the databaase successfully, remove the session from the session table
+   if row_id:
+       # If session is written to the database successfully, remove the session from the session table
        logger.info(u"Tautulli ActivityHandler :: Removing stale stream with sessionKey %s ratingKey %s from session queue"
                    % (session['session_key'], session['rating_key']))
-       ap.delete_session(session_key=session_key)
+       ap.delete_session(row_id=row_id)
        delete_metadata_cache(session_key)

    else:
-       sessions['write_attempts'] += 1
+       session['write_attempts'] += 1

-       if sessions['write_attempts'] < plexpy.CONFIG.SESSION_DB_WRITE_ATTEMPTS:
+       if session['write_attempts'] < plexpy.CONFIG.SESSION_DB_WRITE_ATTEMPTS:
            logger.warn(u"Tautulli ActivityHandler :: Failed to write stream with sessionKey %s ratingKey %s to the database. " \
                        "Will try again in 30 seconds. Write attempt %s."
-                       % (sessions['session_key'], sessions['rating_key'], str(sessions['write_attempts'])))
+                       % (session['session_key'], session['rating_key'], str(session['write_attempts'])))
            ap.increment_write_attempts(session_key=session_key)

            # Reschedule for 30 seconds later
-           schedule_callback('session_key={}'.format(session_key), function=force_stop_stream,
+           schedule_callback('session_key-{}'.format(session_key), func=force_stop_stream,
                              args=[session_key], seconds=30)

        else:
-           logger.warn(u"Tautulli Monitor :: Failed to write stream with sessionKey %s ratingKey %s to the database. " \
+           logger.warn(u"Tautulli ActivityHandler :: Failed to write stream with sessionKey %s ratingKey %s to the database. " \
                        "Removing session from the database. Write attempt %s."
-                       % (sessions['session_key'], sessions['rating_key'], str(sessions['write_attempts'])))
-           logger.info(u"Tautulli Monitor :: Removing stale stream with sessionKey %s ratingKey %s from session queue"
-                       % (sessions['session_key'], sessions['rating_key']))
+                       % (session['session_key'], session['rating_key'], str(session['write_attempts'])))
+           logger.info(u"Tautulli ActivityHandler :: Removing stale stream with sessionKey %s ratingKey %s from session queue"
+                       % (session['session_key'], session['rating_key']))
            ap.delete_session(session_key=session_key)
            delete_metadata_cache(session_key)

def clear_recently_added_queue(rating_key):
|
||||
@@ -491,16 +512,18 @@ def on_created(rating_key, **kwargs):
|
||||
|
||||
if metadata:
|
||||
notify = True
|
||||
now = int(time.time())
|
||||
|
||||
if helpers.cast_to_int(metadata['updated_at']) < now - 86400: # Updated more than 24 hours ago
|
||||
logger.debug(u"Tautulli TimelineHandler :: Library item %s updated more than 24 hours ago. Not notifying." % str(rating_key))
|
||||
notify = False
|
||||
# now = int(time.time())
|
||||
#
|
||||
# if helpers.cast_to_int(metadata['added_at']) < now - 86400: # Updated more than 24 hours ago
|
||||
# logger.debug(u"Tautulli TimelineHandler :: Library item %s added more than 24 hours ago. Not notifying."
|
||||
# % str(rating_key))
|
||||
# notify = False
|
||||
|
||||
data_factory = datafactory.DataFactory()
|
||||
if 'child_keys' not in kwargs:
|
||||
if data_factory.get_recently_added_item(rating_key):
|
||||
logger.debug(u"Tautulli TimelineHandler :: Library item %s added already. Not notifying again." % str(rating_key))
|
||||
logger.debug(u"Tautulli TimelineHandler :: Library item %s added already. Not notifying again."
|
||||
% str(rating_key))
|
||||
notify = False
|
||||
|
||||
if notify:
|
||||
@@ -519,3 +542,11 @@ def on_created(rating_key, **kwargs):
|
||||
|
||||
else:
|
||||
logger.error(u"Tautulli TimelineHandler :: Unable to retrieve metadata for rating_key %s" % str(rating_key))
|
||||
|
||||
|
||||
def delete_metadata_cache(session_key):
|
||||
try:
|
||||
os.remove(os.path.join(plexpy.CONFIG.CACHE_DIR, 'metadata-sessionKey-%s.json' % session_key))
|
||||
except IOError as e:
|
||||
logger.error(u"Tautulli ActivityHandler :: Failed to remove metadata cache file (sessionKey %s): %s"
|
||||
% (session_key, e))
|
||||
|
@@ -61,12 +61,12 @@ def check_active_sessions(ws_request=False):
|
||||
if session['state'] == 'paused':
|
||||
logger.debug(u"Tautulli Monitor :: Session %s paused." % stream['session_key'])
|
||||
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream, 'notify_action': 'on_pause'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream.copy(), 'notify_action': 'on_pause'})
|
||||
|
||||
if session['state'] == 'playing' and stream['state'] == 'paused':
|
||||
logger.debug(u"Tautulli Monitor :: Session %s resumed." % stream['session_key'])
|
||||
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream, 'notify_action': 'on_resume'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream.copy(), 'notify_action': 'on_resume'})
|
||||
|
||||
if stream['state'] == 'paused' and not ws_request:
|
||||
# The stream is still paused so we need to increment the paused_counter
|
||||
@@ -104,7 +104,7 @@ def check_active_sessions(ws_request=False):
|
||||
'WHERE session_key = ? AND rating_key = ?',
|
||||
[stream['session_key'], stream['rating_key']])
|
||||
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream, 'notify_action': 'on_buffer'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream.copy(), 'notify_action': 'on_buffer'})
|
||||
|
||||
else:
|
||||
# Subsequent buffer notifications after wait time
|
||||
@@ -118,7 +118,7 @@ def check_active_sessions(ws_request=False):
|
||||
'WHERE session_key = ? AND rating_key = ?',
|
||||
[stream['session_key'], stream['rating_key']])
|
||||
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream, 'notify_action': 'on_buffer'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream.copy(), 'notify_action': 'on_buffer'})
|
||||
|
||||
logger.debug(u"Tautulli Monitor :: Session %s is buffering. Count is now %s. Last triggered %s."
|
||||
% (stream['session_key'],
|
||||
@@ -135,7 +135,7 @@ def check_active_sessions(ws_request=False):
|
||||
session['media_type'] == 'episode' and progress_percent >= plexpy.CONFIG.TV_WATCHED_PERCENT or
|
||||
session['media_type'] == 'track' and progress_percent >= plexpy.CONFIG.MUSIC_WATCHED_PERCENT) \
|
||||
and not any(d['notify_action'] == 'on_watched' for d in notify_states):
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream, 'notify_action': 'on_watched'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream.copy(), 'notify_action': 'on_watched'})
|
||||
|
||||
else:
|
||||
# The user has stopped playing a stream
|
||||
@@ -155,19 +155,18 @@ def check_active_sessions(ws_request=False):
|
||||
stream['media_type'] == 'episode' and progress_percent >= plexpy.CONFIG.TV_WATCHED_PERCENT or
|
||||
stream['media_type'] == 'track' and progress_percent >= plexpy.CONFIG.MUSIC_WATCHED_PERCENT) \
|
||||
and not any(d['notify_action'] == 'on_watched' for d in notify_states):
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream, 'notify_action': 'on_watched'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream.copy(), 'notify_action': 'on_watched'})
|
||||
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream, 'notify_action': 'on_stop'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': stream.copy(), 'notify_action': 'on_stop'})
|
||||
|
||||
# Write the item history on playback stop
|
||||
success = monitor_process.write_session_history(session=stream)
|
||||
row_id = monitor_process.write_session_history(session=stream)
|
||||
|
||||
if success:
|
||||
if row_id:
|
||||
# If session is written to the databaase successfully, remove the session from the session table
|
||||
logger.debug(u"Tautulli Monitor :: Removing sessionKey %s ratingKey %s from session queue"
|
||||
% (stream['session_key'], stream['rating_key']))
|
||||
monitor_db.action('DELETE FROM sessions WHERE session_key = ? AND rating_key = ?',
|
||||
[stream['session_key'], stream['rating_key']])
|
||||
monitor_process.delete_session(row_id=row_id)
|
||||
else:
|
||||
stream['write_attempts'] += 1
|
||||
|
||||
@@ -175,18 +174,14 @@ def check_active_sessions(ws_request=False):
|
||||
logger.warn(u"Tautulli Monitor :: Failed to write sessionKey %s ratingKey %s to the database. " \
|
||||
"Will try again on the next pass. Write attempt %s."
|
||||
% (stream['session_key'], stream['rating_key'], str(stream['write_attempts'])))
|
||||
monitor_db.action('UPDATE sessions SET write_attempts = ? '
|
||||
'WHERE session_key = ? AND rating_key = ?',
|
||||
[stream['write_attempts'], stream['session_key'], stream['rating_key']])
|
||||
monitor_process.increment_write_attempts(session_key=stream['session_key'])
|
||||
else:
|
||||
logger.warn(u"Tautulli Monitor :: Failed to write sessionKey %s ratingKey %s to the database. " \
|
||||
"Removing session from the database. Write attempt %s."
|
||||
% (stream['session_key'], stream['rating_key'], str(stream['write_attempts'])))
|
||||
logger.debug(u"Tautulli Monitor :: Removing sessionKey %s ratingKey %s from session queue"
|
||||
% (stream['session_key'], stream['rating_key']))
|
||||
monitor_db.action('DELETE FROM sessions WHERE session_key = ? AND rating_key = ?',
|
||||
[stream['session_key'], stream['rating_key']])
|
||||
|
||||
monitor_process.delete_session(session_key=stream['session_key'])
|
||||
|
||||
# Process the newly received session data
|
||||
for session in media_container:
|
||||
@@ -248,7 +243,7 @@ def check_recently_added():
|
||||
if 0 < time_threshold - int(item['added_at']) <= time_interval:
|
||||
logger.debug(u"Tautulli Monitor :: Library item %s added to Plex." % str(item['rating_key']))
|
||||
|
||||
plexpy.NOTIFY_QUEUE.put({'timeline_data': item, 'notify_action': 'on_created'})
|
||||
plexpy.NOTIFY_QUEUE.put({'timeline_data': item.copy(), 'notify_action': 'on_created'})
|
||||
|
||||
else:
|
||||
item = max(metadata, key=lambda x:x['added_at'])
|
||||
@@ -266,7 +261,7 @@ def check_recently_added():
|
||||
logger.debug(u"Tautulli Monitor :: Library item %s added to Plex." % str(item['rating_key']))
|
||||
|
||||
# Check if any notification agents have notifications enabled
|
||||
plexpy.NOTIFY_QUEUE.put({'timeline_data': item, 'notify_action': 'on_created'})
|
||||
plexpy.NOTIFY_QUEUE.put({'timeline_data': item.copy(), 'notify_action': 'on_created'})
|
||||
|
||||
|
||||
def check_server_response():
|
||||
|
@@ -58,7 +58,7 @@ class ActivityProcessor(object):
|
||||
'grandparent_thumb': session.get('grandparent_thumb', ''),
|
||||
'year': session.get('year', ''),
|
||||
'friendly_name': session.get('friendly_name', ''),
|
||||
#'ip_address': session.get('ip_address', ''),
|
||||
'ip_address': session.get('ip_address', ''),
|
||||
'player': session.get('player', ''),
|
||||
'platform': session.get('platform', ''),
|
||||
'parent_rating_key': session.get('parent_rating_key', ''),
|
||||
@@ -90,6 +90,8 @@ class ActivityProcessor(object):
|
||||
'transcode_audio_channels': session.get('transcode_audio_channels', ''),
|
||||
'transcode_width': session.get('stream_video_width', ''),
|
||||
'transcode_height': session.get('stream_video_height', ''),
|
||||
'transcode_hw_decoding': session.get('transcode_hw_decoding', ''),
|
||||
'transcode_hw_encoding': session.get('transcode_hw_encoding', ''),
|
||||
'synced_version': session.get('synced_version', ''),
|
||||
'synced_version_profile': session.get('synced_version_profile', ''),
|
||||
'optimized_version': session.get('optimized_version', ''),
|
||||
@@ -117,10 +119,6 @@ class ActivityProcessor(object):
|
||||
'stopped': int(time.time())
|
||||
}
|
||||
|
||||
# Add ip_address back into values
|
||||
if session['ip_address']:
|
||||
values.update({'ip_address': session.get('ip_address', 'N/A')})
|
||||
|
||||
keys = {'session_key': session.get('session_key', ''),
|
||||
'rating_key': session.get('rating_key', '')}
|
||||
|
||||
@@ -129,8 +127,7 @@ class ActivityProcessor(object):
|
||||
if result == 'insert':
|
||||
# Check if any notification agents have notifications enabled
|
||||
if notify:
|
||||
values.update({'ip_address': session.get('ip_address', 'N/A')})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': values, 'notify_action': 'on_play'})
|
||||
plexpy.NOTIFY_QUEUE.put({'stream_data': values.copy(), 'notify_action': 'on_play'})
|
||||
|
||||
# If it's our first write then time stamp it.
|
||||
started = int(time.time())
|
||||
@@ -158,7 +155,12 @@ class ActivityProcessor(object):
|
||||
|
||||
# Reload json from raw stream info
|
||||
if session.get('raw_stream_info'):
|
||||
session.update(json.loads(session['raw_stream_info']))
|
||||
raw_stream_info = json.loads(session['raw_stream_info'])
|
||||
# Don't overwrite id, session_key, stopped
|
||||
raw_stream_info.pop('id', None)
|
||||
raw_stream_info.pop('session_key', None)
|
||||
raw_stream_info.pop('stopped', None)
|
||||
session.update(raw_stream_info)
|
||||
|
||||
session = defaultdict(str, session)
|
||||
|
||||
@@ -180,6 +182,7 @@ class ActivityProcessor(object):
|
||||
else:
|
||||
logger.debug(u"Tautulli ActivityProcessor :: ratingKey %s not logged. Does not meet logging criteria. "
|
||||
u"Media type is '%s'" % (session['rating_key'], session['media_type']))
|
||||
return session['id']
|
||||
|
||||
if str(session['paused_counter']).isdigit():
|
||||
real_play_time = stopped - session['started'] - int(session['paused_counter'])
|
||||
@@ -232,7 +235,8 @@ class ActivityProcessor(object):
|
||||
## TODO: Fix media info from imports. Temporary media info from import session.
|
||||
media_info = session
|
||||
|
||||
# logger.debug(u"Tautulli ActivityProcessor :: Attempting to write to session_history table...")
|
||||
# logger.debug(u"Tautulli ActivityProcessor :: Attempting to write sessionKey %s to session_history table..."
|
||||
# % session['session_key'])
|
||||
keys = {'id': None}
|
||||
values = {'started': session['started'],
|
||||
'stopped': stopped,
|
||||
@@ -257,7 +261,8 @@ class ActivityProcessor(object):
|
||||
'view_offset': session['view_offset']
|
||||
}
|
||||
|
||||
# logger.debug(u"Tautulli ActivityProcessor :: Writing session_history transaction...")
|
||||
# logger.debug(u"Tautulli ActivityProcessor :: Writing sessionKey %s session_history transaction..."
|
||||
# % session['session_key'])
|
||||
self.db.upsert(table_name='session_history', key_dict=keys, value_dict=values)
|
||||
|
||||
# Check if we should group the session, select the last two rows from the user
|
||||
@@ -287,7 +292,7 @@ class ActivityProcessor(object):
|
||||
|
||||
query = 'UPDATE session_history SET reference_id = ? WHERE id = ? '
|
||||
# If rating_key is the same in the previous session, then set the reference_id to the previous row, else set the reference_id to the new id
|
||||
if prev_session == new_session == None:
|
||||
if prev_session is None and new_session is None:
|
||||
args = [last_id, last_id]
|
||||
elif prev_session['rating_key'] == new_session['rating_key'] and prev_session['view_offset'] <= new_session['view_offset']:
|
||||
args = [prev_session['reference_id'], new_session['id']]
|
||||
@@ -301,7 +306,8 @@ class ActivityProcessor(object):
|
||||
|
||||
# Write the session_history_media_info table
|
||||
|
||||
# logger.debug(u"Tautulli ActivityProcessor :: Attempting to write to session_history_media_info table...")
|
||||
# logger.debug(u"Tautulli ActivityProcessor :: Attempting to write to sessionKey %s session_history_media_info table..."
|
||||
# % session['session_key'])
|
||||
keys = {'id': last_id}
|
||||
values = {'rating_key': session['rating_key'],
|
||||
'video_decision': session['video_decision'],
|
||||
@@ -324,6 +330,7 @@ class ActivityProcessor(object):
|
||||
'audio_codec': session['audio_codec'],
|
||||
'audio_bitrate': session['audio_bitrate'],
|
||||
'audio_channels': session['audio_channels'],
|
||||
'subtitle_codec': session['subtitle_codec'],
|
||||
'transcode_protocol': session['transcode_protocol'],
|
||||
'transcode_container': session['transcode_container'],
|
||||
'transcode_video_codec': session['transcode_video_codec'],
|
||||
@@ -333,9 +340,11 @@ class ActivityProcessor(object):
|
||||
'transcode_height': session['transcode_height'],
|
||||
'transcode_hw_requested': session['transcode_hw_requested'],
|
||||
'transcode_hw_full_pipeline': session['transcode_hw_full_pipeline'],
|
||||
'transcode_hw_decoding': session['transcode_hw_decoding'],
|
||||
'transcode_hw_decode': session['transcode_hw_decode'],
|
||||
'transcode_hw_encode': session['transcode_hw_encode'],
|
||||
'transcode_hw_decode_title': session['transcode_hw_decode_title'],
|
||||
'transcode_hw_encoding': session['transcode_hw_encoding'],
|
||||
'transcode_hw_encode': session['transcode_hw_encode'],
|
||||
'transcode_hw_encode_title': session['transcode_hw_encode_title'],
|
||||
'stream_container': session['stream_container'],
|
||||
'stream_container_decision': session['stream_container_decision'],
|
||||
@@ -365,7 +374,8 @@ class ActivityProcessor(object):
|
||||
'optimized_version_title': session['optimized_version_title']
|
||||
}
|
||||
|
||||
# logger.debug(u"Tautulli ActivityProcessor :: Writing session_history_media_info transaction...")
|
||||
# logger.debug(u"Tautulli ActivityProcessor :: Writing sessionKey %s session_history_media_info transaction..."
|
||||
# % session['session_key'])
|
||||
self.db.upsert(table_name='session_history_media_info', key_dict=keys, value_dict=values)
|
||||
|
||||
# Write the session_history_metadata table
|
||||
@@ -375,7 +385,8 @@ class ActivityProcessor(object):
|
||||
genres = ";".join(metadata['genres'])
|
||||
labels = ";".join(metadata['labels'])
|
||||
|
||||
# logger.debug(u"Tautulli ActivityProcessor :: Attempting to write to session_history_metadata table...")
|
||||
# logger.debug(u"Tautulli ActivityProcessor :: Attempting to write to sessionKey %s session_history_metadata table..."
|
||||
# % session['session_key'])
|
||||
keys = {'id': last_id}
|
||||
values = {'rating_key': session['rating_key'],
|
||||
'parent_rating_key': session['parent_rating_key'],
|
||||
@@ -411,11 +422,12 @@ class ActivityProcessor(object):
|
||||
'labels': labels
|
||||
}
|
||||
|
||||
# logger.debug(u"Tautulli ActivityProcessor :: Writing session_history_metadata transaction...")
|
||||
# logger.debug(u"Tautulli ActivityProcessor :: Writing sessionKey %s session_history_metadata transaction..."
|
||||
# % session['session_key'])
|
||||
self.db.upsert(table_name='session_history_metadata', key_dict=keys, value_dict=values)
|
||||
|
||||
# Return true when the session is successfully written to the database
|
||||
return True
|
||||
# Return the session row id when the session is successfully written to the database
|
||||
return session['id']
|
||||
|
||||
def get_sessions(self, user_id=None, ip_address=None):
|
||||
query = 'SELECT * FROM sessions'
|
||||
@@ -456,11 +468,13 @@ class ActivityProcessor(object):
|
||||
|
||||
return None
|
||||
|
||||
def delete_session(self, session_key=None):
|
||||
def delete_session(self, session_key=None, row_id=None):
|
||||
if str(session_key).isdigit():
|
||||
self.db.action('DELETE FROM sessions WHERE session_key = ?', [session_key])
|
||||
elif str(row_id).isdigit():
|
||||
self.db.action('DELETE FROM sessions WHERE id = ?', [row_id])
|
||||
|
||||
def set_session_last_paused(self, session_key=None, timestamp=None ):
|
||||
def set_session_last_paused(self, session_key=None, timestamp=None):
|
||||
if str(session_key).isdigit():
|
||||
result = self.db.select('SELECT last_paused, paused_counter '
|
||||
'FROM sessions '
|
||||
@@ -469,7 +483,7 @@ class ActivityProcessor(object):
|
||||
paused_counter = None
|
||||
for session in result:
|
||||
if session['last_paused']:
|
||||
paused_offset = timestamp - int(session['last_paused'])
|
||||
paused_offset = int(time.time()) - int(session['last_paused'])
|
||||
if session['paused_counter']:
|
||||
paused_counter = int(session['paused_counter']) + int(paused_offset)
|
||||
else:
|
||||
|
@@ -35,6 +35,8 @@ import database
import libraries
import logger
import mobile_app
import notification_handler
import notifiers
import users


@@ -397,6 +399,50 @@ class API2:

        return

    def notify(self, notifier_id='', subject='Tautulli', body='Test notification', **kwargs):
        """ Send a notification using Tautulli.

            ```
            Required parameters:
                notifier_id (int): The ID number of the notification agent
                subject (str): The subject of the message
                body (str): The body of the message

            Optional parameters:
                None

            Returns:
                None
            ```
        """
        if not notifier_id:
            self._api_msg = 'Notification failed: no notifier id provided.'
            self._api_result_type = 'error'
            return

        notifier = notifiers.get_notifier_config(notifier_id=notifier_id)

        if not notifier:
            self._api_msg = 'Notification failed: invalid notifier_id provided %s.' % notifier_id
            self._api_result_type = 'error'
            return

        logger.api_debug(u'Tautulli APIv2 :: Sending notification.')
        success = notification_handler.notify(notifier_id=notifier_id,
                                               notify_action='api',
                                               subject=subject,
                                               body=body,
                                               **kwargs)

        if success:
            self._api_msg = 'Notification sent.'
            self._api_result_type = 'success'
        else:
            self._api_msg = 'Notification failed.'
            self._api_result_type = 'error'

        return

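Because `notify()` is exposed through APIv2, the new notification command can be driven over HTTP once an API key is configured. A hedged sketch of such a call, assuming the usual Tautulli `/api/v2?cmd=...` calling convention; the URL, API key, and notifier ID are placeholders:

```
import requests

# Placeholder values -- substitute your own Tautulli address and API key.
TAUTULLI_URL = 'http://localhost:8181/api/v2'
API_KEY = 'your_api_key'

payload = {
    'apikey': API_KEY,
    'cmd': 'notify',            # maps to API2.notify()
    'notifier_id': 1,           # ID of a configured notification agent
    'subject': 'Tautulli',
    'body': 'Test notification sent through the API',
}

r = requests.get(TAUTULLI_URL, params=payload)
# The response message should report e.g. 'Notification sent.' or 'Notification failed.'
print(r.json()['response']['message'])
```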
    def _api_make_md(self):
        """ Tries to make a API.md to simplify the api docs. """

@@ -581,8 +627,8 @@ General optional parameters:
            if isinstance(result, (dict, list)):
                ret = result
            else:
-               raise
-       except:
+               raise Exception
+       except Exception:
            try:
                ret = json.loads(result)
            except (ValueError, TypeError):
175  plexpy/common.py
@@ -32,17 +32,30 @@ DEFAULT_POSTER_THUMB = "interfaces/default/images/poster.png"
|
||||
DEFAULT_COVER_THUMB = "interfaces/default/images/cover.png"
|
||||
DEFAULT_ART = "interfaces/default/images/art.png"
|
||||
|
||||
PLATFORM_NAME_OVERRIDES = {'Konvergo': 'Plex Media Player',
|
||||
MEDIA_TYPE_HEADERS = {
|
||||
'movie': 'Movies',
|
||||
'show': 'TV Shows',
|
||||
'season': 'Seasons',
|
||||
'episode': 'Episodes',
|
||||
'artist': 'Artists',
|
||||
'album': 'Albums',
|
||||
'track': 'Tracks',
|
||||
}
|
||||
|
||||
PLATFORM_NAME_OVERRIDES = {
|
||||
'Konvergo': 'Plex Media Player',
|
||||
'Mystery 3': 'Playstation 3',
|
||||
'Mystery 4': 'Playstation 4',
|
||||
'Mystery 5': 'Xbox 360',
|
||||
'WebMAF': 'Playstation 4'
|
||||
}
|
||||
}
|
||||
|
||||
PMS_PLATFORM_NAME_OVERRIDES = {'MacOSX': 'Mac'
|
||||
}
|
||||
PMS_PLATFORM_NAME_OVERRIDES = {
|
||||
'MacOSX': 'Mac'
|
||||
}
|
||||
|
||||
PLATFORM_NAMES = {'android': 'android',
|
||||
PLATFORM_NAMES = {
|
||||
'android': 'android',
|
||||
'apple tv': 'atv',
|
||||
'chrome': 'chrome',
|
||||
'chromecast': 'chromecast',
|
||||
@@ -76,41 +89,48 @@ PLATFORM_NAMES = {'android': 'android',
|
||||
'windows phone': 'wp',
|
||||
'xbmc': 'xbmc',
|
||||
'xbox': 'xbox'
|
||||
}
|
||||
}
|
||||
PLATFORM_NAMES = OrderedDict(sorted(PLATFORM_NAMES.items(), key=lambda k: k[0], reverse=True))
|
||||
|
||||
MEDIA_FLAGS_AUDIO = {'ac.?3': 'dolby_digital',
|
||||
MEDIA_FLAGS_AUDIO = {
|
||||
'ac.?3': 'dolby_digital',
|
||||
'truehd': 'dolby_truehd',
|
||||
'(dca|dta)': 'dts',
|
||||
'dts(hd_|-hd|-)?ma': 'dca-ma',
|
||||
'vorbis': 'ogg'
|
||||
}
|
||||
MEDIA_FLAGS_VIDEO = {'avc1': 'h264',
|
||||
}
|
||||
MEDIA_FLAGS_VIDEO = {
|
||||
'avc1': 'h264',
|
||||
'wmv(1|2)': 'wmv',
|
||||
'wmv3': 'wmvhd'
|
||||
}
|
||||
}
|
||||
|
||||
AUDIO_CODEC_OVERRIDES = {'truehd': 'TrueHD'}
|
||||
AUDIO_CODEC_OVERRIDES = {
|
||||
'truehd': 'TrueHD'
|
||||
}
|
||||
|
||||
VIDEO_RESOLUTION_OVERRIDES = {'sd': 'SD',
|
||||
VIDEO_RESOLUTION_OVERRIDES = {
|
||||
'sd': 'SD',
|
||||
'480': '480p',
|
||||
'540': '540p',
|
||||
'576': '576p',
|
||||
'720': '720p',
|
||||
'1080': '1080p',
|
||||
'4k': '4k'
|
||||
}
|
||||
}
|
||||
|
||||
AUDIO_CHANNELS = {'1': 'Mono',
|
||||
AUDIO_CHANNELS = {
|
||||
'1': 'Mono',
|
||||
'2': 'Stereo',
|
||||
'3': '2.1',
|
||||
'4': '3.1',
|
||||
'6': '5.1',
|
||||
'7': '6.1',
|
||||
'8': '7.1'
|
||||
}
|
||||
}
|
||||
|
||||
VIDEO_QUALITY_PROFILES = {20000: '20 Mbps 1080p',
|
||||
VIDEO_QUALITY_PROFILES = {
|
||||
20000: '20 Mbps 1080p',
|
||||
12000: '12 Mbps 1080p',
|
||||
10000: '10 Mbps 1080p',
|
||||
8000: '8 Mbps 1080p',
|
||||
@@ -123,39 +143,65 @@ VIDEO_QUALITY_PROFILES = {20000: '20 Mbps 1080p',
|
||||
208: '0.2 Mbps 160p',
|
||||
96: '0.096 Mbps',
|
||||
64: '0.064 Mbps'
|
||||
}
|
||||
}
|
||||
VIDEO_QUALITY_PROFILES = OrderedDict(sorted(VIDEO_QUALITY_PROFILES.items(), key=lambda k: k[0], reverse=True))
|
||||
|
||||
AUDIO_QUALITY_PROFILES = {512: '512 kbps',
|
||||
AUDIO_QUALITY_PROFILES = {
|
||||
512: '512 kbps',
|
||||
320: '320 kbps',
|
||||
256: '256 kbps',
|
||||
192: '192 kbps',
|
||||
128: '128 kbps',
|
||||
96: '96 kbps'
|
||||
}
|
||||
}
|
||||
AUDIO_QUALITY_PROFILES = OrderedDict(sorted(AUDIO_QUALITY_PROFILES.items(), key=lambda k: k[0], reverse=True))
|
||||
|
||||
SCHEDULER_LIST = ['Check GitHub for updates',
|
||||
HW_DECODERS = [
|
||||
'dxva2',
|
||||
'videotoolbox',
|
||||
'mediacodecndk',
|
||||
'vaapi'
|
||||
]
|
||||
HW_ENCODERS = [
|
||||
'qsv',
|
||||
'nvenc',
|
||||
'mf',
|
||||
'videotoolbox',
|
||||
'mediacodecndk',
|
||||
'vaapi',
|
||||
'nvenc'
|
||||
]
|
||||
|
||||
SCHEDULER_LIST = [
|
||||
'Check GitHub for updates',
|
||||
'Check for server response',
|
||||
'Check for active sessions',
|
||||
'Check for recently added items',
|
||||
'Check for Plex updates',
|
||||
'Check for Plex remote access',
|
||||
'Check server response',
|
||||
'Refresh users list',
|
||||
'Refresh libraries list',
|
||||
'Refresh Plex server URLs',
|
||||
'Backup Tautulli database',
|
||||
'Backup Tautulli config'
|
||||
]
|
||||
]
|
||||
|
||||
DATE_TIME_FORMATS = [
|
||||
{
|
||||
'category': 'Year',
|
||||
'parameters': [
|
||||
{'value': 'YYYY', 'description': 'Numeric, four digits', 'example': '1999, 2003'},
|
||||
{'value': 'YY', 'description': 'Numeric, two digits', 'example': '99, 03'}
|
||||
]
|
||||
},
|
||||
{
|
||||
'category': 'Month',
|
||||
'parameters': [
|
||||
{'value': 'MMMM', 'description': 'Textual, full', 'example': 'January-December'},
|
||||
{'value': 'MMM', 'description': 'Textual, three letters', 'example': 'Jan-Dec'},
|
||||
{'value': 'MM', 'description': 'Numeric, with leading zeros', 'example': '42747'},
|
||||
{'value': 'M', 'description': 'Numeric, without leading zeros', 'example': '42747'},
|
||||
{'value': 'MM', 'description': 'Numeric, with leading zeros', 'example': '01-12'},
|
||||
{'value': 'M', 'description': 'Numeric, without leading zeros', 'example': '1-12'},
|
||||
{'value': 'Mo', 'description': 'Numeric, with suffix', 'example': '1st, 2nd ... 12th'},
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -163,14 +209,15 @@ DATE_TIME_FORMATS = [
|
||||
'parameters': [
|
||||
{'value': 'DDDD', 'description': 'Numeric, with leading zeros', 'example': '001-365'},
|
||||
{'value': 'DDD', 'description': 'Numeric, without leading zeros', 'example': '1-365'},
|
||||
{'value': 'DDDo', 'description': 'Numeric, with suffix', 'example': '1st, 2nd, ... 365th'},
|
||||
]
|
||||
},
|
||||
{
|
||||
'category': 'Day of the Month',
|
||||
'parameters': [
|
||||
{'value': 'DD', 'description': 'Numeric, with leading zeros', 'example': '42766'},
|
||||
{'value': 'D', 'description': 'Numeric, without leading zeros', 'example': '42766'},
|
||||
{'value': 'Do', 'description': 'Numeric, with suffix', 'example': 'E.g. 1st, 2nd ... 31st.'},
|
||||
{'value': 'DD', 'description': 'Numeric, with leading zeros', 'example': '01-31'},
|
||||
{'value': 'D', 'description': 'Numeric, without leading zeros', 'example': '1-31'},
|
||||
{'value': 'Do', 'description': 'Numeric, with suffix', 'example': '1st, 2nd ... 31st'},
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -178,7 +225,9 @@ DATE_TIME_FORMATS = [
|
||||
'parameters': [
|
||||
{'value': 'dddd', 'description': 'Textual, full', 'example': 'Sunday-Saturday'},
|
||||
{'value': 'ddd', 'description': 'Textual, three letters', 'example': 'Sun-Sat'},
|
||||
{'value': 'dd', 'description': 'Textual, two letters', 'example': 'Su-Sa'},
|
||||
{'value': 'd', 'description': 'Numeric', 'example': '0-6'},
|
||||
{'value': 'do', 'description': 'Numeric, with suffix', 'example': '0th, 1st ... 6th'},
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -186,8 +235,8 @@ DATE_TIME_FORMATS = [
|
||||
'parameters': [
|
||||
{'value': 'HH', 'description': '24-hour, with leading zeros', 'example': '00-23'},
|
||||
{'value': 'H', 'description': '24-hour, without leading zeros', 'example': '0-23'},
|
||||
{'value': 'hh', 'description': '12-hour, with leading zeros', 'example': '42747'},
|
||||
{'value': 'h', 'description': '12-hour, without leading zeros', 'example': '42747'},
|
||||
{'value': 'hh', 'description': '12-hour, with leading zeros', 'example': '01-12'},
|
||||
{'value': 'h', 'description': '12-hour, without leading zeros', 'example': '1-12'},
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -214,8 +263,8 @@ DATE_TIME_FORMATS = [
|
||||
{
|
||||
'category': 'Timezone',
|
||||
'parameters': [
|
||||
{'value': 'ZZ', 'description': 'UTC offset', 'example': 'E.g. +0100, -0700'},
|
||||
{'value': 'Z', 'description': 'UTC offset', 'example': 'E.g. +01:00, -07:00'},
|
||||
{'value': 'ZZ', 'description': 'UTC offset', 'example': '+0100, -0700'},
|
||||
{'value': 'Z', 'description': 'UTC offset', 'example': '+01:00, -07:00'},
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -224,21 +273,27 @@ DATE_TIME_FORMATS = [
|
||||
{'value': 'X', 'description': 'Unix timestamp', 'example': 'E.g. 1456887825'},
|
||||
]
|
||||
},
|
||||
]
|
||||
]
|
||||
|
||||
NOTIFICATION_PARAMETERS = [
|
||||
{
|
||||
'category': 'Global',
|
||||
'parameters': [
|
||||
{'name': 'Tautulli Version', 'type': 'str', 'value': 'plexpy_version', 'description': 'The current version of Tautulli.'},
|
||||
{'name': 'Tautulli Branch', 'type': 'str', 'value': 'plexpy_branch', 'description': 'The current git branch of Tautulli.'},
|
||||
{'name': 'Tautulli Commit', 'type': 'str', 'value': 'plexpy_commit', 'description': 'The current git commit hash of Tautulli.'},
|
||||
{'name': 'Tautulli Version', 'type': 'str', 'value': 'tautulli_version', 'description': 'The current version of Tautulli.'},
|
||||
{'name': 'Tautulli Remote', 'type': 'str', 'value': 'tautulli_remote', 'description': 'The current git remote of Tautulli.'},
|
||||
{'name': 'Tautulli Branch', 'type': 'str', 'value': 'tautulli_branch', 'description': 'The current git branch of Tautulli.'},
|
||||
{'name': 'Tautulli Commit', 'type': 'str', 'value': 'tautulli_commit', 'description': 'The current git commit hash of Tautulli.'},
|
||||
{'name': 'Server Name', 'type': 'str', 'value': 'server_name', 'description': 'The name of your Plex Server.'},
|
||||
{'name': 'Server Uptime', 'type': 'str', 'value': 'server_uptime', 'description': 'The uptime (in days, hours, mins, secs) of your Plex Server.'},
|
||||
{'name': 'Server IP', 'type': 'str', 'value': 'server_ip', 'description': 'The connection IP address for your Plex Server.'},
|
||||
{'name': 'Server Port', 'type': 'int', 'value': 'server_port', 'description': 'The connection port for your Plex Server.'},
|
||||
{'name': 'Server URL', 'type': 'str', 'value': 'server_url', 'description': 'The connection URL for your Plex Server.'},
|
||||
{'name': 'Server Platform', 'type': 'str', 'value': 'server_platform', 'description': 'The platform of your Plex Server.'},
|
||||
{'name': 'Server Version', 'type': 'str', 'value': 'server_version', 'description': 'The current version of your Plex Server.'},
|
||||
{'name': 'Server ID', 'type': 'str', 'value': 'server_machine_id', 'description': 'The unique identifier for your Plex Server.'},
|
||||
{'name': 'Action', 'type': 'str', 'value': 'action', 'description': 'The action that triggered the notification.'},
|
||||
{'name': 'Datestamp', 'type': 'int', 'value': 'datestamp', 'description': 'The date (in date format) the notification was triggered.'},
|
||||
{'name': 'Timestamp', 'type': 'int', 'value': 'timestamp', 'description': 'The time (in time format) the notification was triggered.'},
|
||||
{'name': 'Datestamp', 'type': 'str', 'value': 'datestamp', 'description': 'The date (in date format) when the notification was triggered.'},
|
||||
{'name': 'Timestamp', 'type': 'str', 'value': 'timestamp', 'description': 'The time (in time format) when the notification was triggered.'},
|
||||
{'name': 'Unix Time', 'type': 'int', 'value': 'unixtime', 'description': 'The unix timestamp when the notification was triggered.'},
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -306,7 +361,13 @@ NOTIFICATION_PARAMETERS = [
|
||||
{'name': 'Transcode Video Height', 'type': 'int', 'value': 'transcode_video_height', 'description': 'The video height of the transcoded stream.'},
|
||||
{'name': 'Transcode Audio Codec', 'type': 'str', 'value': 'transcode_audio_codec', 'description': 'The audio codec of the transcoded stream.'},
|
||||
{'name': 'Transcode Audio Channels', 'type': 'float', 'value': 'transcode_audio_channels', 'description': 'The audio channels of the transcoded stream.'},
|
||||
{'name': 'Transcode Hardware', 'type': 'int', 'value': 'transcode_hardware', 'description': 'If hardware transcoding is used.', 'example': '0 or 1'},
|
||||
{'name': 'Transcode HW Requested', 'type': 'int', 'value': 'transcode_hw_requested', 'description': 'If hardware decoding/encoding was requested.', 'example': '0 or 1'},
|
||||
{'name': 'Transcode HW Decoding', 'type': 'int', 'value': 'transcode_hw_decoding', 'description': 'If hardware decoding is used.', 'example': '0 or 1'},
|
||||
{'name': 'Transcode HW Decoding Codec', 'type': 'str', 'value': 'transcode_hw_decode', 'description': 'The hardware decoding codec.'},
|
||||
{'name': 'Transcode HW Decoding Title', 'type': 'str', 'value': 'transcode_hw_decode_title', 'description': 'The hardware decoding codec title.'},
|
||||
{'name': 'Transcode HW Encoding', 'type': 'int', 'value': 'transcode_hw_encoding', 'description': 'If hardware encoding is used.', 'example': '0 or 1'},
|
||||
{'name': 'Transcode HW Encoding Codec', 'type': 'str', 'value': 'transcode_hw_encode', 'description': 'The hardware encoding codec.'},
|
||||
{'name': 'Transcode HW Encoding Title', 'type': 'str', 'value': 'transcode_hw_encode_title', 'description': 'The hardware encoding codec title.'},
|
||||
{'name': 'Session Key', 'type': 'str', 'value': 'session_key', 'description': 'The unique identifier for the session.'},
|
||||
{'name': 'Transcode Key', 'type': 'str', 'value': 'transcode_key', 'description': 'The unique identifier for the transcode session.'},
|
||||
{'name': 'Session ID', 'type': 'str', 'value': 'session_id', 'description': 'The unique identifier for the stream.'},
|
||||
@@ -333,20 +394,22 @@ NOTIFICATION_PARAMETERS = [
|
||||
{'name': 'Track Number 00', 'type': 'int', 'value': 'track_num00', 'description': 'The two digit track number.', 'example': 'e.g. 04, or 04-10'},
|
||||
{'name': 'Year', 'type': 'int', 'value': 'year', 'description': 'The release year for the item.'},
|
||||
{'name': 'Release Date', 'type': 'int', 'value': 'release_date', 'description': 'The release date (in date format) for the item.'},
|
||||
{'name': 'Air Date', 'type': 'int', 'value': 'air_date', 'description': 'The air date (in date format) for the item.'},
|
||||
{'name': 'Added Date', 'type': 'int', 'value': 'added_date', 'description': 'The date (in date format) the item was added to Plex.'},
|
||||
{'name': 'Updated Date', 'type': 'int', 'value': 'updated_date', 'description': 'The date (in date format) the item was updated on Plex.'},
|
||||
{'name': 'Last Viewed Date', 'type': 'int', 'value': 'last_viewed_date', 'description': 'The date (in date format) the item was last viewed on Plex.'},
|
||||
{'name': 'Air Date', 'type': 'str', 'value': 'air_date', 'description': 'The air date (in date format) for the item.'},
|
||||
{'name': 'Added Date', 'type': 'str', 'value': 'added_date', 'description': 'The date (in date format) the item was added to Plex.'},
|
||||
{'name': 'Updated Date', 'type': 'str', 'value': 'updated_date', 'description': 'The date (in date format) the item was updated on Plex.'},
|
||||
{'name': 'Last Viewed Date', 'type': 'str', 'value': 'last_viewed_date', 'description': 'The date (in date format) the item was last viewed on Plex.'},
|
||||
{'name': 'Studio', 'type': 'str', 'value': 'studio', 'description': 'The studio for the item.'},
|
||||
{'name': 'Content Rating', 'type': 'int', 'value': 'content_rating', 'description': 'The content rating for the item.', 'example': 'e.g. TV-MA, TV-PG, etc.'},
|
||||
{'name': 'Director', 'type': 'str', 'value': 'directors', 'description': 'A list of directors for the item.'},
|
||||
{'name': 'Writer', 'type': 'str', 'value': 'writers', 'description': 'A list of writers for the item.'},
|
||||
{'name': 'Actor', 'type': 'str', 'value': 'actors', 'description': 'A list of actors for the item.'},
|
||||
{'name': 'Genre', 'type': 'str', 'value': 'genres', 'description': 'A list of genres for the item.'},
|
||||
{'name': 'Directors', 'type': 'str', 'value': 'directors', 'description': 'A list of directors for the item.'},
|
||||
{'name': 'Writers', 'type': 'str', 'value': 'writers', 'description': 'A list of writers for the item.'},
|
||||
{'name': 'Actors', 'type': 'str', 'value': 'actors', 'description': 'A list of actors for the item.'},
|
||||
{'name': 'Genres', 'type': 'str', 'value': 'genres', 'description': 'A list of genres for the item.'},
|
||||
{'name': 'Labels', 'type': 'str', 'value': 'labels', 'description': 'A list of labels for the item.'},
|
||||
{'name': 'Collections', 'type': 'str', 'value': 'collections', 'description': 'A list of collections for the item.'},
|
||||
{'name': 'Summary', 'type': 'str', 'value': 'summary', 'description': 'A short plot summary for the item.'},
|
||||
{'name': 'Tagline', 'type': 'str', 'value': 'tagline', 'description': 'A tagline for the media item.'},
|
||||
{'name': 'Rating', 'type': 'int', 'value': 'rating', 'description': 'The rating (out of 10) for the item.'},
|
||||
{'name': 'Audience Rating', 'type': 'int', 'value': 'audience_rating', 'description': 'The audience rating (%) for the item.', 'help_text': 'Ratings source must be Rotten Tomatoes for the Plex Movie agent'},
|
||||
{'name': 'Rating', 'type': 'float', 'value': 'rating', 'description': 'The rating (out of 10) for the item.'},
|
||||
{'name': 'Audience Rating', 'type': 'float', 'value': 'audience_rating', 'description': 'The audience rating (%) for the item.', 'help_text': 'Ratings source must be Rotten Tomatoes for the Plex Movie agent'},
|
||||
{'name': 'Duration', 'type': 'int', 'value': 'duration', 'description': 'The duration (in minutes) for the item.'},
|
||||
{'name': 'Poster URL', 'type': 'str', 'value': 'poster_url', 'description': 'A URL for the movie, TV show, or album poster.'},
|
||||
{'name': 'Plex URL', 'type': 'str', 'value': 'plex_url', 'description': 'The Plex URL to your server for the item.'},
|
||||
@@ -422,12 +485,12 @@ NOTIFICATION_PARAMETERS = [
|
||||
{
|
||||
'category': 'Tautulli Update Available',
|
||||
'parameters': [
|
||||
{'name': 'Plexpy Update Version', 'type': 'int', 'value': 'plexpy_update_version', 'description': 'The available update version for Tautulli.'},
|
||||
{'name': 'Plexpy Update Tar', 'type': 'int', 'value': 'plexpy_update_tar', 'description': 'The tar download URL for the available update.'},
|
||||
{'name': 'Plexpy Update Zip', 'type': 'int', 'value': 'plexpy_update_zip', 'description': 'The zip download URL for the available update.'},
|
||||
{'name': 'Plexpy Update Commit', 'type': 'int', 'value': 'plexpy_update_commit', 'description': 'The commit hash for the available update.'},
|
||||
{'name': 'Plexpy Update Behind', 'type': 'int', 'value': 'plexpy_update_behind', 'description': 'The number of commits behind for the available update.'},
|
||||
{'name': 'Plexpy Update Changelog', 'type': 'int', 'value': 'plexpy_update_changelog', 'description': 'The changelog for the available update.'},
|
||||
{'name': 'Tautulli Update Version', 'type': 'int', 'value': 'tautulli_update_version', 'description': 'The available update version for Tautulli.'},
|
||||
{'name': 'Tautulli Update Tar', 'type': 'int', 'value': 'tautulli_update_tar', 'description': 'The tar download URL for the available update.'},
|
||||
{'name': 'Tautulli Update Zip', 'type': 'int', 'value': 'tautulli_update_zip', 'description': 'The zip download URL for the available update.'},
|
||||
{'name': 'Tautulli Update Commit', 'type': 'int', 'value': 'tautulli_update_commit', 'description': 'The commit hash for the available update.'},
|
||||
{'name': 'Tautulli Update Behind', 'type': 'int', 'value': 'tautulli_update_behind', 'description': 'The number of commits behind for the available update.'},
|
||||
{'name': 'Tautulli Update Changelog', 'type': 'int', 'value': 'tautulli_update_changelog', 'description': 'The changelog for the available update.'},
|
||||
]
|
||||
},
|
||||
]
|
||||
]
|
||||
|
@@ -61,7 +61,7 @@ _CONFIG_DEFINITIONS = {
    'PMS_PLEXPASS': (int, 'PMS', 0),
    'PMS_PLATFORM': (str, 'PMS', ''),
    'PMS_VERSION': (str, 'PMS', ''),
-   'PMS_UPDATE_CHANNEL': (str, 'PMS', 'public'),
+   'PMS_UPDATE_CHANNEL': (str, 'PMS', 'plex'),
    'PMS_UPDATE_DISTRO': (str, 'PMS', ''),
    'PMS_UPDATE_DISTRO_BUILD': (str, 'PMS', ''),
    'PMS_WEB_URL': (str, 'PMS', 'https://app.plex.tv/desktop'),

@@ -225,6 +225,7 @@ _CONFIG_DEFINITIONS = {
    'HTTP_PROXY': (int, 'General', 0),
    'HTTP_ROOT': (str, 'General', ''),
    'HTTP_USERNAME': (str, 'General', ''),
+   'HTTP_PLEX_ADMIN': (int, 'General', 0),
    'HIPCHAT_URL': (str, 'Hipchat', ''),
    'HIPCHAT_COLOR': (str, 'Hipchat', ''),
    'HIPCHAT_INCL_SUBJECT': (int, 'Hipchat', 1),

@@ -289,6 +290,7 @@ _CONFIG_DEFINITIONS = {
    'LOG_BLACKLIST': (int, 'General', 1),
    'LOG_DIR': (str, 'General', ''),
    'LOGGING_IGNORE_INTERVAL': (int, 'Monitoring', 120),
+   'METADATA_CACHE_SECONDS': (int, 'Advanced', 1800),
    'MOVIE_LOGGING_ENABLE': (int, 'Monitoring', 1),
    'MOVIE_NOTIFY_ENABLE': (int, 'Monitoring', 0),
    'MOVIE_NOTIFY_ON_START': (int, 'Monitoring', 1),

@@ -610,7 +612,8 @@ _CONFIG_DEFINITIONS = {
    'XBMC_ON_INTUP': (int, 'XBMC', 0),
    'XBMC_ON_PMSUPDATE': (int, 'XBMC', 0),
    'XBMC_ON_CONCURRENT': (int, 'XBMC', 0),
-   'XBMC_ON_NEWDEVICE': (int, 'XBMC', 0)
+   'XBMC_ON_NEWDEVICE': (int, 'XBMC', 0),
+   'JWT_SECRET': (str, 'Advanced', ''),
}

_BLACKLIST_KEYS = ['_APITOKEN', '_TOKEN', '_KEY', '_SECRET', '_PASSWORD', '_APIKEY', '_ID', '_HOOK']

@@ -873,3 +876,9 @@ class Config(object):
            self.MUSIC_WATCHED_PERCENT = self.NOTIFY_WATCHED_PERCENT

            self.CONFIG_VERSION = 9

+       if self.CONFIG_VERSION == 9:
+           if self.PMS_UPDATE_CHANNEL == 'plexpass':
+               self.PMS_UPDATE_CHANNEL = 'beta'

+           self.CONFIG_VERSION = 10
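The final hunk is a version-gated config migration: each `CONFIG_VERSION` step rewrites settings whose accepted values changed (here the old 'plexpass' update channel becomes 'beta') and then bumps the version so the step runs only once. A stripped-down sketch of the pattern; the class and the `upgrade()` method name are stand-ins, not the real `Config` implementation:

```
class Config(object):
    """Minimal stand-in that only shows the version-gated migration pattern."""

    def __init__(self, config_version, pms_update_channel):
        self.CONFIG_VERSION = config_version
        self.PMS_UPDATE_CHANNEL = pms_update_channel

    def upgrade(self):  # hypothetical name for the upgrade hook
        # Each block handles exactly one version step, then bumps CONFIG_VERSION,
        # so an old config file is walked forward one version at a time.
        if self.CONFIG_VERSION == 9:
            if self.PMS_UPDATE_CHANNEL == 'plexpass':
                self.PMS_UPDATE_CHANNEL = 'beta'

            self.CONFIG_VERSION = 10


config = Config(config_version=9, pms_update_channel='plexpass')
config.upgrade()
assert (config.CONFIG_VERSION, config.PMS_UPDATE_CHANNEL) == (10, 'beta')
```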
@@ -882,6 +882,7 @@ class DataFactory(object):
|
||||
'stream_video_framerate, ' \
|
||||
'stream_audio_decision, stream_audio_codec, stream_audio_bitrate, stream_audio_channels, ' \
|
||||
'subtitles, stream_subtitle_decision, stream_subtitle_codec, ' \
|
||||
'transcode_hw_decoding, transcode_hw_encoding, ' \
|
||||
'session_history_metadata.media_type, title, grandparent_title ' \
|
||||
'FROM session_history_media_info ' \
|
||||
'JOIN session_history ON session_history_media_info.id = session_history.id ' \
|
||||
@@ -899,6 +900,7 @@ class DataFactory(object):
|
||||
'stream_video_framerate, ' \
|
||||
'stream_audio_decision, stream_audio_codec, stream_audio_bitrate, stream_audio_channels, ' \
|
||||
'subtitles, stream_subtitle_decision, stream_subtitle_codec, ' \
|
||||
'transcode_hw_decoding, transcode_hw_encoding, ' \
|
||||
'media_type, title, grandparent_title ' \
|
||||
'FROM sessions ' \
|
||||
'WHERE session_key = ? %s' % user_cond
|
||||
@@ -945,11 +947,15 @@ class DataFactory(object):
|
||||
'subtitles': item['subtitles'],
|
||||
'stream_subtitle_decision': item['stream_subtitle_decision'],
|
||||
'stream_subtitle_codec': item['stream_subtitle_codec'],
|
||||
'transcode_hw_decoding': item['transcode_hw_decoding'],
|
||||
'transcode_hw_encoding': item['transcode_hw_encoding'],
|
||||
'media_type': item['media_type'],
|
||||
'title': item['title'],
|
||||
'grandparent_title': item['grandparent_title']
|
||||
'grandparent_title': item['grandparent_title'],
|
||||
'current_session': 1 if session_key else 0
|
||||
}
|
||||
|
||||
stream_output = {k: v or '' for k, v in stream_output.iteritems()}
|
||||
return stream_output
|
||||
|
||||
def get_metadata_details(self, rating_key):
|
||||
|
@@ -698,6 +698,10 @@ class Graphs(object):
        series_3 = []

        for item in result:
+           if item['resolution'] not in ('4k', 'unknown'):
+               item['resolution'] = item['resolution'].upper()
+               if item['resolution'].isdigit():
+                   item['resolution'] += 'p'
            categories.append(item['resolution'])
            series_1.append(item['dp_count'])
            series_2.append(item['ds_count'])
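The four added lines normalize the raw resolution strings coming out of the SQL query into display labels for the graph axis, which is the fix for stream resolutions showing as "unknown" on the graphs noted in the v2.0.16 changelog. The same normalization in isolation, with made-up sample values:

```
def format_resolution(resolution):
    # '4k' and 'unknown' pass through untouched; everything else is
    # upper-cased, and bare pixel heights get a trailing 'p'.
    if resolution not in ('4k', 'unknown'):
        resolution = resolution.upper()
        if resolution.isdigit():
            resolution += 'p'
    return resolution


samples = ['sd', '480', '720', '1080', 'QHD', '4k', 'unknown']
print([format_resolution(r) for r in samples])
# ['SD', '480p', '720p', '1080p', 'QHD', '4k', 'unknown']
```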
@@ -729,16 +733,18 @@ class Graphs(object):
|
||||
try:
|
||||
if y_axis == 'plays':
|
||||
query = 'SELECT ' \
|
||||
'(CASE WHEN session_history_media_info.stream_video_resolution IS NULL THEN ' \
|
||||
'(CASE WHEN session_history_media_info.video_decision = "transcode" THEN ' \
|
||||
'(CASE ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 360 THEN "sd" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 360 THEN "SD" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 480 THEN "480" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 576 THEN "576" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 720 THEN "720" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 1080 THEN "1080" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 1440 THEN "QHD" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 2160 THEN "4K" ' \
|
||||
'ELSE "unknown" END) ELSE session_history_media_info.video_resolution END) AS resolution, ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 2160 THEN "4k" ' \
|
||||
'ELSE "unknown" END) ELSE session_history_media_info.video_resolution END) ' \
|
||||
'ELSE session_history_media_info.stream_video_resolution END) AS resolution, ' \
|
||||
'SUM(CASE WHEN session_history_media_info.transcode_decision = "direct play" ' \
|
||||
'THEN 1 ELSE 0 END) AS dp_count, ' \
|
||||
'SUM(CASE WHEN session_history_media_info.transcode_decision = "copy" ' \
|
||||
@@ -758,16 +764,18 @@ class Graphs(object):
|
||||
result = monitor_db.select(query)
|
||||
else:
|
||||
query = 'SELECT ' \
|
||||
'(CASE WHEN session_history_media_info.stream_video_resolution IS NULL THEN ' \
|
||||
'(CASE WHEN session_history_media_info.video_decision = "transcode" THEN ' \
|
||||
'(CASE ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 360 THEN "sd" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 360 THEN "SD" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 480 THEN "480" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 576 THEN "576" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 720 THEN "720" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 1080 THEN "1080" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 1440 THEN "QHD" ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 2160 THEN "4K" ' \
|
||||
'ELSE "unknown" END) ELSE session_history_media_info.video_resolution END) AS resolution, ' \
|
||||
'WHEN session_history_media_info.transcode_height <= 2160 THEN "4k" ' \
|
||||
'ELSE "unknown" END) ELSE session_history_media_info.video_resolution END) ' \
|
||||
'ELSE session_history_media_info.stream_video_resolution END) AS resolution, ' \
|
||||
'SUM(CASE WHEN session_history_media_info.transcode_decision = "direct play" ' \
|
||||
'AND session_history.stopped > 0 THEN (session_history.stopped - session_history.started) ' \
|
||||
' - (CASE WHEN paused_counter IS NULL THEN 0 ELSE paused_counter END) ELSE 0 END) AS dp_count, ' \
|
||||
@@ -799,6 +807,10 @@ class Graphs(object):
|
||||
series_3 = []
|
||||
|
||||
for item in result:
|
||||
if item['resolution'] not in ('4k', 'unknown'):
|
||||
item['resolution'] = item['resolution'].upper()
|
||||
if item['resolution'].isdigit():
|
||||
item['resolution'] += 'p'
|
||||
categories.append(item['resolution'])
|
||||
series_1.append(item['dp_count'])
|
||||
series_2.append(item['ds_count'])
|
||||
|
@@ -154,12 +154,12 @@ class HTTPHandler(object):
        try:
            if self.output_format == 'text':
                output = response_content.decode('utf-8', 'ignore')
-           if self.output_format == 'dict':
-               output = helpers.convert_xml_to_dict(response_content.decode('utf-8', 'ignore'))
+           elif self.output_format == 'dict':
+               output = helpers.convert_xml_to_dict(response_content)
            elif self.output_format == 'json':
-               output = helpers.convert_xml_to_json(response_content.decode('utf-8', 'ignore'))
+               output = helpers.convert_xml_to_json(response_content)
            elif self.output_format == 'xml':
-               output = helpers.parse_xml(response_content.decode('utf-8', 'ignore'))
+               output = helpers.parse_xml(response_content)
            else:
                output = response_content

@@ -306,6 +306,11 @@ def initHooks(global_exceptions=True, thread_exceptions=True, pass_original=True
    # Monkey patch the run() by monkey patching the __init__ method
    threading.Thread.__init__ = new_init


def shutdown():
    logging.shutdown()


# Expose logger methods
# Main Tautulli logger
info = logger.info
@@ -16,7 +16,7 @@

import arrow
import bleach
-from collections import Counter
+from collections import Counter, defaultdict
from itertools import groupby
import json
from operator import itemgetter

@@ -122,8 +122,8 @@ def add_notifier_each(notifier_id=None, notify_action=None, stream_data=None, ti

    # Add on_concurrent and on_newdevice to queue if action is on_play
    if notify_action == 'on_play':
-       plexpy.NOTIFY_QUEUE.put({'stream_data': stream_data, 'notify_action': 'on_concurrent'})
-       plexpy.NOTIFY_QUEUE.put({'stream_data': stream_data, 'notify_action': 'on_newdevice'})
+       plexpy.NOTIFY_QUEUE.put({'stream_data': stream_data.copy(), 'notify_action': 'on_concurrent'})
+       plexpy.NOTIFY_QUEUE.put({'stream_data': stream_data.copy(), 'notify_action': 'on_newdevice'})

def notify_conditions(notify_action=None, stream_data=None, timeline_data=None):
|
||||
@@ -131,19 +131,19 @@ def notify_conditions(notify_action=None, stream_data=None, timeline_data=None):
|
||||
if stream_data:
|
||||
|
||||
# Check if notifications enabled for user and library
|
||||
user_data = users.Users()
|
||||
user_details = user_data.get_details(user_id=stream_data['user_id'])
|
||||
# user_data = users.Users()
|
||||
# user_details = user_data.get_details(user_id=stream_data['user_id'])
|
||||
#
|
||||
# library_data = libraries.Libraries()
|
||||
# library_details = library_data.get_details(section_id=stream_data['section_id'])
|
||||
|
||||
library_data = libraries.Libraries()
|
||||
library_details = library_data.get_details(section_id=stream_data['section_id'])
|
||||
|
||||
if not user_details['do_notify']:
|
||||
logger.debug(u"Tautulli NotificationHandler :: Notifications for user '%s' are disabled." % user_details['username'])
|
||||
return False
|
||||
|
||||
elif not library_details['do_notify'] and notify_action not in ('on_concurrent', 'on_newdevice'):
|
||||
logger.debug(u"Tautulli NotificationHandler :: Notifications for library '%s' are disabled." % library_details['section_name'])
|
||||
return False
|
||||
# if not user_details['do_notify']:
|
||||
# logger.debug(u"Tautulli NotificationHandler :: Notifications for user '%s' are disabled." % user_details['username'])
|
||||
# return False
|
||||
#
|
||||
# elif not library_details['do_notify'] and notify_action not in ('on_concurrent', 'on_newdevice'):
|
||||
# logger.debug(u"Tautulli NotificationHandler :: Notifications for library '%s' are disabled." % library_details['section_name'])
|
||||
# return False
|
||||
|
||||
if notify_action == 'on_concurrent':
|
||||
pms_connect = pmsconnect.PmsConnect()
|
||||
@@ -188,12 +188,12 @@ def notify_conditions(notify_action=None, stream_data=None, timeline_data=None):
|
||||
elif timeline_data:
|
||||
|
||||
# Check if notifications enabled for library
|
||||
library_data = libraries.Libraries()
|
||||
library_details = library_data.get_details(section_id=timeline_data['section_id'])
|
||||
|
||||
if not library_details['do_notify_created']:
|
||||
# logger.debug(u"Tautulli NotificationHandler :: Notifications for library '%s' is disabled." % library_details['section_name'])
|
||||
return False
|
||||
# library_data = libraries.Libraries()
|
||||
# library_details = library_data.get_details(section_id=timeline_data['section_id'])
|
||||
#
|
||||
# if not library_details['do_notify_created']:
|
||||
# # logger.debug(u"Tautulli NotificationHandler :: Notifications for library '%s' is disabled." % library_details['section_name'])
|
||||
# return False
|
||||
|
||||
return True
|
||||
|
||||
@@ -206,12 +206,14 @@ def notify_custom_conditions(notifier_id=None, parameters=None):
|
||||
notifier_config = notifiers.get_notifier_config(notifier_id=notifier_id)
|
||||
|
||||
custom_conditions_logic = notifier_config['custom_conditions_logic']
|
||||
custom_conditions = json.loads(notifier_config['custom_conditions']) or []
|
||||
|
||||
if custom_conditions_logic or any(c for c in custom_conditions if c['value']):
|
||||
logger.debug(u"Tautulli NotificationHandler :: Checking custom notification conditions for notifier_id %s."
|
||||
% notifier_id)
|
||||
|
||||
logic_groups = None
|
||||
if custom_conditions_logic:
|
||||
logger.debug(u"Tautulli NotificationHandler :: Checking custom notification conditions for notifier_id %s." % notifier_id)
|
||||
|
||||
custom_conditions = json.loads(notifier_config['custom_conditions'])
|
||||
|
||||
try:
|
||||
# Parse and validate the custom conditions logic
|
||||
logic_groups = helpers.parse_condition_logic_string(custom_conditions_logic, len(custom_conditions))
|
||||
@@ -227,10 +229,11 @@ def notify_custom_conditions(notifier_id=None, parameters=None):
operator = condition['operator']
values = condition['value']
parameter_type = condition['type']
parameter_value = parameters.get(parameter, "")

# Set blank conditions to None
# Set blank conditions to True (skip)
if not parameter or not operator or not values:
evaluated_conditions.append(None)
evaluated_conditions.append(True)
continue

# Make sure the condition values is in a list
@@ -248,25 +251,25 @@ def notify_custom_conditions(notifier_id=None, parameters=None):
elif parameter_type == 'float':
values = [float(v) for v in values]

except Exception as e:
logger.error(u"Tautulli NotificationHandler :: Unable to cast condition '%s' to type '%s'."
% (parameter, parameter_type))
except ValueError as e:
logger.error(u"Tautulli NotificationHandler :: Unable to cast condition '%s', values '%s', to type '%s'."
% (parameter, values, parameter_type))
return False

# Cast the parameter value to the correct type
try:
if parameter_type == 'str':
parameter_value = unicode(parameters[parameter]).lower()
parameter_value = unicode(parameter_value).lower()

elif parameter_type == 'int':
parameter_value = int(parameters[parameter])
parameter_value = int(parameter_value)

elif parameter_type == 'float':
parameter_value = float(parameters[parameter])
parameter_value = float(parameter_value)

except Exception as e:
logger.error(u"Tautulli NotificationHandler :: Unable to cast parameter '%s' to type '%s'."
% (parameter, parameter_type))
except ValueError as e:
logger.error(u"Tautulli NotificationHandler :: Unable to cast parameter '%s', value '%s', to type '%s'."
% (parameter, parameter_value, parameter_type))
return False

# Check each condition
@@ -298,12 +301,15 @@ def notify_custom_conditions(notifier_id=None, parameters=None):
logger.warn(u"Tautulli NotificationHandler :: Invalid condition operator '%s'." % operator)
evaluated_conditions.append(None)

if logic_groups:
# Format and evaluate the logic string
try:
evaluated_logic = helpers.eval_logic_groups_to_bool(logic_groups, evaluated_conditions)
except Exception as e:
logger.error(u"Tautulli NotificationHandler :: Unable to evaluate custom condition logic: %s." % e)
return False
else:
evaluated_logic = all(evaluated_conditions[1:])

logger.debug(u"Tautulli NotificationHandler :: Custom condition evaluated to '%s'." % str(evaluated_logic))
return evaluated_logic
@@ -326,7 +332,7 @@ def notify(notifier_id=None, notify_action=None, stream_data=None, timeline_data
if not notifier_config:
return

if notify_action == 'test':
if notify_action in ('test', 'api'):
subject = kwargs.pop('subject', 'Tautulli')
body = kwargs.pop('body', 'Test Notification')
script_args = kwargs.pop('script_args', [])
@@ -344,8 +350,8 @@ def notify(notifier_id=None, notify_action=None, stream_data=None, timeline_data

# Set the notification state in the db
notification_id = set_notify_state(session=stream_data or timeline_data,
notify_action=notify_action,
notifier=notifier_config,
notify_action=notify_action,
subject=subject,
body=body,
script_args=script_args)
@@ -384,9 +390,9 @@ def get_notify_state(session):
return notify_states


def set_notify_state(notify_action, notifier, subject, body, script_args, session=None):
def set_notify_state(notifier, notify_action, subject='', body='', script_args='', session=None):

if notify_action and notifier:
if notifier and notify_action:
monitor_db = database.MonitorDatabase()

session = session or {}
@@ -429,37 +435,26 @@ def build_media_notify_params(notify_action=None, session=None, timeline=None, m
time_format = plexpy.CONFIG.TIME_FORMAT.replace('Do','')
duration_format = plexpy.CONFIG.TIME_FORMAT.replace('Do','').replace('a','').replace('A','')

# Get the server name
server_name = plexpy.CONFIG.PMS_NAME

# Get the server uptime
plex_tv = plextv.PlexTV()
server_times = plex_tv.get_server_times()

if server_times:
updated_at = server_times['updated_at']
server_uptime = helpers.human_duration(int(time.time() - helpers.cast_to_int(updated_at)))
else:
logger.error(u"Tautulli NotificationHandler :: Unable to retrieve server uptime.")
server_uptime = 'N/A'

# Get metadata for the item
if session:
rating_key = session['rating_key']
elif timeline:
rating_key = timeline['rating_key']

pms_connect = pmsconnect.PmsConnect()
metadata = pms_connect.get_metadata_details(rating_key=rating_key)
notify_params = defaultdict(str)
if session:
# Reload json from raw stream info
if session.get('raw_stream_info'):
session.update(json.loads(session['raw_stream_info']))
notify_params.update(session)

if not metadata:
logger.error(u"Tautulli NotificationHandler :: Unable to retrieve metadata for rating_key %s" % str(rating_key))
return None
if timeline:
notify_params.update(timeline)

## TODO: Check list of media info items, currently only grabs first item
media_info = media_part_info = {}
if 'media_info' in metadata and len(metadata['media_info']) > 0:
media_info = metadata['media_info'][0]
if 'media_info' in notify_params and len(notify_params['media_info']) > 0:
media_info = notify_params['media_info'][0]
if 'parts' in media_info and len(media_info['parts']) > 0:
media_part_info = media_info.pop('parts')[0]

@@ -476,11 +471,14 @@ def build_media_notify_params(notify_action=None, session=None, timeline=None, m
media_part_info.update(stream)
stream_subtitle = True

notify_params.update(media_info)
notify_params.update(media_part_info)

child_metadata = grandchild_metadata = []
for key in kwargs.pop('child_keys', []):
child_metadata.append(pms_connect.get_metadata_details(rating_key=key))
child_metadata.append(pmsconnect.PmsConnect().get_metadata_details(rating_key=key))
for key in kwargs.pop('grandchild_keys', []):
grandchild_metadata.append(pms_connect.get_metadata_details(rating_key=key))
grandchild_metadata.append(pmsconnect.PmsConnect().get_metadata_details(rating_key=key))

# Session values
session = session or {}
@@ -507,102 +505,107 @@ def build_media_notify_params(notify_action=None, session=None, timeline=None, m
stream_duration = 0

view_offset = helpers.convert_milliseconds_to_minutes(session.get('view_offset', 0))
duration = helpers.convert_milliseconds_to_minutes(metadata['duration'])
duration = helpers.convert_milliseconds_to_minutes(notify_params['duration'])
remaining_duration = duration - view_offset

# Build Plex URL
metadata['plex_url'] = '{web_url}#!/server/{pms_identifier}/details?key=%2Flibrary%2Fmetadata%2F{rating_key}'.format(
if notify_params['media_type'] == 'track':
plex_web_rating_key = notify_params['parent_rating_key']
else:
plex_web_rating_key = notify_params['rating_key']

notify_params['plex_url'] = '{web_url}#!/server/{pms_identifier}/details?key=%2Flibrary%2Fmetadata%2F{rating_key}'.format(
web_url=plexpy.CONFIG.PMS_WEB_URL,
pms_identifier=plexpy.CONFIG.PMS_IDENTIFIER,
rating_key=rating_key)
rating_key=plex_web_rating_key)

# Get media IDs from guid and build URLs
if 'imdb://' in metadata['guid']:
metadata['imdb_id'] = metadata['guid'].split('imdb://')[1].split('?')[0]
metadata['imdb_url'] = 'https://www.imdb.com/title/' + metadata['imdb_id']
metadata['trakt_url'] = 'https://trakt.tv/search/imdb/' + metadata['imdb_id']
if 'imdb://' in notify_params['guid']:
notify_params['imdb_id'] = notify_params['guid'].split('imdb://')[1].split('?')[0]
notify_params['imdb_url'] = 'https://www.imdb.com/title/' + notify_params['imdb_id']
notify_params['trakt_url'] = 'https://trakt.tv/search/imdb/' + notify_params['imdb_id']

if 'thetvdb://' in metadata['guid']:
metadata['thetvdb_id'] = metadata['guid'].split('thetvdb://')[1].split('/')[0]
metadata['thetvdb_url'] = 'https://thetvdb.com/?tab=series&id=' + metadata['thetvdb_id']
metadata['trakt_url'] = 'https://trakt.tv/search/tvdb/' + metadata['thetvdb_id'] + '?id_type=show'
if 'thetvdb://' in notify_params['guid']:
notify_params['thetvdb_id'] = notify_params['guid'].split('thetvdb://')[1].split('/')[0].split('?')[0]
notify_params['thetvdb_url'] = 'https://thetvdb.com/?tab=series&id=' + notify_params['thetvdb_id']
notify_params['trakt_url'] = 'https://trakt.tv/search/tvdb/' + notify_params['thetvdb_id'] + '?id_type=show'

elif 'thetvdbdvdorder://' in metadata['guid']:
metadata['thetvdb_id'] = metadata['guid'].split('thetvdbdvdorder://')[1].split('/')[0]
metadata['thetvdb_url'] = 'https://thetvdb.com/?tab=series&id=' + metadata['thetvdb_id']
metadata['trakt_url'] = 'https://trakt.tv/search/tvdb/' + metadata['thetvdb_id'] + '?id_type=show'
elif 'thetvdbdvdorder://' in notify_params['guid']:
notify_params['thetvdb_id'] = notify_params['guid'].split('thetvdbdvdorder://')[1].split('/')[0]
notify_params['thetvdb_url'] = 'https://thetvdb.com/?tab=series&id=' + notify_params['thetvdb_id']
notify_params['trakt_url'] = 'https://trakt.tv/search/tvdb/' + notify_params['thetvdb_id'] + '?id_type=show'

if 'themoviedb://' in metadata['guid']:
if metadata['media_type'] == 'movie':
metadata['themoviedb_id'] = metadata['guid'].split('themoviedb://')[1].split('?')[0]
metadata['themoviedb_url'] = 'https://www.themoviedb.org/movie/' + metadata['themoviedb_id']
metadata['trakt_url'] = 'https://trakt.tv/search/tmdb/' + metadata['themoviedb_id'] + '?id_type=movie'
if 'themoviedb://' in notify_params['guid']:
if notify_params['media_type'] == 'movie':
notify_params['themoviedb_id'] = notify_params['guid'].split('themoviedb://')[1].split('?')[0]
notify_params['themoviedb_url'] = 'https://www.themoviedb.org/movie/' + notify_params['themoviedb_id']
notify_params['trakt_url'] = 'https://trakt.tv/search/tmdb/' + notify_params['themoviedb_id'] + '?id_type=movie'

elif metadata['media_type'] in ('show', 'season', 'episode'):
metadata['themoviedb_id'] = metadata['guid'].split('themoviedb://')[1].split('/')[0]
metadata['themoviedb_url'] = 'https://www.themoviedb.org/tv/' + metadata['themoviedb_id']
metadata['trakt_url'] = 'https://trakt.tv/search/tmdb/' + metadata['themoviedb_id'] + '?id_type=show'
elif notify_params['media_type'] in ('show', 'season', 'episode'):
notify_params['themoviedb_id'] = notify_params['guid'].split('themoviedb://')[1].split('/')[0]
notify_params['themoviedb_url'] = 'https://www.themoviedb.org/tv/' + notify_params['themoviedb_id']
notify_params['trakt_url'] = 'https://trakt.tv/search/tmdb/' + notify_params['themoviedb_id'] + '?id_type=show'

if 'lastfm://' in metadata['guid']:
metadata['lastfm_id'] = metadata['guid'].split('lastfm://')[1].rsplit('/', 1)[0]
metadata['lastfm_url'] = 'https://www.last.fm/music/' + metadata['lastfm_id']
if 'lastfm://' in notify_params['guid']:
notify_params['lastfm_id'] = notify_params['guid'].split('lastfm://')[1].rsplit('/', 1)[0]
notify_params['lastfm_url'] = 'https://www.last.fm/music/' + notify_params['lastfm_id']
# Get TheMovieDB info
if plexpy.CONFIG.THEMOVIEDB_LOOKUP:
if metadata.get('themoviedb_id'):
if notify_params.get('themoviedb_id'):
themoveidb_json = get_themoviedb_info(rating_key=rating_key,
media_type=metadata['media_type'],
themoviedb_id=metadata['themoviedb_id'])
media_type=notify_params['media_type'],
themoviedb_id=notify_params['themoviedb_id'])

if themoveidb_json.get('imdb_id'):
metadata['imdb_id'] = themoveidb_json['imdb_id']
metadata['imdb_url'] = 'https://www.imdb.com/title/' + themoveidb_json['imdb_id']
notify_params['imdb_id'] = themoveidb_json['imdb_id']
notify_params['imdb_url'] = 'https://www.imdb.com/title/' + themoveidb_json['imdb_id']

elif metadata.get('thetvdb_id') or metadata.get('imdb_id'):
elif notify_params.get('thetvdb_id') or notify_params.get('imdb_id'):
themoviedb_info = lookup_themoviedb_by_id(rating_key=rating_key,
thetvdb_id=metadata.get('thetvdb_id'),
imdb_id=metadata.get('imdb_id'))
metadata.update(themoviedb_info)
thetvdb_id=notify_params.get('thetvdb_id'),
imdb_id=notify_params.get('imdb_id'))
notify_params.update(themoviedb_info)

# Get TVmaze info (for tv shows only)
if plexpy.CONFIG.TVMAZE_LOOKUP:
if metadata['media_type'] in ('show', 'season', 'episode') and (metadata.get('thetvdb_id') or metadata.get('imdb_id')):
if notify_params['media_type'] in ('show', 'season', 'episode') and (notify_params.get('thetvdb_id') or notify_params.get('imdb_id')):
tvmaze_info = lookup_tvmaze_by_id(rating_key=rating_key,
thetvdb_id=metadata.get('thetvdb_id'),
imdb_id=metadata.get('imdb_id'))
metadata.update(tvmaze_info)
thetvdb_id=notify_params.get('thetvdb_id'),
imdb_id=notify_params.get('imdb_id'))
notify_params.update(tvmaze_info)

if tvmaze_info.get('thetvdb_id'):
metadata['thetvdb_url'] = 'https://thetvdb.com/?tab=series&id=' + str(tvmaze_info['thetvdb_id'])
notify_params['thetvdb_url'] = 'https://thetvdb.com/?tab=series&id=' + str(tvmaze_info['thetvdb_id'])
if tvmaze_info.get('imdb_id'):
metadata['imdb_url'] = 'https://www.imdb.com/title/' + tvmaze_info['imdb_id']
notify_params['imdb_url'] = 'https://www.imdb.com/title/' + tvmaze_info['imdb_id']

if metadata['media_type'] in ('movie', 'show', 'artist'):
poster_thumb = metadata['thumb']
poster_key = metadata['rating_key']
poster_title = metadata['title']
elif metadata['media_type'] in ('season', 'album'):
poster_thumb = metadata['thumb'] or metadata['parent_thumb']
poster_key = metadata['rating_key']
poster_title = '%s - %s' % (metadata['parent_title'],
metadata['title'])
elif metadata['media_type'] in ('episode', 'track'):
poster_thumb = metadata['parent_thumb'] or metadata['grandparent_thumb']
poster_key = metadata['parent_rating_key']
poster_title = '%s - %s' % (metadata['grandparent_title'],
metadata['parent_title'])
if notify_params['media_type'] in ('movie', 'show', 'artist'):
poster_thumb = notify_params['thumb']
poster_key = notify_params['rating_key']
poster_title = notify_params['title']
elif notify_params['media_type'] in ('season', 'album'):
poster_thumb = notify_params['thumb'] or notify_params['parent_thumb']
poster_key = notify_params['rating_key']
poster_title = '%s - %s' % (notify_params['parent_title'],
notify_params['title'])
elif notify_params['media_type'] in ('episode', 'track'):
poster_thumb = notify_params['parent_thumb'] or notify_params['grandparent_thumb']
poster_key = notify_params['parent_rating_key']
poster_title = '%s - %s' % (notify_params['grandparent_title'],
notify_params['parent_title'])
else:
poster_thumb = ''

if plexpy.CONFIG.NOTIFY_UPLOAD_POSTERS:
poster_info = get_poster_info(poster_thumb=poster_thumb, poster_key=poster_key, poster_title=poster_title)
metadata.update(poster_info)
notify_params.update(poster_info)

if ((manual_trigger or plexpy.CONFIG.NOTIFY_GROUP_RECENTLY_ADDED_GRANDPARENT)
and metadata['media_type'] in ('show', 'artist')):
show_name = metadata['title']
and notify_params['media_type'] in ('show', 'artist')):
show_name = notify_params['title']
episode_name = ''
artist_name = metadata['title']
artist_name = notify_params['title']
album_name = ''
track_name = ''
@@ -614,14 +617,14 @@ def build_media_notify_params(notify_action=None, session=None, timeline=None, m
track_num, track_num00 = '', ''

elif ((manual_trigger or plexpy.CONFIG.NOTIFY_GROUP_RECENTLY_ADDED_PARENT)
and metadata['media_type'] in ('season', 'album')):
show_name = metadata['parent_title']
and notify_params['media_type'] in ('season', 'album')):
show_name = notify_params['parent_title']
episode_name = ''
artist_name = metadata['parent_title']
album_name = metadata['title']
artist_name = notify_params['parent_title']
album_name = notify_params['title']
track_name = ''
season_num = metadata['media_index'].zfill(1)
season_num00 = metadata['media_index'].zfill(2)
season_num = str(notify_params['media_index']).zfill(1)
season_num00 = str(notify_params['media_index']).zfill(2)

num, num00 = format_group_index([helpers.cast_to_int(d['media_index'])
for d in child_metadata if d['parent_rating_key'] == rating_key])
@@ -629,38 +632,45 @@ def build_media_notify_params(notify_action=None, session=None, timeline=None, m
track_num, track_num00 = num, num00

else:
show_name = metadata['grandparent_title']
episode_name = metadata['title']
artist_name = metadata['grandparent_title']
album_name = metadata['parent_title']
track_name = metadata['title']
season_num = metadata['parent_media_index'].zfill(1)
season_num00 = metadata['parent_media_index'].zfill(2)
episode_num = metadata['media_index'].zfill(1)
episode_num00 = metadata['media_index'].zfill(2)
track_num = metadata['media_index'].zfill(1)
track_num00 = metadata['media_index'].zfill(2)
show_name = notify_params['grandparent_title']
episode_name = notify_params['title']
artist_name = notify_params['grandparent_title']
album_name = notify_params['parent_title']
track_name = notify_params['title']
season_num = str(notify_params['parent_media_index']).zfill(1)
season_num00 = str(notify_params['parent_media_index']).zfill(2)
episode_num = str(notify_params['media_index']).zfill(1)
episode_num00 = str(notify_params['media_index']).zfill(2)
track_num = str(notify_params['media_index']).zfill(1)
track_num00 = str(notify_params['media_index']).zfill(2)
available_params = {# Global paramaters
'plexpy_version': common.VERSION_NUMBER,
'plexpy_branch': plexpy.CONFIG.GIT_BRANCH,
'plexpy_commit': plexpy.CURRENT_VERSION,
'server_name': server_name,
'server_uptime': server_uptime,
'server_version': server_times.get('version',''),
'action': notify_action.split('on_')[-1],
available_params = {
# Global paramaters
'tautulli_version': common.VERSION_NUMBER,
'tautulli_remote': plexpy.CONFIG.GIT_REMOTE,
'tautulli_branch': plexpy.CONFIG.GIT_BRANCH,
'tautulli_commit': plexpy.CURRENT_VERSION,
'server_name': plexpy.CONFIG.PMS_NAME,
'server_ip': plexpy.CONFIG.PMS_IP,
'server_port': plexpy.CONFIG.PMS_PORT,
'server_url': plexpy.CONFIG.PMS_URL,
'server_machine_id': plexpy.CONFIG.PMS_IDENTIFIER,
'server_platform': plexpy.CONFIG.PMS_PLATFORM,
'server_version': plexpy.CONFIG.PMS_VERSION,
'action': notify_action.lstrip('on_'),
'datestamp': arrow.now().format(date_format),
'timestamp': arrow.now().format(time_format),
'unixtime': int(time.time()),
# Stream parameters
'streams': stream_count,
'user_streams': user_stream_count,
'user': session.get('friendly_name',''),
'username': session.get('user',''),
'device': session.get('device',''),
'platform': session.get('platform',''),
'product': session.get('product',''),
'player': session.get('player',''),
'ip_address': session.get('ip_address','N/A'),
'user': notify_params['friendly_name'],
'username': notify_params['user'],
'device': notify_params['device'],
'platform': notify_params['platform'],
'product': notify_params['product'],
'player': notify_params['player'],
'ip_address': notify_params.get('ip_address', 'N/A'),
'stream_duration': stream_duration,
'stream_time': arrow.get(stream_duration * 60).format(duration_format),
'remaining_duration': remaining_duration,
@@ -669,65 +679,68 @@ def build_media_notify_params(notify_action=None, session=None, timeline=None, m
'progress_time': arrow.get(view_offset * 60).format(duration_format),
'progress_percent': helpers.get_percent(view_offset, duration),
'transcode_decision': transcode_decision,
'video_decision': session.get('video_decision',''),
'audio_decision': session.get('audio_decision',''),
'subtitle_decision': session.get('subtitle_decision',''),
'quality_profile': session.get('quality_profile',''),
'optimized_version': session.get('optimized_version',''),
'optimized_version_profile': session.get('optimized_version_profile',''),
'stream_local': session.get('local', ''),
'stream_location': session.get('location', ''),
'stream_bandwidth': session.get('bandwidth', ''),
'stream_container': session.get('stream_container', ''),
'stream_bitrate': session.get('stream_bitrate', ''),
'stream_aspect_ratio': session.get('stream_aspect_ratio', ''),
'stream_video_codec': session.get('stream_video_codec', ''),
'stream_video_codec_level': session.get('stream_video_codec_level', ''),
'stream_video_bitrate': session.get('stream_video_bitrate', ''),
'stream_video_bit_depth': session.get('stream_video_bit_depth', ''),
'stream_video_framerate': session.get('stream_video_framerate', ''),
'stream_video_ref_frames': session.get('stream_video_ref_frames', ''),
'stream_video_resolution': session.get('stream_video_resolution', ''),
'stream_video_height': session.get('stream_video_height', ''),
'stream_video_width': session.get('stream_video_width', ''),
'stream_video_language': session.get('stream_video_language', ''),
'stream_video_language_code': session.get('stream_video_language_code', ''),
'stream_audio_bitrate': session.get('stream_audio_bitrate', ''),
'stream_audio_bitrate_mode': session.get('stream_audio_bitrate_mode', ''),
'stream_audio_codec': session.get('stream_audio_codec', ''),
'stream_audio_channels': session.get('stream_audio_channels', ''),
'stream_audio_channel_layout': session.get('stream_audio_channel_layout', ''),
'stream_audio_sample_rate': session.get('stream_audio_sample_rate', ''),
'stream_audio_language': session.get('stream_audio_language', ''),
'stream_audio_language_code': session.get('stream_audio_language_code', ''),
'stream_subtitle_codec': session.get('stream_subtitle_codec', ''),
'stream_subtitle_container': session.get('stream_subtitle_container', ''),
'stream_subtitle_format': session.get('stream_subtitle_format', ''),
'stream_subtitle_forced': session.get('stream_subtitle_forced', ''),
'stream_subtitle_language': session.get('stream_subtitle_language', ''),
'stream_subtitle_language_code': session.get('stream_subtitle_language_code', ''),
'stream_subtitle_location': session.get('stream_subtitle_location', ''),
'transcode_container': session.get('transcode_container',''),
'transcode_video_codec': session.get('transcode_video_codec',''),
'transcode_video_width': session.get('transcode_width',''),
'transcode_video_height': session.get('transcode_height',''),
'transcode_audio_codec': session.get('transcode_audio_codec',''),
'transcode_audio_channels': session.get('transcode_audio_channels',''),
'transcode_hw_requested': session.get('transcode_hw_requested',''),
'transcode_hw_decode': session.get('transcode_hw_decode',''),
'transcode_hw_decode_title': session.get('transcode_hw_decode_title',''),
'transcode_hw_encode': session.get('transcode_hw_encode',''),
'transcode_hw_encode_title': session.get('transcode_hw_encode_title',''),
'transcode_hw_full_pipeline': session.get('transcode_hw_full_pipeline',''),
'session_key': session.get('session_key',''),
'transcode_key': session.get('transcode_key',''),
'session_id': session.get('session_id',''),
'user_id': session.get('user_id',''),
'machine_id': session.get('machine_id',''),
'video_decision': notify_params['video_decision'],
'audio_decision': notify_params['audio_decision'],
'subtitle_decision': notify_params['subtitle_decision'],
'quality_profile': notify_params['quality_profile'],
'optimized_version': notify_params['optimized_version'],
'optimized_version_profile': notify_params['optimized_version_profile'],
'synced_version': notify_params['synced_version'],
'stream_local': notify_params['local'],
'stream_location': notify_params['location'],
'stream_bandwidth': notify_params['bandwidth'],
'stream_container': notify_params['stream_container'],
'stream_bitrate': notify_params['stream_bitrate'],
'stream_aspect_ratio': notify_params['stream_aspect_ratio'],
'stream_video_codec': notify_params['stream_video_codec'],
'stream_video_codec_level': notify_params['stream_video_codec_level'],
'stream_video_bitrate': notify_params['stream_video_bitrate'],
'stream_video_bit_depth': notify_params['stream_video_bit_depth'],
'stream_video_framerate': notify_params['stream_video_framerate'],
'stream_video_ref_frames': notify_params['stream_video_ref_frames'],
'stream_video_resolution': notify_params['stream_video_resolution'],
'stream_video_height': notify_params['stream_video_height'],
'stream_video_width': notify_params['stream_video_width'],
'stream_video_language': notify_params['stream_video_language'],
'stream_video_language_code': notify_params['stream_video_language_code'],
'stream_audio_bitrate': notify_params['stream_audio_bitrate'],
'stream_audio_bitrate_mode': notify_params['stream_audio_bitrate_mode'],
'stream_audio_codec': notify_params['stream_audio_codec'],
'stream_audio_channels': notify_params['stream_audio_channels'],
'stream_audio_channel_layout': notify_params['stream_audio_channel_layout'],
'stream_audio_sample_rate': notify_params['stream_audio_sample_rate'],
'stream_audio_language': notify_params['stream_audio_language'],
'stream_audio_language_code': notify_params['stream_audio_language_code'],
'stream_subtitle_codec': notify_params['stream_subtitle_codec'],
'stream_subtitle_container': notify_params['stream_subtitle_container'],
'stream_subtitle_format': notify_params['stream_subtitle_format'],
'stream_subtitle_forced': notify_params['stream_subtitle_forced'],
'stream_subtitle_language': notify_params['stream_subtitle_language'],
'stream_subtitle_language_code': notify_params['stream_subtitle_language_code'],
'stream_subtitle_location': notify_params['stream_subtitle_location'],
'transcode_container': notify_params['transcode_container'],
'transcode_video_codec': notify_params['transcode_video_codec'],
'transcode_video_width': notify_params['transcode_width'],
'transcode_video_height': notify_params['transcode_height'],
'transcode_audio_codec': notify_params['transcode_audio_codec'],
'transcode_audio_channels': notify_params['transcode_audio_channels'],
'transcode_hw_requested': notify_params['transcode_hw_requested'],
'transcode_hw_decoding': notify_params['transcode_hw_decoding'],
'transcode_hw_decode_codec': notify_params['transcode_hw_decode'],
'transcode_hw_decode_title': notify_params['transcode_hw_decode_title'],
'transcode_hw_encoding': notify_params['transcode_hw_encoding'],
'transcode_hw_encode_codec': notify_params['transcode_hw_encode'],
'transcode_hw_encode_title': notify_params['transcode_hw_encode_title'],
'transcode_hw_full_pipeline': notify_params['transcode_hw_full_pipeline'],
'session_key': notify_params['session_key'],
'transcode_key': notify_params['transcode_key'],
'session_id': notify_params['session_id'],
'user_id': notify_params['user_id'],
'machine_id': notify_params['machine_id'],
# Source metadata parameters
'media_type': metadata['media_type'],
'title': metadata['full_title'],
'library_name': metadata['library_name'],
'media_type': notify_params['media_type'],
'title': notify_params['full_title'],
'library_name': notify_params['library_name'],
'show_name': show_name,
'episode_name': episode_name,
'artist_name': artist_name,
@@ -739,80 +752,82 @@ def build_media_notify_params(notify_action=None, session=None, timeline=None, m
'episode_num00': episode_num00,
'track_num': track_num,
'track_num00': track_num00,
'year': metadata['year'],
'release_date': arrow.get(metadata['originally_available_at']).format(date_format)
if metadata['originally_available_at'] else '',
'air_date': arrow.get(metadata['originally_available_at']).format(date_format)
if metadata['originally_available_at'] else '',
'added_date': arrow.get(metadata['added_at']).format(date_format)
if metadata['added_at'] else '',
'updated_date': arrow.get(metadata['updated_at']).format(date_format)
if metadata['updated_at'] else '',
'last_viewed_date': arrow.get(metadata['last_viewed_at']).format(date_format)
if metadata['last_viewed_at'] else '',
'studio': metadata['studio'],
'content_rating': metadata['content_rating'],
'directors': ', '.join(metadata['directors']),
'writers': ', '.join(metadata['writers']),
'actors': ', '.join(metadata['actors']),
'genres': ', '.join(metadata['genres']),
'summary': metadata['summary'],
'tagline': metadata['tagline'],
'rating': metadata['rating'],
'audience_rating': helpers.get_percent(metadata['audience_rating'], 10) or '',
'year': notify_params['year'],
'release_date': arrow.get(notify_params['originally_available_at']).format(date_format)
if notify_params['originally_available_at'] else '',
'air_date': arrow.get(notify_params['originally_available_at']).format(date_format)
if notify_params['originally_available_at'] else '',
'added_date': arrow.get(notify_params['added_at']).format(date_format)
if notify_params['added_at'] else '',
'updated_date': arrow.get(notify_params['updated_at']).format(date_format)
if notify_params['updated_at'] else '',
'last_viewed_date': arrow.get(notify_params['last_viewed_at']).format(date_format)
if notify_params['last_viewed_at'] else '',
'studio': notify_params['studio'],
'content_rating': notify_params['content_rating'],
'directors': ', '.join(notify_params['directors']),
'writers': ', '.join(notify_params['writers']),
'actors': ', '.join(notify_params['actors']),
'genres': ', '.join(notify_params['genres']),
'labels': ', '.join(notify_params['labels']),
'collections': ', '.join(notify_params['collections']),
'summary': notify_params['summary'],
'tagline': notify_params['tagline'],
'rating': notify_params['rating'],
'audience_rating': helpers.get_percent(notify_params['audience_rating'], 10) or '',
'duration': duration,
'poster_title': metadata.get('poster_title',''),
'poster_url': metadata.get('poster_url',''),
'plex_url': metadata.get('plex_url',''),
'imdb_id': metadata.get('imdb_id',''),
'imdb_url': metadata.get('imdb_url',''),
'thetvdb_id': metadata.get('thetvdb_id',''),
'thetvdb_url': metadata.get('thetvdb_url',''),
'themoviedb_id': metadata.get('themoviedb_id',''),
'themoviedb_url': metadata.get('themoviedb_url',''),
'tvmaze_id': metadata.get('tvmaze_id',''),
'tvmaze_url': metadata.get('tvmaze_url',''),
'lastfm_url': metadata.get('lastfm_url',''),
'trakt_url': metadata.get('trakt_url',''),
'container': session.get('container', media_info.get('container','')),
'bitrate': session.get('bitrate', media_info.get('bitrate','')),
'aspect_ratio': session.get('aspect_ratio', media_info.get('aspect_ratio','')),
'video_codec': session.get('video_codec', media_part_info.get('video_codec','')),
'video_codec_level': session.get('video_codec_level', media_part_info.get('video_codec_level','')),
'video_bitrate': session.get('video_bitrate', media_part_info.get('video_bitrate','')),
'video_bit_depth': session.get('video_bit_depth', media_part_info.get('video_bit_depth','')),
'video_framerate': session.get('video_framerate', media_info.get('video_framerate','')),
'video_ref_frames': session.get('video_ref_frames', media_part_info.get('video_ref_frames','')),
'video_resolution': session.get('video_resolution', media_info.get('video_resolution','')),
'video_height': session.get('height', media_info.get('height','')),
'video_width': session.get('width', media_info.get('width','')),
'video_language': session.get('video_language', media_part_info.get('video_language','')),
'video_language_code': session.get('video_language_code', media_part_info.get('video_language_code','')),
'audio_bitrate': session.get('audio_bitrate', media_part_info.get('audio_bitrate','')),
'audio_bitrate_mode': session.get('audio_bitrate_mode', media_part_info.get('audio_bitrate_mode','')),
'audio_codec': session.get('audio_codec', media_part_info.get('audio_codec','')),
'audio_channels': session.get('audio_channels', media_part_info.get('audio_channels','')),
'audio_channel_layout': session.get('audio_channel_layout', media_part_info.get('audio_channel_layout','')),
'audio_sample_rate': session.get('audio_sample_rate', media_part_info.get('audio_sample_rate','')),
'audio_language': session.get('audio_language', media_part_info.get('audio_language','')),
'audio_language_code': session.get('audio_language_code', media_part_info.get('audio_language_code','')),
'subtitle_codec': session.get('subtitle_codec', media_part_info.get('subtitle_codec','')),
'subtitle_container': session.get('subtitle_container', media_part_info.get('subtitle_container','')),
'subtitle_format': session.get('subtitle_format', media_part_info.get('subtitle_format','')),
'subtitle_forced': session.get('subtitle_forced', media_part_info.get('subtitle_forced','')),
'subtitle_location': session.get('subtitle_location', media_part_info.get('subtitle_location','')),
'subtitle_language': session.get('subtitle_language', media_part_info.get('subtitle_language','')),
'subtitle_language_code': session.get('subtitle_language_code', media_part_info.get('subtitle_language_code','')),
'file': media_part_info.get('file',''),
'file_size': helpers.humanFileSize(media_part_info.get('file_size','')),
'indexes': media_part_info.get('indexes',''),
'section_id': metadata['section_id'],
'rating_key': metadata['rating_key'],
'parent_rating_key': metadata['parent_rating_key'],
'grandparent_rating_key': metadata['grandparent_rating_key'],
'thumb': metadata['thumb'],
'parent_thumb': metadata['parent_thumb'],
'grandparent_thumb': metadata['grandparent_thumb'],
'poster_title': notify_params['poster_title'],
'poster_url': notify_params['poster_url'],
'plex_url': notify_params['plex_url'],
'imdb_id': notify_params['imdb_id'],
'imdb_url': notify_params['imdb_url'],
'thetvdb_id': notify_params['thetvdb_id'],
'thetvdb_url': notify_params['thetvdb_url'],
'themoviedb_id': notify_params['themoviedb_id'],
'themoviedb_url': notify_params['themoviedb_url'],
'tvmaze_id': notify_params['tvmaze_id'],
'tvmaze_url': notify_params['tvmaze_url'],
'lastfm_url': notify_params['lastfm_url'],
'trakt_url': notify_params['trakt_url'],
'container': notify_params['container'],
'bitrate': notify_params['bitrate'],
'aspect_ratio': notify_params['aspect_ratio'],
'video_codec': notify_params['video_codec'],
'video_codec_level': notify_params['video_codec_level'],
'video_bitrate': notify_params['video_bitrate'],
'video_bit_depth': notify_params['video_bit_depth'],
'video_framerate': notify_params['video_framerate'],
'video_ref_frames': notify_params['video_ref_frames'],
'video_resolution': notify_params['video_resolution'],
'video_height': notify_params['height'],
'video_width': notify_params['width'],
'video_language': notify_params['video_language'],
'video_language_code': notify_params['video_language_code'],
'audio_bitrate': notify_params['audio_bitrate'],
'audio_bitrate_mode': notify_params['audio_bitrate_mode'],
'audio_codec': notify_params['audio_codec'],
'audio_channels': notify_params['audio_channels'],
'audio_channel_layout': notify_params['audio_channel_layout'],
'audio_sample_rate': notify_params['audio_sample_rate'],
'audio_language': notify_params['audio_language'],
'audio_language_code': notify_params['audio_language_code'],
'subtitle_codec': notify_params['subtitle_codec'],
'subtitle_container': notify_params['subtitle_container'],
'subtitle_format': notify_params['subtitle_format'],
'subtitle_forced': notify_params['subtitle_forced'],
'subtitle_location': notify_params['subtitle_location'],
'subtitle_language': notify_params['subtitle_language'],
'subtitle_language_code': notify_params['subtitle_language_code'],
'file': notify_params['file'],
'file_size': helpers.humanFileSize(notify_params['file_size']),
'indexes': notify_params['indexes'],
'section_id': notify_params['section_id'],
'rating_key': notify_params['rating_key'],
'parent_rating_key': notify_params['parent_rating_key'],
'grandparent_rating_key': notify_params['grandparent_rating_key'],
'thumb': notify_params['thumb'],
'parent_thumb': notify_params['parent_thumb'],
'grandparent_thumb': notify_params['grandparent_thumb'],
'poster_thumb': poster_thumb
}
@@ -824,53 +839,48 @@ def build_server_notify_params(notify_action=None, **kwargs):
date_format = plexpy.CONFIG.DATE_FORMAT.replace('Do','')
time_format = plexpy.CONFIG.TIME_FORMAT.replace('Do','')

# Get the server name
server_name = plexpy.CONFIG.PMS_NAME
update_channel = pmsconnect.PmsConnect().get_server_update_channel()

# Get the server uptime
plex_tv = plextv.PlexTV()
server_times = plex_tv.get_server_times()
pms_download_info = defaultdict(str, kwargs.pop('pms_download_info', {}))
plexpy_download_info = defaultdict(str, kwargs.pop('plexpy_download_info', {}))

pms_download_info = kwargs.pop('pms_download_info', {})
plexpy_download_info = kwargs.pop('plexpy_download_info', {})

if server_times:
updated_at = server_times['updated_at']
server_uptime = helpers.human_duration(int(time.time() - helpers.cast_to_int(updated_at)))
else:
logger.error(u"Tautulli NotificationHandler :: Unable to retrieve server uptime.")
server_uptime = 'N/A'

available_params = {# Global paramaters
'plexpy_version': common.VERSION_NUMBER,
'plexpy_branch': plexpy.CONFIG.GIT_BRANCH,
'plexpy_commit': plexpy.CURRENT_VERSION,
'server_name': server_name,
'server_uptime': server_uptime,
'server_version': server_times.get('version',''),
'action': notify_action.split('on_')[-1],
available_params = {
# Global paramaters
'tautulli_version': common.VERSION_NUMBER,
'tautulli_remote': plexpy.CONFIG.GIT_REMOTE,
'tautulli_branch': plexpy.CONFIG.GIT_BRANCH,
'tautulli_commit': plexpy.CURRENT_VERSION,
'server_name': plexpy.CONFIG.PMS_NAME,
'server_ip': plexpy.CONFIG.PMS_IP,
'server_port': plexpy.CONFIG.PMS_PORT,
'server_url': plexpy.CONFIG.PMS_URL,
'server_platform': plexpy.CONFIG.PMS_PLATFORM,
'server_version': plexpy.CONFIG.PMS_VERSION,
'server_machine_id': plexpy.CONFIG.PMS_IDENTIFIER,
'action': notify_action.lstrip('on_'),
'datestamp': arrow.now().format(date_format),
'timestamp': arrow.now().format(time_format),
'unixtime': int(time.time()),
# Plex Media Server update parameters
'update_version': pms_download_info.get('version',''),
'update_url': pms_download_info.get('download_url',''),
'update_release_date': arrow.get(pms_download_info.get('release_date','')).format(date_format)
if pms_download_info.get('release_date','') else '',
'update_channel': 'Plex Pass' if plexpy.CONFIG.PMS_UPDATE_CHANNEL == 'plexpass' else 'Public',
'update_platform': pms_download_info.get('platform',''),
'update_distro': pms_download_info.get('distro',''),
'update_distro_build': pms_download_info.get('build',''),
'update_requirements': pms_download_info.get('requirements',''),
'update_extra_info': pms_download_info.get('extra_info',''),
'update_changelog_added': pms_download_info.get('changelog_added',''),
'update_changelog_fixed': pms_download_info.get('changelog_fixed',''),
'update_version': pms_download_info['version'],
'update_url': pms_download_info['download_url'],
'update_release_date': arrow.get(pms_download_info['release_date']).format(date_format)
if pms_download_info['release_date'] else '',
'update_channel': 'Beta' if update_channel == 'beta' else 'Public',
'update_platform': pms_download_info['platform'],
'update_distro': pms_download_info['distro'],
'update_distro_build': pms_download_info['build'],
'update_requirements': pms_download_info['requirements'],
'update_extra_info': pms_download_info['extra_info'],
'update_changelog_added': pms_download_info['changelog_added'],
'update_changelog_fixed': pms_download_info['changelog_fixed'],
# Tautulli update parameters
'plexpy_update_version': plexpy_download_info.get('tag_name', ''),
'plexpy_update_tar': plexpy_download_info.get('tarball_url', ''),
'plexpy_update_zip': plexpy_download_info.get('zipball_url', ''),
'plexpy_update_commit': kwargs.pop('plexpy_update_commit', ''),
'plexpy_update_behind': kwargs.pop('plexpy_update_behind', ''),
'plexpy_update_changelog': plexpy_download_info.get('body', '')
'tautulli_update_version': plexpy_download_info['tag_name'],
'tautulli_update_tar': plexpy_download_info['tarball_url'],
'tautulli_update_zip': plexpy_download_info['zipball_url'],
'tautulli_update_commit': kwargs.pop('plexpy_update_commit', ''),
'tautulli_update_behind': kwargs.pop('plexpy_update_behind', ''),
'tautulli_update_changelog': plexpy_download_info['body']
}

return available_params
@@ -930,7 +940,7 @@ def build_notify_text(subject='', body='', notify_action=None, parameters=None,
try:
script_args = [custom_formatter.format(unicode(arg), **parameters) for arg in subject.split()]
except LookupError as e:
logger.error(u"Tautulli NotificationHandler :: Unable to parse field %s in script argument. Using fallback." % e)
logger.error(u"Tautulli NotificationHandler :: Unable to parse parameter %s in script argument. Using fallback." % e)
script_args = []
except Exception as e:
logger.error(u"Tautulli NotificationHandler :: Unable to parse custom script arguments: %s. Using fallback." % e)
@@ -941,7 +951,7 @@ def build_notify_text(subject='', body='', notify_action=None, parameters=None,
try:
subject = custom_formatter.format(unicode(subject), **parameters)
except LookupError as e:
logger.error(u"Tautulli NotificationHandler :: Unable to parse field %s in notification subject. Using fallback." % e)
logger.error(u"Tautulli NotificationHandler :: Unable to parse parameter %s in notification subject. Using fallback." % e)
subject = unicode(default_subject).format(**parameters)
except Exception as e:
logger.error(u"Tautulli NotificationHandler :: Unable to parse custom notification subject: %s. Using fallback." % e)
@@ -950,7 +960,7 @@ def build_notify_text(subject='', body='', notify_action=None, parameters=None,
try:
body = custom_formatter.format(unicode(body), **parameters)
except LookupError as e:
logger.error(u"Tautulli NotificationHandler :: Unable to parse field %s in notification body. Using fallback." % e)
logger.error(u"Tautulli NotificationHandler :: Unable to parse parameter %s in notification body. Using fallback." % e)
body = unicode(default_body).format(**parameters)
except Exception as e:
logger.error(u"Tautulli NotificationHandler :: Unable to parse custom notification body: %s. Using fallback." % e)
@@ -60,8 +60,11 @@ import logger
import mobile_app
import pmsconnect
import request
import users
from plexpy.config import _BLACKLIST_KEYS, _WHITELIST_KEYS
from plexpy.helpers import checked


BROWSER_NOTIFIERS = {}


AGENT_IDS = {'growl': 0,
@@ -551,6 +554,10 @@ def set_notifier_config(notifier_id=None, agent_id=None, **kwargs):
db.upsert(table_name='notifiers', key_dict=keys, value_dict=values)
logger.info(u"Tautulli Notifiers :: Updated notification agent: %s (notifier_id %s)." % (agent['label'], notifier_id))
blacklist_logger()

if agent['name'] == 'browser':
check_browser_enabled()

return True
except Exception as e:
logger.warn(u"Tautulli Notifiers :: Unable to update notification agent: %s." % e)
@@ -624,9 +631,9 @@ class PrettyMetadata(object):
poster_url = self.parameters['poster_url']
if not poster_url:
if self.media_type in ('artist', 'album', 'track'):
poster_url = 'https://raw.githubusercontent.com/%s/plexpy/master/data/interfaces/default/images/cover.png' % plexpy.CONFIG.GIT_USER
poster_url = 'http://tautulli.com/images/cover.png'
else:
poster_url = 'https://raw.githubusercontent.com/%s/plexpy/master/data/interfaces/default/images/poster.png' % plexpy.CONFIG.GIT_USER
poster_url = 'http://tautulli.com/images/poster.png'
return poster_url

def get_provider_name(self, provider):
@@ -715,6 +722,17 @@ class Notifier(object):
return new_config

def notify(self, subject='', body='', action='', **kwargs):
if self.NAME != 'Script':
if not subject and self.config.get('incl_subject', True):
logger.error(u"Tautulli Notifiers :: %s notification subject cannot be blank." % self.NAME)
return
elif not body:
logger.error(u"Tautulli Notifiers :: %s notification body cannot be blank." % self.NAME)
return

return self.agent_notify(subject=subject, body=body, action=action, **kwargs)

def agent_notify(self, subject='', body='', action='', **kwargs):
pass

def make_request(self, url, method='POST', **kwargs):
@@ -755,10 +773,7 @@ class ANDROIDAPP(Notifier):

_ONESIGNAL_APP_ID = '3b4b666a-d557-4b92-acdf-e2c8c4b95357'

def notify(self, subject='', body='', action='', notification_id=None, **kwargs):
if not subject or not body:
return

def agent_notify(self, subject='', body='', action='', notification_id=None, **kwargs):
# Check mobile device is still registered
device = mobile_app.get_mobile_devices(device_id=self.config['device_id'])
if not device:
@@ -919,10 +934,7 @@ class BOXCAR(Notifier):
'sound': ''
}

def notify(self, subject='', body='', action='', **kwargs):
if not subject or not body:
return

def agent_notify(self, subject='', body='', action='', **kwargs):
data = {'user_credentials': self.config['token'],
'notification[title]': subject.encode('utf-8'),
'notification[long_message]': body.encode('utf-8'),
@@ -989,43 +1001,15 @@ class BROWSER(Notifier):
Browser notifications
"""
NAME = 'Browser'
_DEFAULT_CONFIG = {'enabled': 0,
'auto_hide_delay': 5
_DEFAULT_CONFIG = {'auto_hide_delay': 5
}

def notify(self, subject='', body='', action='', **kwargs):
if not subject or not body:
return

def agent_notify(self, subject='', body='', action='', **kwargs):
logger.info(u"Tautulli Notifiers :: {name} notification sent.".format(name=self.NAME))
return True

def get_notifications(self):
if not self.config['enabled']:
return

db = database.MonitorDatabase()
result = db.select('SELECT subject_text, body_text FROM notify_log '
'WHERE agent_id = 17 AND timestamp >= ? ',
args=[time.time() - 3])

notifications = []
for item in result:
notification = {'subject_text': item['subject_text'],
'body_text': item['body_text'],
'delay': self.config['auto_hide_delay']}
notifications.append(notification)

return {'notifications': notifications}

def return_config_options(self):
config_option = [{'label': 'Enable Browser Notifications',
'value': self.config['enabled'],
'name': 'browser_enabled',
'description': 'Enable to display desktop notifications from your browser.',
'input_type': 'checkbox'
},
{'label': 'Allow Notifications',
config_option = [{'label': 'Allow Notifications',
'value': 'Allow Notifications',
'name': 'browser_allow_browser',
'description': 'Click to allow browser notifications. You must click this button for each browser.',
@@ -1063,10 +1047,7 @@ class DISCORD(Notifier):
'music_provider': ''
}

def notify(self, subject='', body='', action='', **kwargs):
if not subject or not body:
return

def agent_notify(self, subject='', body='', action='', **kwargs):
if self.config['incl_subject']:
text = subject.encode('utf-8') + '\r\n' + body.encode("utf-8")
else:
@@ -1179,7 +1160,8 @@ class DISCORD(Notifier):
{'label': 'Include Rich Metadata Info',
'value': self.config['incl_card'],
'name': 'discord_incl_card',
'description': 'Include an info card with a poster and metadata with the notifications.',
'description': 'Include an info card with a poster and metadata with the notifications.<br>'
'Imgur upload may need to be enabled under the notifications settings tab.',
'input_type': 'checkbox'
},
{'label': 'Include Plot Summaries',
@@ -1203,16 +1185,16 @@ class DISCORD(Notifier):
{'label': 'Movie Link Source',
'value': self.config['movie_provider'],
'name': 'discord_movie_provider',
'description': 'Select the source for movie links on the info cards. Leave blank for default.<br> \
3rd party API lookup may need to be enabled under the notification settings tab.',
'description': 'Select the source for movie links on the info cards. Leave blank for default.<br>'
'3rd party API lookup may need to be enabled under the notifications settings tab.',
'input_type': 'select',
'select_options': PrettyMetadata().get_movie_providers()
},
{'label': 'TV Show Link Source',
'value': self.config['tv_provider'],
'name': 'discord_tv_provider',
'description': 'Select the source for tv show links on the info cards. Leave blank for default.<br> \
3rd party API lookup may need to be enabled under the notification settings tab.',
'description': 'Select the source for tv show links on the info cards. Leave blank for default.<br>'
'3rd party API lookup may need to be enabled under the notifications settings tab.',
'input_type': 'select',
'select_options': PrettyMetadata().get_tv_providers()
},
@@ -1246,27 +1228,31 @@ class EMAIL(Notifier):
'html_support': 1
}

def notify(self, subject='', body='', action='', **kwargs):
if not subject or not body:
return
def __init__(self, config=None):
super(EMAIL, self).__init__(config=config)

if not isinstance(self.config['to'], list):
self.config['to'] = [x.strip() for x in self.config['to'].split(';')]
if not isinstance(self.config['cc'], list):
self.config['cc'] = [x.strip() for x in self.config['cc'].split(';')]
if not isinstance(self.config['bcc'], list):
self.config['bcc'] = [x.strip() for x in self.config['bcc'].split(';')]

def agent_notify(self, subject='', body='', action='', **kwargs):
if self.config['html_support']:
body = body.replace('\n', '<br />')
msg = MIMEMultipart('alternative')
msg.attach(MIMEText(bleach.clean(body, strip=True), 'plain', 'utf-8'))
msg.attach(MIMEText(body, 'html', 'utf-8'))
else:
msg = MIMEText(body, 'plain', 'utf-8')

msg['Date'] = email.utils.formatdate(localtime=True)
msg['Subject'] = subject
msg['From'] = email.utils.formataddr((self.config['from_name'], self.config['from']))
msg['To'] = self.config['to']
msg['CC'] = self.config['cc']
msg['To'] = ','.join(self.config['to'])
msg['CC'] = ','.join(self.config['cc'])

recipients = [x.strip() for x in self.config['to'].split(';')] \
+ [x.strip() for x in self.config['cc'].split(';')] \
+ [x.strip() for x in self.config['bcc'].split(';')]
recipients = filter(None, recipients)
recipients = self.config['to'] + self.config['cc'] + self.config['bcc']

try:
mailserver = smtplib.SMTP(self.config['smtp_server'], self.config['smtp_port'])
@@ -1289,7 +1275,26 @@ class EMAIL(Notifier):
logger.error(u"Tautulli Notifiers :: {name} notification failed: {e}".format(name=self.NAME, e=e))
return False

def get_user_emails(self):
emails = {u['email']: u['friendly_name'] for u in users.Users().get_users() if u['email']}

user_emails_to = {v: '' for v in self.config['to']}
user_emails_cc = {v: '' for v in self.config['cc']}
user_emails_bcc = {v: '' for v in self.config['bcc']}

user_emails_to.update(emails)
user_emails_cc.update(emails)
user_emails_bcc.update(emails)

user_emails_to = [{'value': k, 'text': v} for k, v in user_emails_to.iteritems()]
user_emails_cc = [{'value': k, 'text': v} for k, v in user_emails_cc.iteritems()]
user_emails_bcc = [{'value': k, 'text': v} for k, v in user_emails_bcc.iteritems()]

return user_emails_to, user_emails_cc, user_emails_bcc

def return_config_options(self):
user_emails_to, user_emails_cc, user_emails_bcc = self.get_user_emails()

config_option = [{'label': 'From Name',
'value': self.config['from_name'],
'name': 'email_from_name',
@@ -1305,20 +1310,23 @@ class EMAIL(Notifier):
|
||||
{'label': 'To',
|
||||
'value': self.config['to'],
|
||||
'name': 'email_to',
|
||||
'description': 'The email address(es) of the recipients, separated by semicolons (;).',
|
||||
'input_type': 'text'
|
||||
'description': 'The email address(es) of the recipients.',
|
||||
'input_type': 'selectize',
|
||||
'select_options': user_emails_to
|
||||
},
|
||||
{'label': 'CC',
|
||||
'value': self.config['cc'],
|
||||
'name': 'email_cc',
|
||||
'description': 'The email address(es) to CC, separated by semicolons (;).',
|
||||
'input_type': 'text'
|
||||
'description': 'The email address(es) to CC.',
|
||||
'input_type': 'selectize',
|
||||
'select_options': user_emails_cc
|
||||
},
|
||||
{'label': 'BCC',
|
||||
'value': self.config['bcc'],
|
||||
'name': 'email_bcc',
|
||||
'description': 'The email address(es) to BCC, separated by semicolons (;).',
|
||||
'input_type': 'text'
|
||||
'description': 'The email address(es) to BCC.',
|
||||
'input_type': 'selectize',
|
||||
'select_options': user_emails_bcc
|
||||
},
|
||||
{'label': 'SMTP Server',
|
||||
'value': self.config['smtp_server'],
|
||||
@@ -1353,8 +1361,7 @@ class EMAIL(Notifier):
|
||||
{'label': 'Enable HTML Support',
|
||||
'value': self.config['html_support'],
|
||||
'name': 'email_html_support',
|
||||
'description': 'Style your messages using HTML tags. '
|
||||
'Line breaks (<br>) will be inserted automatically.',
|
||||
'description': 'Style your messages using HTML tags.',
|
||||
'input_type': 'checkbox'
|
||||
}
|
||||
]
|
||||
@@ -1440,10 +1447,7 @@ class FACEBOOK(Notifier):
|
||||
logger.error(u"Tautulli Notifiers :: Error sending {name} post: No {name} Group ID provided.".format(name=self.NAME))
|
||||
return False
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
if self.config['incl_subject']:
|
||||
text = subject.encode('utf-8') + '\r\n' + body.encode("utf-8")
|
||||
else:
|
||||
@@ -1470,17 +1474,17 @@ class FACEBOOK(Notifier):
|
||||
|
||||
def return_config_options(self):
|
||||
config_option = [{'label': 'Instructions',
|
||||
'description': 'Step 1: Visit <a href="' + helpers.anon_url('https://developers.facebook.com/apps') + '" target="_blank"> \
|
||||
Facebook Developers</a> to add a new app using <strong>basic setup</strong>.<br>\
|
||||
Step 2: Click <strong>Add Product</strong> on the left, then <strong>Get Started</strong> \
|
||||
for <strong>Facebook Login</strong>.<br>\
|
||||
Step 3: Fill in <strong>Valid OAuth redirect URIs</strong> with your Tautulli URL (e.g. http://localhost:8181).<br>\
|
||||
Step 4: Click <strong>App Review</strong> on the left and toggle "make public" to <strong>Yes</strong>.<br>\
|
||||
Step 5: Fill in the <strong>Tautulli URL</strong> below with the exact same URL from Step 3.<br>\
|
||||
Step 6: Fill in the <strong>App ID</strong> and <strong>App Secret</strong> below.<br>\
|
||||
Step 7: Click the <strong>Request Authorization</strong> button below to retrieve your access token.<br>\
|
||||
Step 8: Fill in your <strong>Access Token</strong> below if it is not filled in automatically.<br>\
|
||||
Step 9: Fill in your <strong>Group ID</strong> number below. It can be found in the URL of your group page.',
|
||||
'description': 'Step 1: Visit <a href="' + helpers.anon_url('https://developers.facebook.com/apps') + '" target="_blank">'
|
||||
'Facebook Developers</a> to add a new app using <strong>basic setup</strong>.<br>'
|
||||
'Step 2: Click <strong>Add Product</strong> on the left, then <strong>Get Started</strong>'
|
||||
'for <strong>Facebook Login</strong>.<br>'
|
||||
'Step 3: Fill in <strong>Valid OAuth redirect URIs</strong> with your Tautulli URL (e.g. http://localhost:8181).<br>'
|
||||
'Step 4: Click <strong>App Review</strong> on the left and toggle "make public" to <strong>Yes</strong>.<br>'
|
||||
'Step 5: Fill in the <strong>Tautulli URL</strong> below with the exact same URL from Step 3.<br>'
|
||||
'Step 6: Fill in the <strong>App ID</strong> and <strong>App Secret</strong> below.<br>'
|
||||
'Step 7: Click the <strong>Request Authorization</strong> button below to retrieve your access token.<br>'
|
||||
'Step 8: Fill in your <strong>Access Token</strong> below if it is not filled in automatically.<br>'
|
||||
'Step 9: Fill in your <strong>Group ID</strong> number below. It can be found in the URL of your group page.',
|
||||
'input_type': 'help'
|
||||
},
|
||||
{'label': 'Tautulli URL',
|
||||
@@ -1529,22 +1533,23 @@ class FACEBOOK(Notifier):
|
||||
{'label': 'Include Rich Metadata Info',
|
||||
'value': self.config['incl_card'],
|
||||
'name': 'facebook_incl_card',
|
||||
'description': 'Include an info card with a poster and metadata with the notifications.',
|
||||
'description': 'Include an info card with a poster and metadata with the notifications.<br>'
|
||||
'Imgur upload may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'checkbox'
|
||||
},
|
||||
{'label': 'Movie Link Source',
|
||||
'value': self.config['movie_provider'],
|
||||
'name': 'facebook_movie_provider',
|
||||
'description': 'Select the source for movie links on the info cards. Leave blank for default.<br> \
|
||||
3rd party API lookup may need to be enabled under the notification settings tab.',
|
||||
'description': 'Select the source for movie links on the info cards. Leave blank for default.<br>'
|
||||
'3rd party API lookup may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'select',
|
||||
'select_options': PrettyMetadata().get_movie_providers()
|
||||
},
|
||||
{'label': 'TV Show Link Source',
|
||||
'value': self.config['tv_provider'],
|
||||
'name': 'facebook_tv_provider',
|
||||
'description': 'Select the source for tv show links on the info cards. Leave blank for default.<br> \
|
||||
3rd party API lookup may need to be enabled under the notification settings tab.',
|
||||
'description': 'Select the source for tv show links on the info cards. Leave blank for default.<br>'
|
||||
'3rd party API lookup may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'select',
|
||||
'select_options': PrettyMetadata().get_tv_providers()
|
||||
},
|
||||
@@ -1571,10 +1576,7 @@ class GROUPME(Notifier):
|
||||
'incl_poster': 0
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
data = {'bot_id': self.config['bot_id']}
|
||||
|
||||
if self.config['incl_subject']:
|
||||
@@ -1649,10 +1651,7 @@ class GROWL(Notifier):
|
||||
'password': ''
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
# Split host and port
|
||||
if self.config['host'] == "":
|
||||
host, port = "localhost", 23053
|
||||
@@ -1692,7 +1691,7 @@ class GROWL(Notifier):
|
||||
|
||||
# Send it, including an image
|
||||
image_file = os.path.join(str(plexpy.PROG_DIR),
|
||||
"data/interfaces/default/images/logo.png")
|
||||
"data/interfaces/default/images/logo-circle.png")
|
||||
|
||||
with open(image_file, 'rb') as f:
|
||||
image = f.read()
|
||||
@@ -1745,10 +1744,7 @@ class HIPCHAT(Notifier):
|
||||
'music_provider': ''
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
data = {'notify': 'false'}
|
||||
|
||||
text = body.encode('utf-8')
|
||||
@@ -1858,6 +1854,7 @@ class HIPCHAT(Notifier):
|
||||
'value': self.config['incl_card'],
|
||||
'name': 'hipchat_incl_card',
|
||||
'description': 'Include an info card with a poster and metadata with the notifications.<br>'
|
||||
'Imgur upload may need to be enabled under the notifications settings tab.<br>'
|
||||
'Note: This will change the notification type to HTML and emoticons will no longer work.',
|
||||
'input_type': 'checkbox'
|
||||
},
|
||||
@@ -1876,16 +1873,16 @@ class HIPCHAT(Notifier):
|
||||
{'label': 'Movie Link Source',
|
||||
'value': self.config['movie_provider'],
|
||||
'name': 'hipchat_movie_provider',
|
||||
'description': 'Select the source for movie links on the info cards. Leave blank for default.<br> \
|
||||
3rd party API lookup may need to be enabled under the notification settings tab.',
|
||||
'description': 'Select the source for movie links on the info cards. Leave blank for default.<br>'
|
||||
'3rd party API lookup may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'select',
|
||||
'select_options': PrettyMetadata().get_movie_providers()
|
||||
},
|
||||
{'label': 'TV Show Link Source',
|
||||
'value': self.config['tv_provider'],
|
||||
'name': 'hipchat_tv_provider',
|
||||
'description': 'Select the source for tv show links on the info cards. Leave blank for default.<br> \
|
||||
3rd party API lookup may need to be enabled under the notification settings tab.',
|
||||
'description': 'Select the source for tv show links on the info cards. Leave blank for default.<br>'
|
||||
'3rd party API lookup may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'select',
|
||||
'select_options': PrettyMetadata().get_tv_providers()
|
||||
},
|
||||
@@ -1910,10 +1907,7 @@ class IFTTT(Notifier):
|
||||
'event': 'plexpy'
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
event = unicode(self.config['event']).format(action=action)
|
||||
|
||||
data = {'value1': subject.encode("utf-8"),
|
||||
@@ -1953,23 +1947,50 @@ class JOIN(Notifier):
|
||||
"""
|
||||
NAME = 'Join'
|
||||
_DEFAULT_CONFIG = {'api_key': '',
|
||||
'device_id': '',
|
||||
'incl_subject': 1
|
||||
'device_names': '',
|
||||
'priority': 2,
|
||||
'incl_subject': 1,
|
||||
'incl_poster': 0,
|
||||
'movie_provider': '',
|
||||
'tv_provider': '',
|
||||
'music_provider': ''
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
def __init__(self, config=None):
|
||||
super(JOIN, self).__init__(config=config)
|
||||
|
||||
deviceid_key = 'deviceId%s' % ('s' if len(self.config['device_id'].split(',')) > 1 else '')
|
||||
if not isinstance(self.config['device_names'], list):
|
||||
self.config['device_names'] = [x.strip() for x in self.config['device_names'].split(',')]
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
data = {'apikey': self.config['api_key'],
|
||||
deviceid_key: self.config['device_id'],
|
||||
'deviceNames': ','.join(self.config['device_names']),
|
||||
'text': body.encode("utf-8")}
|
||||
|
||||
if self.config['incl_subject']:
|
||||
data['title'] = subject.encode("utf-8")
|
||||
|
||||
if kwargs.get('parameters', {}).get('media_type'):
|
||||
# Grab formatted metadata
|
||||
pretty_metadata = PrettyMetadata(kwargs['parameters'])
|
||||
|
||||
poster_url = pretty_metadata.get_poster_url()
|
||||
if poster_url and self.config['incl_poster']:
|
||||
data['icon'] = poster_url
|
||||
|
||||
if pretty_metadata.media_type == 'movie':
|
||||
provider = self.config['movie_provider']
|
||||
elif pretty_metadata.media_type in ('show', 'season', 'episode'):
|
||||
provider = self.config['tv_provider']
|
||||
elif pretty_metadata.media_type in ('artist', 'album', 'track'):
|
||||
provider = self.config['music_provider']
|
||||
else:
|
||||
provider = None
|
||||
|
||||
provider_link = pretty_metadata.get_provider_link(provider)
|
||||
if provider_link:
|
||||
data['url'] = provider_link
|
||||
|
||||
r = requests.post('https://joinjoaomgcd.appspot.com/_ah/api/messaging/v1/sendPush', params=data)
|
||||
|
||||
if r.status_code == 200:
|
||||
@@ -1987,6 +2008,9 @@ class JOIN(Notifier):
|
||||
return False
|
||||
|
||||
def get_devices(self):
|
||||
devices = {d: d for d in self.config['device_names']}
|
||||
devices.update({'': ''})
|
||||
|
||||
if self.config['api_key']:
|
||||
params = {'apikey': self.config['api_key']}
|
||||
|
||||
@@ -1995,28 +2019,22 @@ class JOIN(Notifier):
|
||||
if r.status_code == 200:
|
||||
response_data = r.json()
|
||||
if response_data.get('success'):
|
||||
devices = response_data.get('records', [])
|
||||
devices = {d['deviceId']: d['deviceName'] for d in devices}
|
||||
devices.update({'': ''})
|
||||
response_devices = response_data.get('records', [])
|
||||
devices.update({d['deviceName']: d['deviceName'] for d in response_devices})
|
||||
return devices
|
||||
else:
|
||||
error_msg = response_data.get('errorMessage')
|
||||
logger.info(u"Tautulli Notifiers :: Unable to retrieve {name} devices list: {msg}".format(name=self.NAME, msg=error_msg))
|
||||
return {'': ''}
|
||||
return devices
|
||||
else:
|
||||
logger.error(u"Tautulli Notifiers :: Unable to retrieve {name} devices list: [{r.status_code}] {r.reason}".format(name=self.NAME, r=r))
|
||||
logger.debug(u"Tautulli Notifiers :: Request response: {}".format(request.server_message(r, True)))
|
||||
return {'': ''}
|
||||
return devices
|
||||
|
||||
else:
|
||||
return {'': ''}
return devices
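`get_devices()` now seeds the select options with the device names already stored in the config and only overlays the names returned by the Join API when the request succeeds, so a failed lookup no longer wipes the saved selection. A sketch of that fallback merge; `fetch_devices` is a stand-in for the actual Join device-list request:

```python
# Sketch of the device-list fallback in get_devices() above; only the merge logic is shown.
def get_device_options(saved_names, api_key, fetch_devices):
    devices = {name: name for name in saved_names if name}
    devices[''] = ''                      # blank option so nothing is forced
    if not api_key:
        return devices
    records = fetch_devices(api_key)      # expected: list of {'deviceName': ...} dicts
    if records is None:                   # request failed -> keep the saved names only
        return devices
    devices.update({d['deviceName']: d['deviceName'] for d in records})
    return devices


print(get_device_options(['Old Phone'], 'dummy-key',
                         lambda key: [{'deviceName': 'Tablet'}, {'deviceName': 'Old Phone'}]))
```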
def return_config_options(self):
|
||||
devices = '<br>'.join(['%s: <span class="inline-pre">%s</span>'
|
||||
% (v, k) for k, v in self.get_devices().iteritems() if k])
|
||||
if not devices:
|
||||
devices = 'Enter your Join API key to load your device list.'
|
||||
|
||||
config_option = [{'label': 'Join API Key',
|
||||
'value': self.config['api_key'],
|
||||
'name': 'join_api_key',
|
||||
@@ -2024,22 +2042,55 @@ class JOIN(Notifier):
|
||||
'input_type': 'text',
|
||||
'refresh': True
|
||||
},
|
||||
{'label': 'Device ID(s) or Group ID',
|
||||
'value': self.config['device_id'],
|
||||
'name': 'join_device_id',
|
||||
'description': 'Set your Join device ID or group ID. ' \
|
||||
'Separate multiple devices with commas (,).',
|
||||
'input_type': 'text',
|
||||
{'label': 'Device Name(s)',
|
||||
'value': self.config['device_names'],
|
||||
'name': 'join_device_names',
|
||||
'description': 'Select your Join device(s).',
|
||||
'input_type': 'select',
|
||||
'select_options': self.get_devices()
|
||||
},
|
||||
{'label': 'Your Devices IDs',
|
||||
'description': devices,
|
||||
'input_type': 'help'
|
||||
{'label': 'Priority',
|
||||
'value': self.config['priority'],
|
||||
'name': 'join_priority',
|
||||
'description': 'Set the notification priority.',
|
||||
'input_type': 'select',
|
||||
'select_options': {-2: -2, -1: -1, 0: 0, 1: 1, 2: 2}
|
||||
},
|
||||
{'label': 'Include Subject Line',
|
||||
'value': self.config['incl_subject'],
|
||||
'name': 'join_incl_subject',
|
||||
'description': 'Include the subject line with the notifications.',
|
||||
'input_type': 'checkbox'
|
||||
},
|
||||
{'label': 'Include Poster Image',
|
||||
'value': self.config['incl_poster'],
|
||||
'name': 'join_incl_poster',
|
||||
'description': 'Include a poster with the notifications.<br>'
|
||||
'Imgur upload may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'checkbox'
|
||||
},
|
||||
{'label': 'Movie Link Source',
|
||||
'value': self.config['movie_provider'],
|
||||
'name': 'join_movie_provider',
|
||||
'description': 'Select the source for movie links on the info cards. Leave blank for default.<br>'
|
||||
'3rd party API lookup may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'select',
|
||||
'select_options': PrettyMetadata().get_movie_providers()
|
||||
},
|
||||
{'label': 'TV Show Link Source',
|
||||
'value': self.config['tv_provider'],
|
||||
'name': 'join_tv_provider',
|
||||
'description': 'Select the source for tv show links on the info cards. Leave blank for default.<br>'
|
||||
'3rd party API lookup may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'select',
|
||||
'select_options': PrettyMetadata().get_tv_providers()
|
||||
},
|
||||
{'label': 'Music Link Source',
|
||||
'value': self.config['music_provider'],
|
||||
'name': 'join_music_provider',
|
||||
'description': 'Select the source for music links on the info cards. Leave blank for default.',
|
||||
'input_type': 'select',
|
||||
'select_options': PrettyMetadata().get_music_providers()
|
||||
}
|
||||
]
|
||||
|
||||
@@ -2062,10 +2113,7 @@ class MQTT(Notifier):
|
||||
'keep_alive': 60
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
if not self.config['topic']:
|
||||
logger.error(u"Tautulli Notifiers :: MQTT topic not specified.")
|
||||
return
|
||||
@@ -2168,10 +2216,7 @@ class NMA(Notifier):
|
||||
'priority': 0
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
title = 'Tautulli'
|
||||
batch = False
|
||||
|
||||
@@ -2248,7 +2293,7 @@ class OSX(Notifier):
|
||||
def _swizzled_bundleIdentifier(self, original, swizzled):
|
||||
return 'ade.plexpy.osxnotify'
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
|
||||
subtitle = kwargs.get('subtitle', '')
|
||||
sound = kwargs.get('sound', '')
|
||||
@@ -2341,10 +2386,7 @@ class PLEX(Notifier):
|
||||
if response:
|
||||
return response[0]['result']
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
hosts = [x.strip() for x in self.config['hosts'].split(',')]
|
||||
|
||||
if self.config['display_time'] > 0:
|
||||
@@ -2355,7 +2397,7 @@ class PLEX(Notifier):
|
||||
if self.config['image']:
|
||||
image = self.config['image']
|
||||
else:
|
||||
image = os.path.join(plexpy.DATA_DIR, os.path.abspath("data/interfaces/default/images/logo.png"))
|
||||
image = os.path.join(plexpy.DATA_DIR, os.path.abspath("data/interfaces/default/images/logo-circle.png"))
|
||||
|
||||
for host in hosts:
|
||||
logger.info(u"Tautulli Notifiers :: Sending notification command to {name} @ {host}".format(name=self.NAME, host=host))
|
||||
@@ -2427,10 +2469,7 @@ class PROWL(Notifier):
|
||||
'priority': 0
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
data = {'apikey': self.config['key'],
|
||||
'application': 'Tautulli',
|
||||
'event': subject.encode("utf-8"),
|
||||
@@ -2468,10 +2507,7 @@ class PUSHALOT(Notifier):
|
||||
_DEFAULT_CONFIG = {'api_key': ''
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
data = {'AuthorizationToken': self.config['api_key'],
|
||||
'Title': subject.encode('utf-8'),
|
||||
'Body': body.encode("utf-8")}
|
||||
@@ -2502,10 +2538,7 @@ class PUSHBULLET(Notifier):
|
||||
'channel_tag': ''
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
data = {'type': 'note',
|
||||
'title': subject.encode("utf-8"),
|
||||
'body': body.encode("utf-8")}
|
||||
@@ -2587,10 +2620,7 @@ class PUSHOVER(Notifier):
|
||||
'music_provider': ''
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
data = {'token': self.config['api_token'],
|
||||
'user': self.config['key'],
|
||||
'title': subject.encode("utf-8"),
|
||||
@@ -2684,16 +2714,16 @@ class PUSHOVER(Notifier):
|
||||
{'label': 'Movie Link Source',
|
||||
'value': self.config['movie_provider'],
|
||||
'name': 'pushover_movie_provider',
|
||||
'description': 'Select the source for movie links on the info cards. Leave blank for default.<br> \
|
||||
3rd party API lookup may need to be enabled under the notification settings tab.',
|
||||
'description': 'Select the source for movie links on the info cards. Leave blank for default.<br>'
|
||||
'3rd party API lookup may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'select',
|
||||
'select_options': PrettyMetadata().get_movie_providers()
|
||||
},
|
||||
{'label': 'TV Show Link Source',
|
||||
'value': self.config['tv_provider'],
|
||||
'name': 'pushover_tv_provider',
|
||||
'description': 'Select the source for tv show links on the info cards. Leave blank for default.<br> \
|
||||
3rd party API lookup may need to be enabled under the notification settings tab.',
|
||||
'description': 'Select the source for tv show links on the info cards. Leave blank for default.<br>'
|
||||
'3rd party API lookup may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'select',
|
||||
'select_options': PrettyMetadata().get_tv_providers()
|
||||
},
|
||||
@@ -2795,7 +2825,7 @@ class SCRIPTS(Notifier):
|
||||
logger.info(u"Tautulli Notifiers :: Script notification sent.")
|
||||
return True
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
"""
|
||||
Args:
|
||||
subject(string, optional): Subject text,
|
||||
@@ -2907,10 +2937,7 @@ class SLACK(Notifier):
|
||||
'music_provider': ''
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
if self.config['incl_subject']:
|
||||
text = subject.encode('utf-8') + '\r\n' + body.encode("utf-8")
|
||||
else:
|
||||
@@ -3022,7 +3049,8 @@ class SLACK(Notifier):
|
||||
{'label': 'Include Rich Metadata Info',
|
||||
'value': self.config['incl_card'],
|
||||
'name': 'slack_incl_card',
|
||||
'description': 'Include an info card with a poster and metadata with the notifications.',
|
||||
'description': 'Include an info card with a poster and metadata with the notifications.<br>'
|
||||
'Imgur upload may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'checkbox'
|
||||
},
|
||||
{'label': 'Include Plot Summaries',
|
||||
@@ -3046,16 +3074,16 @@ class SLACK(Notifier):
|
||||
{'label': 'Movie Link Source',
|
||||
'value': self.config['movie_provider'],
|
||||
'name': 'slack_movie_provider',
|
||||
'description': 'Select the source for movie links on the info cards. Leave blank for default.<br> \
|
||||
3rd party API lookup may need to be enabled under the notification settings tab.',
|
||||
'description': 'Select the source for movie links on the info cards. Leave blank for default.<br>'
|
||||
'3rd party API lookup may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'select',
|
||||
'select_options': PrettyMetadata().get_movie_providers()
|
||||
},
|
||||
{'label': 'TV Show Link Source',
|
||||
'value': self.config['tv_provider'],
|
||||
'name': 'slack_tv_provider',
|
||||
'description': 'Select the source for tv show links on the info cards. Leave blank for default.<br> \
|
||||
3rd party API lookup may need to be enabled under the notification settings tab.',
|
||||
'description': 'Select the source for tv show links on the info cards. Leave blank for default.<br>'
|
||||
'3rd party API lookup may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'select',
|
||||
'select_options': PrettyMetadata().get_tv_providers()
|
||||
},
|
||||
@@ -3084,10 +3112,7 @@ class TELEGRAM(Notifier):
|
||||
'incl_poster': 0
|
||||
}
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not body or not subject:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
data = {'chat_id': self.config['chat_id']}
|
||||
|
||||
if self.config['incl_subject']:
|
||||
@@ -3150,7 +3175,8 @@ class TELEGRAM(Notifier):
|
||||
{'label': 'Include Poster Image',
|
||||
'value': self.config['incl_poster'],
|
||||
'name': 'telegram_incl_poster',
|
||||
'description': 'Include a poster with the notifications.',
|
||||
'description': 'Include a poster with the notifications.<br>'
|
||||
'Imgur upload may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'checkbox'
|
||||
},
|
||||
{'label': 'Enable HTML Support',
|
||||
@@ -3205,10 +3231,7 @@ class TWITTER(Notifier):
|
||||
logger.error(u"Tautulli Notifiers :: {name} notification failed: {e}".format(name=self.NAME, e=e))
|
||||
return False
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
poster_url = ''
|
||||
if self.config['incl_poster'] and kwargs.get('parameters'):
|
||||
parameters = kwargs['parameters']
|
||||
@@ -3221,12 +3244,12 @@ class TWITTER(Notifier):
|
||||
|
||||
def return_config_options(self):
|
||||
config_option = [{'label': 'Instructions',
|
||||
'description': 'Step 1: Visit <a href="' + helpers.anon_url('https://apps.twitter.com') + '" target="_blank"> \
|
||||
Twitter Apps</a> to <strong>Create New App</strong>. A vaild "Website" is not required.<br>\
|
||||
Step 2: Go to <strong>Keys and Access Tokens</strong> and click \
|
||||
<strong>Create my access token</strong>.<br>\
|
||||
Step 3: Fill in the <strong>Consumer Key</strong>, <strong>Consumer Secret</strong>, \
|
||||
<strong>Access Token</strong>, and <strong>Access Token Secret</strong> below.',
|
||||
'description': 'Step 1: Visit <a href="' + helpers.anon_url('https://apps.twitter.com') + '" target="_blank">'
|
||||
'Twitter Apps</a> to <strong>Create New App</strong>. A vaild "Website" is not required.<br>'
|
||||
'Step 2: Go to <strong>Keys and Access Tokens</strong> and click '
|
||||
'<strong>Create my access token</strong>.<br>'
|
||||
'Step 3: Fill in the <strong>Consumer Key</strong>, <strong>Consumer Secret</strong>, '
|
||||
'<strong>Access Token</strong>, and <strong>Access Token Secret</strong> below.',
|
||||
'input_type': 'help'
|
||||
},
|
||||
{'label': 'Twitter Consumer Key',
|
||||
@@ -3262,7 +3285,8 @@ class TWITTER(Notifier):
|
||||
{'label': 'Include Poster Image',
|
||||
'value': self.config['incl_poster'],
|
||||
'name': 'twitter_incl_poster',
|
||||
'description': 'Include a poster with the notifications.',
|
||||
'description': 'Include a poster with the notifications.<br>'
|
||||
'Imgur upload may need to be enabled under the notifications settings tab.',
|
||||
'input_type': 'checkbox'
|
||||
}
|
||||
]
|
||||
@@ -3305,10 +3329,7 @@ class XBMC(Notifier):
|
||||
if response:
|
||||
return response[0]['result']
|
||||
|
||||
def notify(self, subject='', body='', action='', **kwargs):
|
||||
if not subject or not body:
|
||||
return
|
||||
|
||||
def agent_notify(self, subject='', body='', action='', **kwargs):
|
||||
hosts = [x.strip() for x in self.config['hosts'].split(',')]
|
||||
|
||||
if self.config['display_time'] > 0:
|
||||
@@ -3319,7 +3340,7 @@ class XBMC(Notifier):
|
||||
if self.config['image']:
|
||||
image = self.config['image']
|
||||
else:
|
||||
image = os.path.join(plexpy.DATA_DIR, os.path.abspath("data/interfaces/default/images/logo.png"))
|
||||
image = os.path.join(plexpy.DATA_DIR, os.path.abspath("data/interfaces/default/images/logo-circle.png"))
|
||||
|
||||
for host in hosts:
|
||||
logger.info(u"Tautulli Notifiers :: Sending notification command to XMBC @ " + host)
|
||||
@@ -3504,3 +3525,27 @@ def upgrade_config_to_db():
|
||||
notifier_id = add_notifier_config(agent_id=agent_id)
|
||||
set_notifier_config(notifier_id=notifier_id, agent_id=agent_id, **notifier_config)
|
||||
|
||||
|
||||
def check_browser_enabled():
global BROWSER_NOTIFIERS
BROWSER_NOTIFIERS = {}
for n in get_notifiers():
if n['agent_id'] == 17 and n['active']:
notifier_config = get_notifier_config(n['id'])
BROWSER_NOTIFIERS[n['id']] = notifier_config['config']['auto_hide_delay']


def get_browser_notifications():
db = database.MonitorDatabase()
result = db.select('SELECT notifier_id, subject_text, body_text FROM notify_log '
'WHERE agent_id = 17 AND timestamp >= ? ',
args=[time.time() - 5])

notifications = []
for item in result:
notification = {'subject_text': item['subject_text'],
'body_text': item['body_text'],
'delay': BROWSER_NOTIFIERS.get(item['notifier_id'], 5)}
notifications.append(notification)

return {'notifications': notifications}
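`check_browser_enabled()` caches the auto-hide delay for every active browser notifier (agent id 17), and `get_browser_notifications()` then returns any `notify_log` rows from the last five seconds with that delay attached, defaulting to 5 seconds for unknown notifiers. A runnable sketch of the same query against a throwaway in-memory table:

```python
# Sketch of the recent-notifications lookup above, using a temporary SQLite table.
import sqlite3
import time

BROWSER_DELAYS = {1: 10}          # notifier_id -> auto_hide_delay, as cached above

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE notify_log (notifier_id INTEGER, agent_id INTEGER, '
             'subject_text TEXT, body_text TEXT, timestamp INTEGER)')
conn.execute('INSERT INTO notify_log VALUES (?, ?, ?, ?, ?)',
             (1, 17, 'Tautulli', 'Playback started', int(time.time())))

# Only rows written in the last 5 seconds are returned to the browser poller.
rows = conn.execute('SELECT notifier_id, subject_text, body_text FROM notify_log '
                    'WHERE agent_id = 17 AND timestamp >= ?',
                    (time.time() - 5,)).fetchall()

notifications = [{'subject_text': subject,
                  'body_text': body,
                  'delay': BROWSER_DELAYS.get(notifier_id, 5)}   # 5 s default hide delay
                 for notifier_id, subject, body in rows]
print({'notifications': notifications})
```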
@@ -335,6 +335,7 @@ class PlexTV(object):
|
||||
"thumb": helpers.get_xml_attr(a, 'thumb'),
|
||||
"email": helpers.get_xml_attr(a, 'email'),
|
||||
"is_home_user": helpers.get_xml_attr(a, 'home'),
|
||||
"is_admin": 1,
|
||||
"is_allow_sync": None,
|
||||
"is_restricted": helpers.get_xml_attr(a, 'restricted'),
|
||||
"filter_all": helpers.get_xml_attr(a, 'filterAll'),
|
||||
@@ -357,6 +358,7 @@ class PlexTV(object):
|
||||
"username": helpers.get_xml_attr(a, 'title'),
|
||||
"thumb": helpers.get_xml_attr(a, 'thumb'),
|
||||
"email": helpers.get_xml_attr(a, 'email'),
|
||||
"is_admin": 0,
|
||||
"is_home_user": helpers.get_xml_attr(a, 'home'),
|
||||
"is_allow_sync": helpers.get_xml_attr(a, 'allowSync'),
|
||||
"is_restricted": helpers.get_xml_attr(a, 'restricted'),
|
||||
@@ -377,6 +379,16 @@ class PlexTV(object):
|
||||
if machine_id is None:
machine_id = plexpy.CONFIG.PMS_IDENTIFIER

if isinstance(rating_key_filter, list):
rating_key_filter = [str(k) for k in rating_key_filter]
elif rating_key_filter is not None:
rating_key_filter = [str(rating_key_filter)]

if isinstance(user_id_filter, list):
user_id_filter = [str(k) for k in user_id_filter]
elif user_id_filter is not None:
user_id_filter = [str(user_id_filter)]

sync_list = self.get_plextv_sync_lists(machine_id, output_format='xml')
user_data = users.Users()
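`get_synced_items()` now accepts either a single value or a list for `rating_key_filter` and `user_id_filter`; both are coerced to lists of strings up front so the later checks can use a plain `not in` test. A sketch of that normalization with illustrative values:

```python
# Sketch of the scalar-or-list filter normalization used in get_synced_items() above.
def normalize_filter(value):
    """Return a list of strings, or None if no filter was given."""
    if value is None:
        return None
    if isinstance(value, list):
        return [str(v) for v in value]
    return [str(value)]


def matches(value, filter_values):
    # An empty/None filter means "accept everything".
    return not filter_values or str(value) in filter_values


rating_key_filter = normalize_filter(12345)          # single key
user_id_filter = normalize_filter([111, 222])        # list of ids
print(matches(12345, rating_key_filter))             # True
print(matches('333', user_id_filter))                # False -> the item would be skipped
```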
@@ -416,7 +428,7 @@ class PlexTV(object):
|
||||
device_last_seen = helpers.get_xml_attr(device, 'lastSeenAt')
|
||||
|
||||
# Filter by user_id
|
||||
if user_id_filter and str(user_id_filter) != device_user_id:
|
||||
if user_id_filter and device_user_id not in user_id_filter:
|
||||
continue
|
||||
|
||||
for synced in a.getElementsByTagName('SyncItems'):
|
||||
@@ -430,7 +442,7 @@ class PlexTV(object):
|
||||
for idx, item in enumerate(clean_uri) if item == 'metadata'), None)
|
||||
|
||||
# Filter by rating_key
|
||||
if rating_key_filter and str(rating_key_filter) != rating_key:
|
||||
if rating_key_filter and rating_key not in rating_key_filter:
|
||||
continue
|
||||
|
||||
sync_id = helpers.get_xml_attr(item, 'id')
|
||||
@@ -459,12 +471,13 @@ class PlexTV(object):
|
||||
status_item_downloaded_count, status_item_count)
|
||||
|
||||
for settings in item.getElementsByTagName('MediaSettings'):
|
||||
settings_audio_boost = helpers.get_xml_attr(settings, 'audioBoost')
|
||||
settings_music_bitrate = helpers.get_xml_attr(settings, 'musicBitrate')
|
||||
settings_photo_quality = helpers.get_xml_attr(settings, 'photoQuality')
|
||||
settings_photo_resolution = helpers.get_xml_attr(settings, 'photoResolution')
|
||||
settings_video_bitrate = helpers.get_xml_attr(settings, 'maxVideoBitrate')
|
||||
settings_video_quality = helpers.get_xml_attr(settings, 'videoQuality')
|
||||
settings_video_resolution = helpers.get_xml_attr(settings, 'videoResolution')
|
||||
settings_audio_boost = helpers.get_xml_attr(settings, 'audioBoost')
|
||||
settings_audio_bitrate = helpers.get_xml_attr(settings, 'musicBitrate')
|
||||
settings_photo_quality = helpers.get_xml_attr(settings, 'photoQuality')
|
||||
settings_photo_resolution = helpers.get_xml_attr(settings, 'photoResolution')
|
||||
|
||||
sync_details = {"device_name": helpers.sanitize(device_name),
|
||||
"platform": helpers.sanitize(device_platform),
|
||||
@@ -481,7 +494,8 @@ class PlexTV(object):
|
||||
"item_complete_count": status_item_complete_count,
|
||||
"item_downloaded_count": status_item_downloaded_count,
|
||||
"item_downloaded_percent_complete": status_item_download_percent_complete,
|
||||
"music_bitrate": settings_music_bitrate,
|
||||
"video_bitrate": settings_video_bitrate,
|
||||
"audio_bitrate": settings_audio_bitrate,
|
||||
"photo_quality": settings_photo_quality,
|
||||
"video_quality": settings_video_quality,
|
||||
"total_size": status_total_size,
|
||||
@@ -639,10 +653,14 @@ class PlexTV(object):
|
||||
|
||||
def get_plex_downloads(self):
|
||||
logger.debug(u"Tautulli PlexTV :: Retrieving current server version.")
|
||||
pmsconnect.PmsConnect().set_server_version()
|
||||
|
||||
logger.debug(u"Tautulli PlexTV :: Plex update channel is %s." % plexpy.CONFIG.PMS_UPDATE_CHANNEL)
|
||||
plex_downloads = self.get_plextv_downloads(plexpass=(plexpy.CONFIG.PMS_UPDATE_CHANNEL == 'plexpass'))
|
||||
pms_connect = pmsconnect.PmsConnect()
|
||||
pms_connect.set_server_version()
|
||||
|
||||
update_channel = pms_connect.get_server_update_channel()
|
||||
|
||||
logger.debug(u"Tautulli PlexTV :: Plex update channel is %s." % update_channel)
|
||||
plex_downloads = self.get_plextv_downloads(plexpass=(update_channel == 'beta'))
|
||||
|
||||
try:
|
||||
available_downloads = json.loads(plex_downloads)
|
||||
|
@@ -13,6 +13,9 @@
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with Tautulli. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
import json
|
||||
import os
|
||||
import time
|
||||
import urllib
|
||||
|
||||
import plexpy
|
||||
@@ -195,7 +198,6 @@ class PmsConnect(object):
|
||||
"""
|
||||
uri = '/hubs/metadata/' + rating_key + '/related'
|
||||
request = self.request_handler.make_request(uri=uri,
|
||||
proto=self.protocol,
|
||||
request_type='GET',
|
||||
output_format=output_format)
|
||||
|
||||
@@ -520,7 +522,7 @@ class PmsConnect(object):
|
||||
|
||||
return output
|
||||
|
||||
def get_metadata_details(self, rating_key='', sync_id=''):
|
||||
def get_metadata_details(self, rating_key='', sync_id='', cache_key=None):
|
||||
"""
|
||||
Return processed and validated metadata list for requested item.
|
||||
|
||||
@@ -528,42 +530,61 @@ class PmsConnect(object):
|
||||
|
||||
Output: array
|
||||
"""
|
||||
metadata = {}
|
||||
|
||||
if cache_key:
|
||||
in_file_path = os.path.join(plexpy.CONFIG.CACHE_DIR, 'metadata-sessionKey-%s.json' % cache_key)
|
||||
try:
|
||||
with open(in_file_path, 'r') as inFile:
|
||||
metadata = json.load(inFile)
|
||||
except (IOError, ValueError) as e:
|
||||
pass
|
||||
|
||||
if metadata:
|
||||
_cache_time = metadata.pop('_cache_time', 0)
|
||||
# Return cached metadata if less than METADATA_CACHE_SECONDS ago
|
||||
if int(time.time()) - _cache_time <= plexpy.CONFIG.METADATA_CACHE_SECONDS:
|
||||
return metadata
|
||||
|
||||
if rating_key:
|
||||
metadata = self.get_metadata(str(rating_key), output_format='xml')
|
||||
metadata_xml = self.get_metadata(str(rating_key), output_format='xml')
|
||||
elif sync_id:
|
||||
metadata = self.get_sync_item(str(sync_id), output_format='xml')
|
||||
metadata_xml = self.get_sync_item(str(sync_id), output_format='xml')
|
||||
|
||||
try:
|
||||
xml_head = metadata.getElementsByTagName('MediaContainer')
|
||||
xml_head = metadata_xml.getElementsByTagName('MediaContainer')
|
||||
except Exception as e:
|
||||
logger.warn(u"Tautulli Pmsconnect :: Unable to parse XML for get_metadata_details: %s." % e)
|
||||
return {}
|
||||
|
||||
metadata = {}
|
||||
|
||||
for a in xml_head:
|
||||
if a.getAttribute('size'):
|
||||
if a.getAttribute('size') != '1':
|
||||
if a.getAttribute('size') == '0':
|
||||
return metadata
|
||||
|
||||
if a.getElementsByTagName('Directory'):
|
||||
metadata_main = a.getElementsByTagName('Directory')[0]
|
||||
metadata_type = helpers.get_xml_attr(metadata_main, 'type')
|
||||
if metadata_type == 'photo':
|
||||
metadata_type = 'photo_album'
|
||||
metadata_main_list = a.getElementsByTagName('Directory')
|
||||
elif a.getElementsByTagName('Video'):
|
||||
metadata_main = a.getElementsByTagName('Video')[0]
|
||||
metadata_type = helpers.get_xml_attr(metadata_main, 'type')
|
||||
metadata_main_list = a.getElementsByTagName('Video')
|
||||
elif a.getElementsByTagName('Track'):
|
||||
metadata_main = a.getElementsByTagName('Track')[0]
|
||||
metadata_type = helpers.get_xml_attr(metadata_main, 'type')
|
||||
metadata_main_list = a.getElementsByTagName('Track')
|
||||
elif a.getElementsByTagName('Photo'):
|
||||
metadata_main = a.getElementsByTagName('Photo')[0]
|
||||
metadata_type = helpers.get_xml_attr(metadata_main, 'type')
|
||||
metadata_main_list = a.getElementsByTagName('Photo')
|
||||
else:
|
||||
logger.debug(u"Tautulli Pmsconnect :: Metadata failed")
|
||||
return {}
|
||||
|
||||
if sync_id and len(metadata_main_list) > 1:
for metadata_main in metadata_main_list:
if helpers.get_xml_attr(metadata_main, 'ratingKey') == rating_key:
break
else:
metadata_main = metadata_main_list[0]

metadata_type = helpers.get_xml_attr(metadata_main, 'type')
if metadata_type == 'photo':
metadata_type = 'photo_album'

section_id = helpers.get_xml_attr(a, 'librarySectionID')
library_name = helpers.get_xml_attr(a, 'librarySectionTitle')
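When a sync id resolves to several items, the loop above keeps the element whose `ratingKey` matches, and the `for ... else` clause falls back to the first element when nothing matches. A compact illustration of that pattern with dummy data:

```python
# Sketch of the for/else selection used above to pick the matching synced item.
items = [{'ratingKey': '100', 'title': 'Episode 1'},
         {'ratingKey': '101', 'title': 'Episode 2'}]

wanted = '101'
for item in items:
    if item['ratingKey'] == wanted:
        break            # keep this item
else:
    item = items[0]      # no match: fall back to the first entry

print(item['title'])     # Episode 2
```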
@@ -572,6 +593,7 @@ class PmsConnect(object):
|
||||
actors = []
|
||||
genres = []
|
||||
labels = []
|
||||
collections = []
|
||||
|
||||
if metadata_main.getElementsByTagName('Director'):
|
||||
for director in metadata_main.getElementsByTagName('Director'):
|
||||
@@ -593,6 +615,10 @@ class PmsConnect(object):
|
||||
for label in metadata_main.getElementsByTagName('Label'):
|
||||
labels.append(helpers.get_xml_attr(label, 'tag'))
|
||||
|
||||
if metadata_main.getElementsByTagName('Collection'):
|
||||
for collection in metadata_main.getElementsByTagName('Collection'):
|
||||
collections.append(helpers.get_xml_attr(collection, 'tag'))
|
||||
|
||||
if metadata_type == 'movie':
|
||||
metadata = {'media_type': metadata_type,
|
||||
'section_id': section_id,
|
||||
@@ -630,6 +656,7 @@ class PmsConnect(object):
|
||||
'actors': actors,
|
||||
'genres': genres,
|
||||
'labels': labels,
|
||||
'collections': collections,
|
||||
'full_title': helpers.get_xml_attr(metadata_main, 'title')
|
||||
}
|
||||
|
||||
@@ -670,6 +697,7 @@ class PmsConnect(object):
|
||||
'actors': actors,
|
||||
'genres': genres,
|
||||
'labels': labels,
|
||||
'collections': collections,
|
||||
'full_title': helpers.get_xml_attr(metadata_main, 'title')
|
||||
}
|
||||
|
||||
@@ -712,6 +740,7 @@ class PmsConnect(object):
|
||||
'actors': show_details['actors'],
|
||||
'genres': show_details['genres'],
|
||||
'labels': show_details['labels'],
|
||||
'collections': show_details['collections'],
|
||||
'full_title': u'{} - {}'.format(helpers.get_xml_attr(metadata_main, 'parentTitle'),
|
||||
helpers.get_xml_attr(metadata_main, 'title'))
|
||||
}
|
||||
@@ -755,6 +784,7 @@ class PmsConnect(object):
|
||||
'actors': show_details['actors'],
|
||||
'genres': show_details['genres'],
|
||||
'labels': show_details['labels'],
|
||||
'collections': show_details['collections'],
|
||||
'full_title': u'{} - {}'.format(helpers.get_xml_attr(metadata_main, 'grandparentTitle'),
|
||||
helpers.get_xml_attr(metadata_main, 'title'))
|
||||
}
|
||||
@@ -796,6 +826,7 @@ class PmsConnect(object):
|
||||
'actors': actors,
|
||||
'genres': genres,
|
||||
'labels': labels,
|
||||
'collections': collections,
|
||||
'full_title': helpers.get_xml_attr(metadata_main, 'title')
|
||||
}
|
||||
|
||||
@@ -838,6 +869,7 @@ class PmsConnect(object):
|
||||
'actors': actors,
|
||||
'genres': genres,
|
||||
'labels': labels,
|
||||
'collections': collections,
|
||||
'full_title': u'{} - {}'.format(helpers.get_xml_attr(metadata_main, 'parentTitle'),
|
||||
helpers.get_xml_attr(metadata_main, 'title'))
|
||||
}
|
||||
@@ -881,6 +913,7 @@ class PmsConnect(object):
|
||||
'actors': actors,
|
||||
'genres': album_details['genres'],
|
||||
'labels': album_details['labels'],
|
||||
'collections': album_details['collections'],
|
||||
'full_title': u'{} - {}'.format(helpers.get_xml_attr(metadata_main, 'grandparentTitle'),
|
||||
helpers.get_xml_attr(metadata_main, 'title'))
|
||||
}
|
||||
@@ -922,6 +955,7 @@ class PmsConnect(object):
|
||||
'actors': actors,
|
||||
'genres': genres,
|
||||
'labels': labels,
|
||||
'collections': collections,
|
||||
'full_title': helpers.get_xml_attr(metadata_main, 'title')
|
||||
}
|
||||
|
||||
@@ -964,6 +998,7 @@ class PmsConnect(object):
|
||||
'actors': actors,
|
||||
'genres': photo_album_details['genres'],
|
||||
'labels': photo_album_details['labels'],
|
||||
'collections': photo_album_details['collections'],
|
||||
'full_title': u'{} - {}'.format(helpers.get_xml_attr(metadata_main, 'parentTitle'),
|
||||
helpers.get_xml_attr(metadata_main, 'title'))
|
||||
}
|
||||
@@ -1009,6 +1044,7 @@ class PmsConnect(object):
|
||||
'actors': actors,
|
||||
'genres': genres,
|
||||
'labels': labels,
|
||||
'collections': collections,
|
||||
'full_title': helpers.get_xml_attr(metadata_main, 'title')
|
||||
}
|
||||
|
||||
@@ -1049,6 +1085,7 @@ class PmsConnect(object):
|
||||
'actors': actors,
|
||||
'genres': genres,
|
||||
'labels': labels,
|
||||
'collections': collections,
|
||||
'full_title': helpers.get_xml_attr(metadata_main, 'title')
|
||||
}
|
||||
|
||||
@@ -1103,7 +1140,7 @@ class PmsConnect(object):
|
||||
'subtitle_codec': helpers.get_xml_attr(stream, 'codec'),
|
||||
'subtitle_container': helpers.get_xml_attr(stream, 'container'),
|
||||
'subtitle_format': helpers.get_xml_attr(stream, 'format'),
|
||||
'subtitle_forced': 1 if helpers.get_xml_attr(stream, 'forced') == '1' else 0,
|
||||
'subtitle_forced': int(helpers.get_xml_attr(stream, 'forced') == '1'),
|
||||
'subtitle_location': 'external' if helpers.get_xml_attr(stream, 'key') else 'embedded',
|
||||
'subtitle_language': helpers.get_xml_attr(stream, 'language'),
|
||||
'subtitle_language_code': helpers.get_xml_attr(stream, 'languageCode')
|
||||
@@ -1112,7 +1149,7 @@ class PmsConnect(object):
|
||||
parts.append({'id': helpers.get_xml_attr(part, 'id'),
|
||||
'file': helpers.get_xml_attr(part, 'file'),
|
||||
'file_size': helpers.get_xml_attr(part, 'size'),
|
||||
'indexes': 1 if helpers.get_xml_attr(part, 'indexes') == 'sd' else 0,
|
||||
'indexes': int(helpers.get_xml_attr(part, 'indexes') == 'sd'),
|
||||
'streams': streams
|
||||
})
|
||||
|
||||
@@ -1132,13 +1169,24 @@ class PmsConnect(object):
|
||||
'audio_channels': audio_channels,
|
||||
'audio_channel_layout': common.AUDIO_CHANNELS.get(audio_channels, audio_channels),
|
||||
'audio_profile': helpers.get_xml_attr(media, 'audioProfile'),
|
||||
'optimized_version': 1 if helpers.get_xml_attr(media, 'proxyType') == '42' else 0,
|
||||
'optimized_version': int(helpers.get_xml_attr(media, 'proxyType') == '42'),
|
||||
'parts': parts
|
||||
})
|
||||
|
||||
metadata['media_info'] = medias
|
||||
|
||||
if metadata:
if cache_key:
metadata['_cache_time'] = int(time.time())

out_file_path = os.path.join(plexpy.CONFIG.CACHE_DIR, 'metadata-sessionKey-%s.json' % cache_key)
try:
with open(out_file_path, 'w') as outFile:
json.dump(metadata, outFile)
except (IOError, ValueError) as e:
logger.error(u"Tautulli Pmsconnect :: Unable to create cache file for metadata (sessionKey %s): %s"
% (cache_key, e))

return metadata
else:
return {}
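The new `cache_key` argument turns `get_metadata_details()` into a small disk cache: it reads `metadata-sessionKey-<key>.json`, returns it while `_cache_time` is within `METADATA_CACHE_SECONDS`, and rewrites the file with a fresh timestamp after each fetch. A standalone sketch of that TTL file cache; the directory, TTL, and `fetch_metadata` callable are placeholders, not the actual Tautulli config:

```python
# Sketch of the JSON file cache with a TTL, mirroring the cache_key handling above.
import json
import os
import time

CACHE_DIR = '/tmp/metadata-cache'
TTL = 3600  # seconds, analogous to METADATA_CACHE_SECONDS


def cached_metadata(cache_key, fetch_metadata):
    path = os.path.join(CACHE_DIR, 'metadata-sessionKey-%s.json' % cache_key)

    # Try the cache first; any read/parse error just falls through to a fresh fetch.
    try:
        with open(path, 'r') as infile:
            cached = json.load(infile)
        if int(time.time()) - cached.pop('_cache_time', 0) <= TTL:
            return cached
    except (IOError, OSError, ValueError):
        pass

    metadata = fetch_metadata()
    if metadata:
        metadata['_cache_time'] = int(time.time())
        try:
            if not os.path.isdir(CACHE_DIR):
                os.makedirs(CACHE_DIR)
            with open(path, 'w') as outfile:
                json.dump(metadata, outfile)
        except (IOError, OSError, ValueError):
            pass  # failing to write the cache should not break the lookup
    return metadata


print(cached_metadata('42', lambda: {'media_type': 'movie', 'title': 'Example'}))
```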
@@ -1300,6 +1348,7 @@ class PmsConnect(object):
|
||||
# Get the source media type
|
||||
media_type = helpers.get_xml_attr(session, 'type')
|
||||
rating_key = helpers.get_xml_attr(session, 'ratingKey')
|
||||
session_key = helpers.get_xml_attr(session, 'sessionKey')
|
||||
|
||||
# Get the user details
|
||||
user_info = session.getElementsByTagName('User')[0]
|
||||
@@ -1342,7 +1391,7 @@ class PmsConnect(object):
|
||||
else:
|
||||
session_details = {'session_id': '',
|
||||
'bandwidth': '',
|
||||
'location': 'Unknown'
|
||||
'location': 'wan' if player_details['local'] == '0' else 'lan'
|
||||
}
|
||||
|
||||
# Get the transcode details
|
||||
@@ -1353,7 +1402,7 @@ class PmsConnect(object):
|
||||
transcode_speed = helpers.get_xml_attr(transcode_info, 'speed')
|
||||
|
||||
transcode_details = {'transcode_key': helpers.get_xml_attr(transcode_info, 'key'),
|
||||
'transcode_throttled': 1 if helpers.get_xml_attr(transcode_info, 'throttled') == '1' else 0,
|
||||
'transcode_throttled': int(helpers.get_xml_attr(transcode_info, 'throttled') == '1'),
|
||||
'transcode_progress': int(round(helpers.cast_to_float(transcode_progress), 0)),
|
||||
'transcode_speed': str(round(helpers.cast_to_float(transcode_speed), 1)),
|
||||
'transcode_audio_channels': helpers.get_xml_attr(transcode_info, 'audioChannels'),
|
||||
@@ -1363,12 +1412,12 @@ class PmsConnect(object):
|
||||
'transcode_height': helpers.get_xml_attr(transcode_info, 'height'), # Blank but keep backwards compatibility
|
||||
'transcode_container': helpers.get_xml_attr(transcode_info, 'container'),
|
||||
'transcode_protocol': helpers.get_xml_attr(transcode_info, 'protocol'),
|
||||
'transcode_hw_requested': 1 if helpers.get_xml_attr(transcode_info, 'transcodeHwRequested') == '1' else 0,
|
||||
'transcode_hw_requested': int(helpers.get_xml_attr(transcode_info, 'transcodeHwRequested') == '1'),
|
||||
'transcode_hw_decode': helpers.get_xml_attr(transcode_info, 'transcodeHwDecoding'),
|
||||
'transcode_hw_decode_title': helpers.get_xml_attr(transcode_info, 'transcodeHwDecodingTitle'),
|
||||
'transcode_hw_encode': helpers.get_xml_attr(transcode_info, 'transcodeHwEncoding'),
|
||||
'transcode_hw_encode_title': helpers.get_xml_attr(transcode_info, 'transcodeHwEncodingTitle'),
|
||||
'transcode_hw_full_pipeline': 1 if helpers.get_xml_attr(transcode_info, 'transcodeHwFullPipeline') == '1' else 0,
|
||||
'transcode_hw_full_pipeline': int(helpers.get_xml_attr(transcode_info, 'transcodeHwFullPipeline') == '1'),
|
||||
'audio_decision': helpers.get_xml_attr(transcode_info, 'audioDecision'),
|
||||
'video_decision': helpers.get_xml_attr(transcode_info, 'videoDecision'),
|
||||
'subtitle_decision': helpers.get_xml_attr(transcode_info, 'subtitleDecision'),
|
||||
@@ -1398,6 +1447,10 @@ class PmsConnect(object):
|
||||
'throttled': '0' # Keep for backwards compatibility
|
||||
}
|
||||
|
||||
# Check HW decoding/encoding
|
||||
transcode_details['transcode_hw_decoding'] = int(transcode_details['transcode_hw_decode'].lower() in common.HW_DECODERS)
|
||||
transcode_details['transcode_hw_encoding'] = int(transcode_details['transcode_hw_encode'].lower() in common.HW_ENCODERS)
|
||||
|
||||
# Generate a combined transcode decision value
|
||||
if transcode_details['video_decision'] == 'transcode' or transcode_details['audio_decision'] == 'transcode':
|
||||
transcode_decision = 'transcode'
|
||||
@@ -1411,16 +1464,24 @@ class PmsConnect(object):
|
||||
if media_type not in ('photo', 'clip') and not session.getElementsByTagName('Session') \
|
||||
and helpers.get_xml_attr(session, 'ratingKey').isdigit() and transcode_decision == 'direct play':
|
||||
plex_tv = plextv.PlexTV()
|
||||
parent_rating_key = helpers.get_xml_attr(session, 'parentRatingKey')
|
||||
grandparent_rating_key = helpers.get_xml_attr(session, 'grandparentRatingKey')
|
||||
|
||||
synced_items = plex_tv.get_synced_items(client_id_filter=player_details['machine_id'],
|
||||
rating_key_filter=rating_key)
|
||||
rating_key_filter=[rating_key, parent_rating_key, grandparent_rating_key])
|
||||
if synced_items:
|
||||
sync_id = synced_items[0]['sync_id']
|
||||
synced_item_details = synced_items[0]
|
||||
sync_id = synced_item_details['sync_id']
|
||||
synced_xml = self.get_sync_item(sync_id=sync_id, output_format='xml')
|
||||
synced_xml_head = synced_xml.getElementsByTagName('MediaContainer')
|
||||
if synced_xml_head[0].getElementsByTagName('Track'):
|
||||
synced_session_data = synced_xml_head[0].getElementsByTagName('Track')[0]
|
||||
synced_xml_items = synced_xml_head[0].getElementsByTagName('Track')
|
||||
elif synced_xml_head[0].getElementsByTagName('Video'):
|
||||
synced_session_data = synced_xml_head[0].getElementsByTagName('Video')[0]
|
||||
synced_xml_items = synced_xml_head[0].getElementsByTagName('Video')
|
||||
|
||||
for synced_session_data in synced_xml_items:
|
||||
if helpers.get_xml_attr(synced_session_data, 'ratingKey') == rating_key:
|
||||
break
|
||||
|
||||
# Figure out which version is being played
|
||||
if sync_id:
|
||||
@@ -1490,7 +1551,7 @@ class PmsConnect(object):
|
||||
subtitle_details = {'stream_subtitle_codec': helpers.get_xml_attr(subtitle_stream_info, 'codec'),
|
||||
'stream_subtitle_container': helpers.get_xml_attr(subtitle_stream_info, 'container'),
|
||||
'stream_subtitle_format': helpers.get_xml_attr(subtitle_stream_info, 'format'),
|
||||
'stream_subtitle_forced': 1 if helpers.get_xml_attr(subtitle_stream_info, 'forced') == '1' else 0,
|
||||
'stream_subtitle_forced': int(helpers.get_xml_attr(subtitle_stream_info, 'forced') == '1'),
|
||||
'stream_subtitle_location': helpers.get_xml_attr(subtitle_stream_info, 'location'),
|
||||
'stream_subtitle_language': helpers.get_xml_attr(subtitle_stream_info, 'language'),
|
||||
'stream_subtitle_language_code': helpers.get_xml_attr(subtitle_stream_info, 'languageCode'),
|
||||
@@ -1538,10 +1599,10 @@ class PmsConnect(object):
|
||||
'stream_duration': helpers.get_xml_attr(stream_media_info, 'duration') or helpers.get_xml_attr(session, 'duration'),
|
||||
'stream_container_decision': 'direct play' if sync_id else helpers.get_xml_attr(stream_media_parts_info, 'decision').replace('directplay', 'direct play'),
|
||||
'transcode_decision': transcode_decision,
|
||||
'optimized_version': 1 if helpers.get_xml_attr(stream_media_info, 'proxyType') == '42' else 0,
|
||||
'optimized_version': int(helpers.get_xml_attr(stream_media_info, 'proxyType') == '42'),
|
||||
'optimized_version_title': helpers.get_xml_attr(stream_media_info, 'title'),
|
||||
'synced_version': 1 if sync_id else 0,
|
||||
'indexes': 1 if indexes == 'sd' else 0,
|
||||
'indexes': int(indexes == 'sd'),
|
||||
'bif_thumb': bif_thumb,
|
||||
'subtitles': 1 if subtitle_id and subtitle_selected else 0
|
||||
}
|
||||
@@ -1554,6 +1615,7 @@ class PmsConnect(object):
|
||||
channel_stream = 1
|
||||
|
||||
clip_media = session.getElementsByTagName('Media')[0]
|
||||
clip_part = clip_media.getElementsByTagName('Part')[0]
|
||||
audio_channels = helpers.get_xml_attr(clip_media, 'audioChannels')
|
||||
metadata_details = {'media_type': media_type,
|
||||
'section_id': helpers.get_xml_attr(session, 'librarySectionID'),
|
||||
@@ -1592,7 +1654,8 @@ class PmsConnect(object):
|
||||
'genres': [],
|
||||
'labels': [],
|
||||
'full_title': helpers.get_xml_attr(session, 'title'),
|
||||
'container': helpers.get_xml_attr(clip_media, 'container'),
|
||||
'container': helpers.get_xml_attr(clip_media, 'container') \
|
||||
or helpers.get_xml_attr(clip_part, 'container'),
|
||||
'height': helpers.get_xml_attr(clip_media, 'height'),
|
||||
'width': helpers.get_xml_attr(clip_media, 'width'),
|
||||
'video_codec': helpers.get_xml_attr(clip_media, 'videoCodec'),
|
||||
@@ -1601,7 +1664,8 @@ class PmsConnect(object):
|
||||
'audio_channels': audio_channels,
|
||||
'audio_channel_layout': common.AUDIO_CHANNELS.get(audio_channels, audio_channels),
|
||||
'channel_icon': helpers.get_xml_attr(session, 'sourceIcon'),
|
||||
'channel_title': helpers.get_xml_attr(session, 'sourceTitle')
|
||||
'channel_title': helpers.get_xml_attr(session, 'sourceTitle'),
|
||||
'live': int(helpers.get_xml_attr(session, 'live') == '1')
|
||||
}
|
||||
else:
|
||||
channel_stream = 0
|
||||
@@ -1610,9 +1674,9 @@ class PmsConnect(object):
|
||||
part_id = helpers.get_xml_attr(stream_media_parts_info, 'id')
|
||||
|
||||
if sync_id:
|
||||
metadata_details = self.get_metadata_details(sync_id=sync_id)
|
||||
metadata_details = self.get_metadata_details(rating_key=rating_key, sync_id=sync_id, cache_key=session_key)
|
||||
else:
|
||||
metadata_details = self.get_metadata_details(rating_key=rating_key)
|
||||
metadata_details = self.get_metadata_details(rating_key=rating_key, cache_key=session_key)
|
# Get the media info, fallback to first item if match id is not found
source_medias = metadata_details.pop('media_info', [])
@@ -1667,51 +1731,72 @@ class PmsConnect(object):
source_subtitle_details = next((p for p in source_media_part_streams if p['id'] == subtitle_id),
next((p for p in source_media_part_streams if p['type'] == '3'), source_subtitle_details))

# Overrides for live sessions
if metadata_details.get('live') and transcode_decision == 'transcode':
stream_details['stream_container_decision'] = 'transcode'
stream_details['stream_container'] = transcode_details['transcode_container']

video_details['stream_video_decision'] = transcode_details['video_decision']
stream_details['stream_video_codec'] = transcode_details['transcode_video_codec']
stream_details['stream_video_resolution'] = metadata_details['video_resolution']

audio_details['stream_audio_decision'] = transcode_details['audio_decision']
stream_details['stream_audio_codec'] = transcode_details['transcode_audio_codec']
stream_details['stream_audio_channels'] = transcode_details['transcode_audio_channels']
stream_details['stream_audio_channel_layout'] = common.AUDIO_CHANNELS.get(
transcode_details['transcode_audio_channels'], transcode_details['transcode_audio_channels'])

# Get the quality profile
if media_type in ('movie', 'episode', 'clip') and 'stream_bitrate' in stream_details:
stream_bitrate = helpers.cast_to_int(stream_details['stream_bitrate'])
source_bitrate = helpers.cast_to_int(source_media_details.get('bitrate'))

try:
quailtiy_bitrate = min(b for b in common.VIDEO_QUALITY_PROFILES if stream_bitrate <= b <= source_bitrate)
quality_profile = common.VIDEO_QUALITY_PROFILES[quailtiy_bitrate]
except ValueError:
if sync_id:
quality_profile = 'Original'

if sync_id:
synced_item_bitrate = helpers.cast_to_int(synced_item_details['video_bitrate'])
try:
synced_bitrate = min(b for b in common.VIDEO_QUALITY_PROFILES if source_bitrate <= b)
synced_bitrate = max(b for b in common.VIDEO_QUALITY_PROFILES if b <= synced_item_bitrate)
synced_version_profile = common.VIDEO_QUALITY_PROFILES[synced_bitrate]
except ValueError:
synced_version_profile = 'Original'
else:
synced_version_profile = ''

stream_bitrate = helpers.cast_to_int(stream_details['stream_bitrate'])
source_bitrate = helpers.cast_to_int(source_media_details.get('bitrate'))
try:
quailtiy_bitrate = min(
b for b in common.VIDEO_QUALITY_PROFILES if stream_bitrate <= b <= source_bitrate)
quality_profile = common.VIDEO_QUALITY_PROFILES[quailtiy_bitrate]
except ValueError:
quality_profile = 'Original'

if stream_details['optimized_version']:
optimized_version_profile = '{} Mbps {}'.format(round(source_bitrate / 1000.0, 1),
plexpy.common.VIDEO_RESOLUTION_OVERRIDES.get(source_media_details['video_resolution'], source_media_details['video_resolution']))
plexpy.common.VIDEO_RESOLUTION_OVERRIDES.get(source_media_details['video_resolution'],
source_media_details['video_resolution']))
else:
optimized_version_profile = ''

elif media_type == 'track' and 'stream_bitrate' in stream_details:
stream_bitrate = helpers.cast_to_int(stream_details['stream_bitrate'])
source_bitrate = helpers.cast_to_int(source_media_details.get('bitrate'))

try:
quailtiy_bitrate = min(b for b in common.AUDIO_QUALITY_PROFILES if stream_bitrate <= b <= source_bitrate)
quality_profile = common.AUDIO_QUALITY_PROFILES[quailtiy_bitrate]
except ValueError:
if sync_id:
quality_profile = 'Original'

if sync_id:
synced_item_bitrate = helpers.cast_to_int(synced_item_details['audio_bitrate'])
try:
synced_bitrate = min(b for b in common.AUDIO_QUALITY_PROFILES if source_bitrate <= b)
synced_bitrate = max(b for b in common.AUDIO_QUALITY_PROFILES if b <= synced_item_bitrate)
synced_version_profile = common.AUDIO_QUALITY_PROFILES[synced_bitrate]
except ValueError:
synced_version_profile = 'Original'
else:
synced_version_profile = ''

stream_bitrate = helpers.cast_to_int(stream_details['stream_bitrate'])
source_bitrate = helpers.cast_to_int(source_media_details.get('bitrate'))
try:
quailtiy_bitrate = min(b for b in common.AUDIO_QUALITY_PROFILES if stream_bitrate <= b <= source_bitrate)
quality_profile = common.AUDIO_QUALITY_PROFILES[quailtiy_bitrate]
except ValueError:
quality_profile = 'Original'

optimized_version_profile = ''

elif media_type == 'photo':
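
The quality-profile logic in this hunk buckets the stream bitrate into the smallest profile threshold that still covers it, capped by the source bitrate, and falls back to "Original" when no bucket fits. A minimal standalone sketch of that selection rule follows; the threshold table here is illustrative only, not the real `common.VIDEO_QUALITY_PROFILES` values.

```python
# Sketch of the quality-profile bucketing; the thresholds are made up.
VIDEO_QUALITY_PROFILES = {
    20000: '20 Mbps 1080p',
    12000: '12 Mbps 1080p',
    8000: '8 Mbps 1080p',
    4000: '4 Mbps 720p',
    3000: '3 Mbps 720p',
    2000: '2 Mbps 720p',
}

def video_quality_profile(stream_bitrate, source_bitrate):
    try:
        # Smallest profile ceiling that is >= the stream bitrate but does
        # not exceed the source bitrate.
        bucket = min(b for b in VIDEO_QUALITY_PROFILES
                     if stream_bitrate <= b <= source_bitrate)
        return VIDEO_QUALITY_PROFILES[bucket]
    except ValueError:
        # No bucket fits, so report the original quality.
        return 'Original'

print(video_quality_profile(2500, 9000))   # -> '3 Mbps 720p'
print(video_quality_profile(9500, 9000))   # -> 'Original'
```
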
@@ -1725,7 +1810,7 @@ class PmsConnect(object):
optimized_version_profile = ''

# Entire session output (single dict for backwards compatibility)
session_output = {'session_key': helpers.get_xml_attr(session, 'sessionKey'),
session_output = {'session_key': session_key,
'media_type': media_type,
'view_offset': view_offset,
'progress_percent': str(helpers.get_percent(view_offset, stream_details['stream_duration'])),
@@ -2502,3 +2587,14 @@ class PmsConnect(object):

plexpy.CONFIG.__setattr__('PMS_VERSION', version)
plexpy.CONFIG.write()

def get_server_update_channel(self):
if plexpy.CONFIG.PMS_UPDATE_CHANNEL == 'plex':
update_channel_value = self.get_server_pref('ButlerUpdateChannel')

if update_channel_value == '8':
return 'beta'
else:
return 'public'

return plexpy.CONFIG.PMS_UPDATE_CHANNEL
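
The new `get_server_update_channel()` above maps the Plex server's `ButlerUpdateChannel` preference to a channel name when `PMS_UPDATE_CHANNEL` is set to `'plex'`, and otherwise passes Tautulli's own setting through. A small sketch of that decision; the `'8'` value for the beta channel is taken from the diff, while the function and parameter names below are placeholders.

```python
def resolve_update_channel(pms_update_channel, get_server_pref):
    """Return 'beta' or 'public' when following the Plex server's own
    update channel, otherwise pass through the configured channel."""
    if pms_update_channel == 'plex':
        # Plex stores the beta update channel preference as the value '8'.
        return 'beta' if get_server_pref('ButlerUpdateChannel') == '8' else 'public'
    return pms_update_channel

print(resolve_update_channel('plex', lambda pref: '8'))   # -> 'beta'
print(resolve_update_channel('beta', lambda pref: '0'))   # -> 'beta'
```
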
@@ -23,15 +23,14 @@ def get_session_info():
"""
Returns the session info for the user session
"""
from plexpy.webauth import SESSION_KEY

_session = {'user_id': None,
'user': None,
'user_group': 'admin',
'expiry': None}
try:
return cherrypy.session.get(SESSION_KEY, _session)
except AttributeError as e:
'exp': None}

if isinstance(cherrypy.request.login, dict):
return cherrypy.request.login

return _session

def get_session_user():
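
With the move away from CherryPy sessions, `get_session_info()` now returns the decoded JWT payload that `webauth.check_auth()` places on `cherrypy.request.login`, and falls back to an admin default when no payload is present. A minimal standalone sketch of that fallback, with a stand-in request object instead of CherryPy:

```python
# `request.login` stands in for cherrypy.request.login, which check_auth()
# fills with the decoded JWT payload for guest sessions.
class FakeRequest(object):
    login = None

def get_session_info(request):
    _session = {'user_id': None,
                'user': None,
                'user_group': 'admin',
                'exp': None}
    if isinstance(request.login, dict):
        return request.login
    return _session

req = FakeRequest()
print(get_session_info(req)['user_group'])   # -> 'admin' (no login payload)
req.login = {'user_id': 42, 'user': 'guest1', 'user_group': 'guest', 'exp': None}
print(get_session_info(req)['user_group'])   # -> 'guest'
```
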
@@ -52,6 +52,7 @@ def refresh_users():
new_value_dict = {"username": item['username'],
"thumb": item['thumb'],
"email": item['email'],
"is_admin": item['is_admin'],
"is_home_user": item['is_home_user'],
"is_allow_sync": item['is_allow_sync'],
"is_restricted": item['is_restricted'],
@@ -330,6 +331,7 @@ class Users(object):
'friendly_name': 'Local',
'user_thumb': common.DEFAULT_USER_THUMB,
'email': '',
'is_admin': '',
'is_home_user': 0,
'is_allow_sync': 0,
'is_restricted': 0,
@@ -349,21 +351,21 @@ class Users(object):
try:
if str(user_id).isdigit():
query = 'SELECT user_id, username, friendly_name, thumb AS user_thumb, custom_avatar_url AS custom_thumb, ' \
'email, is_home_user, is_allow_sync, is_restricted, do_notify, keep_history, deleted_user, ' \
'email, is_admin, is_home_user, is_allow_sync, is_restricted, do_notify, keep_history, deleted_user, ' \
'allow_guest, shared_libraries ' \
'FROM users ' \
'WHERE user_id = ? '
result = monitor_db.select(query, args=[user_id])
elif user:
query = 'SELECT user_id, username, friendly_name, thumb AS user_thumb, custom_avatar_url AS custom_thumb, ' \
'email, is_home_user, is_allow_sync, is_restricted, do_notify, keep_history, deleted_user, ' \
'email, is_admin, is_home_user, is_allow_sync, is_restricted, do_notify, keep_history, deleted_user, ' \
'allow_guest, shared_libraries ' \
'FROM users ' \
'WHERE username = ? COLLATE NOCASE '
result = monitor_db.select(query, args=[user])
elif email:
query = 'SELECT user_id, username, friendly_name, thumb AS user_thumb, custom_avatar_url AS custom_thumb, ' \
'email, is_home_user, is_allow_sync, is_restricted, do_notify, keep_history, deleted_user, ' \
'email, is_admin, is_home_user, is_allow_sync, is_restricted, do_notify, keep_history, deleted_user, ' \
'allow_guest, shared_libraries ' \
'FROM users ' \
'WHERE email = ? COLLATE NOCASE '
@@ -398,6 +400,7 @@ class Users(object):
'friendly_name': friendly_name,
'user_thumb': user_thumb,
'email': item['email'],
'is_admin': item['is_admin'],
'is_home_user': item['is_home_user'],
'is_allow_sync': item['is_allow_sync'],
'is_restricted': item['is_restricted'],
@@ -580,6 +583,27 @@ class Users(object):

return recently_watched

def get_users(self):
monitor_db = database.MonitorDatabase()

try:
query = 'SELECT user_id, username, friendly_name, email FROM users WHERE deleted_user = 0'
result = monitor_db.select(query=query)
except Exception as e:
logger.warn(u"Tautulli Users :: Unable to execute database query for get_users: %s." % e)
return None

users = []
for item in result:
user = {'user_id': item['user_id'],
'username': item['username'],
'friendly_name': item['friendly_name'] or item['username'],
'email': item['email']
}
users.append(user)

return users

def delete_all_history(self, user_id=None):
monitor_db = database.MonitorDatabase()
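
The new `Users.get_users()` helper returns one small dict per non-deleted user, preferring the friendly name over the username. A short usage sketch of turning those rows into `(user_id, label)` pairs, for example to populate a user picker in the web UI (that caller is an assumption, and the rows below are invented):

```python
# Example rows shaped like the dicts Users.get_users() builds above.
users = [
    {'user_id': 1, 'username': 'alice', 'friendly_name': 'Alice', 'email': 'alice@example.com'},
    {'user_id': 2, 'username': 'bob', 'friendly_name': '', 'email': 'bob@example.com'},
]

# Same "friendly_name or username" preference as in the query result loop.
options = [(u['user_id'], u['friendly_name'] or u['username']) for u in users]
print(options)  # -> [(1, 'Alice'), (2, 'bob')]
```
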
@@ -1,2 +1,2 @@
PLEXPY_BRANCH = "beta"
PLEXPY_RELEASE_VERSION = "v2.0.6-beta"
PLEXPY_RELEASE_VERSION = "v2.0.16-beta"
@@ -178,6 +178,10 @@ def checkGithub(auto_update=False):
url = 'https://api.github.com/repos/%s/plexpy/releases' % plexpy.CONFIG.GIT_USER
releases = request.request_json(url, timeout=20, whitelist_status_code=404, validator=lambda x: type(x) == list)

if releases is None:
logger.warn('Could not get releases from GitHub.')
return plexpy.LATEST_VERSION

if plexpy.CONFIG.GIT_BRANCH == 'master':
release = next((r for r in releases if not r['prerelease']), releases[0])
elif plexpy.CONFIG.GIT_BRANCH == 'beta':
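
`checkGithub()` now fetches the full releases list and, on the `master` branch, picks the newest entry that is not marked as a prerelease, falling back to the newest release of any kind. A standalone sketch of that selection; the release dicts are simplified stand-ins for the GitHub API payload, and the `beta` arm below is an assumption since the hunk cuts off at that branch.

```python
# GitHub returns releases newest-first.
releases = [
    {'tag_name': 'v2.0.16-beta', 'prerelease': True},
    {'tag_name': 'v2.0.15-beta', 'prerelease': True},
    {'tag_name': 'v1.4.25', 'prerelease': False},
]

def pick_release(releases, branch):
    if branch == 'master':
        # Newest stable release, or the newest release of any kind as a fallback.
        return next((r for r in releases if not r['prerelease']), releases[0])
    # Assumed behaviour for the beta branch: every release qualifies.
    return releases[0]

print(pick_release(releases, 'master')['tag_name'])  # -> 'v1.4.25'
print(pick_release(releases, 'beta')['tag_name'])    # -> 'v2.0.16-beta'
```
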
@@ -294,14 +298,14 @@ def checkout_git_branch():
logger.info('Output: ' + str(output))


def read_changelog(latest_only=False):
def read_changelog(latest_only=False, since_prev_release=False):
changelog_file = os.path.join(plexpy.PROG_DIR, 'CHANGELOG.md')

if not os.path.isfile(changelog_file):
return '<h4>Missing changelog file</h4>'

try:
output = ''
output = ['']
prev_level = 0

latest_version_found = False
@@ -325,27 +329,34 @@ def read_changelog(latest_only=False):
break
elif latest_only:
latest_version_found = True
# Add a space to the end of the release to match tags
elif since_prev_release and str(plexpy.PREV_RELEASE) + ' ' in header_text:
break

output += '<h' + header_level + '>' + header_text + '</h' + header_level + '>'
output[-1] += '<h' + header_level + '>' + header_text + '</h' + header_level + '>'

elif line_list_match:
line_level = len(line_list_match.group(1)) / 2
line_text = line_list_match.group(2)

if line_level > prev_level:
output += '<ul>' * (line_level - prev_level) + '<li>' + line_text + '</li>'
output[-1] += '<ul>' * (line_level - prev_level) + '<li>' + line_text + '</li>'
elif line_level < prev_level:
output += '</ul>' * (prev_level - line_level) + '<li>' + line_text + '</li>'
output[-1] += '</ul>' * (prev_level - line_level) + '<li>' + line_text + '</li>'
else:
output += '<li>' + line_text + '</li>'
output[-1] += '<li>' + line_text + '</li>'

prev_level = line_level

elif line.strip() == '' and prev_level:
output += '</ul>' * (prev_level)
output[-1] += '</ul>' * (prev_level)
output.append('')
prev_level = 0

return output
if since_prev_release:
output.reverse()

return ''.join(output)

except IOError as e:
logger.error('Tautulli Version Checker :: Unable to open changelog file. %s' % e)
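
The reworked `read_changelog()` accumulates rendered HTML into a list with one string per release block instead of a single string, so that `since_prev_release=True` can stop at the previously installed release and then reverse the blocks before joining them. A compact sketch of that grouping idea, with a much-simplified parser and invented changelog text:

```python
import re

CHANGELOG = """## v2.0.16-beta
* New: First example item.

## v2.0.15-beta
* Fix: Second example item.
"""

def changelog_blocks(text):
    # One HTML string per release block, mirroring the output[-1] += pattern.
    output = ['']
    for line in text.splitlines():
        header = re.match(r'^##\s(.+)', line)
        if header:
            output[-1] += '<h2>%s</h2>' % header.group(1)
        elif line.startswith('* '):
            output[-1] += '<li>%s</li>' % line[2:]
        elif not line.strip() and output[-1]:
            output.append('')   # blank line closes the current block
    return [block for block in output if block]

blocks = changelog_blocks(CHANGELOG)
blocks.reverse()            # oldest release first, as for since_prev_release
print(''.join(blocks))
```
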
@@ -180,7 +180,7 @@ def process(opcode, data):
info = json.loads(data)
except Exception as e:
logger.warn(u"Tautulli WebSocket :: Error decoding message from websocket: %s" % e)
logger.debug(data)
logger.websocket_error(data)
return False

info = info.get('NotificationContainer', info)
@@ -18,12 +18,12 @@
# Form based authentication for CherryPy. Requires the
# Session tool to be loaded.

from cgi import escape
from datetime import datetime, timedelta
import re

import cherrypy
from hashing_passwords import check_hash
import jwt

import plexpy
import logger
@@ -32,7 +32,9 @@ from plexpy.users import Users, refresh_users
from plexpy.plextv import PlexTV


SESSION_KEY = '_cp_username'
JWT_ALGORITHM = 'HS256'
JWT_COOKIE_NAME = 'tautulli_token_'


def user_login(username=None, password=None):
if not username or not password:
@@ -52,10 +54,17 @@ def user_login(username=None, password=None):
if user_id != str(user_details['user_id']):
# The user is not in the database.
return None
elif plexpy.CONFIG.HTTP_PLEX_ADMIN and user_details['is_admin']:
# Plex admin login
return 'admin'
elif not user_details['allow_guest'] or user_details['deleted_user']:
# Guest access is disabled or the user is deleted.
return None

# Stop here if guest access is not enabled
if not plexpy.CONFIG.ALLOW_GUEST_ACCESS:
return None

# The user is in the database, and guest access is enabled, so try to retrieve a server token.
# If a server token is returned, then the user is a valid friend of the server.
plex_tv = PlexTV(token=user_token)
@@ -73,7 +82,7 @@ def user_login(username=None, password=None):
# Refresh the users list to make sure we have all the correct permissions.
refresh_users()
# Successful login
return True
return 'guest'
else:
logger.warn(u"Tautulli WebAuth :: Unable to register user '%s' in database." % username)
return None
@@ -89,37 +98,62 @@ def user_login(username=None, password=None):

return None


def check_credentials(username, password, admin_login='0'):
"""Verifies credentials for username and password.
Returns True and the user group on success or False and no user group"""

if plexpy.CONFIG.HTTP_PASSWORD:
if plexpy.CONFIG.HTTP_HASHED_PASSWORD and \
username == plexpy.CONFIG.HTTP_USERNAME and check_hash(password, plexpy.CONFIG.HTTP_PASSWORD):
return True, u'admin'
elif username == plexpy.CONFIG.HTTP_USERNAME and password == plexpy.CONFIG.HTTP_PASSWORD:
return True, u'admin'
elif not admin_login == '1' and plexpy.CONFIG.ALLOW_GUEST_ACCESS and user_login(username, password):
return True, u'guest'
else:
return True, 'admin'
elif not plexpy.CONFIG.HTTP_HASHED_PASSWORD and \
username == plexpy.CONFIG.HTTP_USERNAME and password == plexpy.CONFIG.HTTP_PASSWORD:
return True, 'admin'

if plexpy.CONFIG.HTTP_PLEX_ADMIN or (not admin_login == '1' and plexpy.CONFIG.ALLOW_GUEST_ACCESS):
plex_login = user_login(username, password)
if plex_login is not None:
return True, plex_login

return False, None


def check_jwt_token():
jwt_cookie = JWT_COOKIE_NAME + plexpy.CONFIG.PMS_UUID
jwt_token = cherrypy.request.cookie.get(jwt_cookie)

if jwt_token:
try:
payload = jwt.decode(
jwt_token.value, plexpy.CONFIG.JWT_SECRET, leeway=timedelta(seconds=10), algorithms=[JWT_ALGORITHM]
)
except (jwt.DecodeError, jwt.ExpiredSignatureError):
return None

return payload
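
The new cookie-based login stores a signed JWT (HS256, via the PyJWT `jwt` module imported above) whose payload carries the user id, name, group, and expiry; `check_jwt_token()` later verifies it with a small clock leeway. A standalone sketch of that round trip, assuming PyJWT is installed; the secret and payload values are invented.

```python
from datetime import datetime, timedelta

import jwt  # PyJWT

JWT_ALGORITHM = 'HS256'
SECRET = 'not-the-real-jwt-secret'

# Issue a token the way signin() does: the expiry goes into the standard 'exp' claim.
payload = {'user_id': None,
           'user': 'admin',
           'user_group': 'admin',
           'exp': datetime.utcnow() + timedelta(minutes=60)}
token = jwt.encode(payload, SECRET, algorithm=JWT_ALGORITHM)

# Verify it the way check_jwt_token() does, tolerating small clock skew.
decoded = jwt.decode(token, SECRET, leeway=timedelta(seconds=10),
                     algorithms=[JWT_ALGORITHM])
print(decoded['user'], decoded['user_group'])   # -> admin admin

# Expired or tampered tokens raise and are treated as "not logged in".
try:
    jwt.decode(token, 'wrong-secret', algorithms=[JWT_ALGORITHM])
except jwt.DecodeError:
    print('invalid token rejected')
```
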
def check_auth(*args, **kwargs):
"""A tool that looks in config for 'auth.require'. If found and it
is not None, a login is required and the entry is evaluated as a list of
conditions that the user must fulfill"""
conditions = cherrypy.request.config.get('auth.require', None)
if conditions is not None:
_session = cherrypy.session.get(SESSION_KEY)
payload = check_jwt_token()

if payload:
cherrypy.request.login = payload

if _session and (_session['user'] and _session['expiry']) and _session['expiry'] > datetime.now():
cherrypy.request.login = _session['user']
for condition in conditions:
# A condition is just a callable that returns true or false
if not condition():
raise cherrypy.HTTPRedirect(plexpy.HTTP_ROOT)

else:
raise cherrypy.HTTPRedirect(plexpy.HTTP_ROOT + "auth/logout")


def requireAuth(*conditions):
"""A decorator that appends conditions to the auth.require config
variable."""
@@ -140,14 +174,13 @@ def requireAuth(*conditions):
#
# Define those at will however suits the application.

def member_of(groupname):
def check():
# replace with actual check if <username> is in <groupname>
return cherrypy.request.login == plexpy.CONFIG.HTTP_USERNAME and groupname == 'admin'
return check
def member_of(user_group):
return lambda: cherrypy.request.login and cherrypy.request.login['user_group'] == user_group


def name_is(user_name):
return lambda: cherrypy.request.login and cherrypy.request.login['user'] == user_name

def name_is(reqd_username):
return lambda: reqd_username == cherrypy.request.login

# These might be handy

@@ -160,6 +193,7 @@ def any_of(*conditions):
return False
return check


# By default all conditions are required, but this might still be
# needed if you want to use it inside of an any_of(...) condition
def all_of(*conditions):
@@ -176,6 +210,11 @@ def all_of(*conditions):

class AuthController(object):

def check_auth_enabled(self):
if not plexpy.CONFIG.HTTP_BASIC_AUTH and plexpy.CONFIG.HTTP_PASSWORD:
return
raise cherrypy.HTTPRedirect(plexpy.HTTP_ROOT)

def on_login(self, user_id, username, user_group):
"""Called on successful login"""

@@ -196,7 +235,7 @@ class AuthController(object):

def on_logout(self, username, user_group):
"""Called on logout"""
logger.debug(u"Tautulli WebAuth :: %s User '%s' logged out of Tautulli." % (user_group.capitalize(), username))
logger.debug(u"Tautulli WebAuth :: %s user '%s' logged out of Tautulli." % (user_group.capitalize(), username))

def on_login_failed(self, username):
"""Called on failed login"""
@@ -212,25 +251,48 @@ class AuthController(object):
user_agent=user_agent,
success=0)

def get_loginform(self, username="", msg=""):
def get_loginform(self):
from plexpy.webserve import serve_template
return serve_template(templatename="login.html", title="Login", username=escape(username, True), msg=msg)
return serve_template(templatename="login.html", title="Login")

@cherrypy.expose
def index(self):
raise cherrypy.HTTPRedirect(plexpy.HTTP_ROOT + "auth/login")

@cherrypy.expose
def login(self, username=None, password=None, remember_me='0', admin_login='0'):
if not cherrypy.config.get('tools.sessions.on'):
raise cherrypy.HTTPRedirect(plexpy.HTTP_ROOT)
def login(self):
self.check_auth_enabled()

if not username and not password:
return self.get_loginform()

(vaild_login, user_group) = check_credentials(username, password, admin_login)
@cherrypy.expose
def logout(self):
self.check_auth_enabled()

if vaild_login:
payload = check_jwt_token()
if payload:
self.on_logout(payload['user'], payload['user_group'])

jwt_cookie = JWT_COOKIE_NAME + plexpy.CONFIG.PMS_UUID
cherrypy.response.cookie[jwt_cookie] = 'expire'
cherrypy.response.cookie[jwt_cookie]['expires'] = 0
cherrypy.response.cookie[jwt_cookie]['path'] = '/'

cherrypy.request.login = None
raise cherrypy.HTTPRedirect(plexpy.HTTP_ROOT + "auth/login")

@cherrypy.expose
@cherrypy.tools.json_out()
def signin(self, username=None, password=None, remember_me='0', admin_login='0'):
if cherrypy.request.method != 'POST':
cherrypy.response.status = 405
return {'status': 'error', 'message': 'Sign in using POST.'}

error_message = {'status': 'error', 'message': 'Incorrect username or password.'}

valid_login, user_group = check_credentials(username, password, admin_login)

if valid_login:
if user_group == 'guest':
if re.match(r"[^@]+@[^@]+\.[^@]+", username):
user_details = Users().get_details(email=username)
@@ -241,35 +303,37 @@ class AuthController(object):
else:
user_id = None

expiry = datetime.now() + (timedelta(days=30) if remember_me == '1' else timedelta(minutes=60))
time_delta = timedelta(days=30) if remember_me == '1' else timedelta(minutes=60)
expiry = datetime.utcnow() + time_delta

cherrypy.request.login = username
cherrypy.session[SESSION_KEY] = {'user_id': user_id,
payload = {
'user_id': user_id,
'user': username,
'user_group': user_group,
'expiry': expiry}
'exp': expiry
}

jwt_token = jwt.encode(payload, plexpy.CONFIG.JWT_SECRET, algorithm=JWT_ALGORITHM)

self.on_login(user_id, username, user_group)
raise cherrypy.HTTPRedirect(plexpy.HTTP_ROOT)

jwt_cookie = JWT_COOKIE_NAME + plexpy.CONFIG.PMS_UUID
cherrypy.response.cookie[jwt_cookie] = jwt_token
cherrypy.response.cookie[jwt_cookie]['expires'] = int(time_delta.total_seconds())
cherrypy.response.cookie[jwt_cookie]['path'] = '/'

cherrypy.request.login = payload
cherrypy.response.status = 200
return {'status': 'success', 'token': jwt_token.decode('utf-8'), 'uuid': plexpy.CONFIG.PMS_UUID}

elif admin_login == '1':
self.on_login_failed(username)
logger.debug(u"Tautulli WebAuth :: Invalid admin login attempt from '%s'." % username)
raise cherrypy.HTTPRedirect(plexpy.HTTP_ROOT)
cherrypy.response.status = 401
return error_message

else:
self.on_login_failed(username)
logger.debug(u"Tautulli WebAuth :: Invalid login attempt from '%s'." % username)
return self.get_loginform(username, u"Incorrect username/email or password.")

@cherrypy.expose
def logout(self):
if not cherrypy.config.get('tools.sessions.on'):
raise cherrypy.HTTPRedirect(plexpy.HTTP_ROOT)

_session = cherrypy.session.get(SESSION_KEY)
cherrypy.session[SESSION_KEY] = None

if _session and _session['user']:
cherrypy.request.login = None
self.on_logout(_session['user'], _session['user_group'])
raise cherrypy.HTTPRedirect(plexpy.HTTP_ROOT + "auth/login")
cherrypy.response.status = 401
return error_message
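
The new `signin` handler replaces the old form POST to `login`: it only accepts POST, returns JSON, and on success sets the `tautulli_token_<PMS_UUID>` cookie alongside the token in the body. A hedged client-side sketch using `requests`; the base URL and credentials are placeholders, and the `/auth/signin` path assumes the controller is mounted at `/auth`, as the `auth/login` redirects above suggest.

```python
import requests

BASE_URL = 'http://localhost:8181'   # placeholder Tautulli address, HTTP_ROOT '/'

# Sign in; a GET would be rejected with HTTP 405 ('Sign in using POST.').
resp = requests.post(BASE_URL + '/auth/signin',
                     data={'username': 'admin',
                           'password': 'password',
                           'remember_me': '1'})

if resp.status_code == 200:
    body = resp.json()
    # The same token is also set as the tautulli_token_<uuid> cookie.
    print(body['status'], body['uuid'])
    print('cookies:', resp.cookies.get_dict())
else:
    # 401 with {'status': 'error', 'message': 'Incorrect username or password.'}
    print(resp.status_code, resp.json().get('message'))
```
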
@@ -2201,9 +2201,8 @@ class WebInterface(object):
@cherrypy.tools.json_out()
@requireAuth()
def get_sync(self, machine_id=None, user_id=None, **kwargs):

if not machine_id:
machine_id = plexpy.CONFIG.PMS_IDENTIFIER
if user_id == 'null':
user_id = None

plex_tv = plextv.PlexTV()
result = plex_tv.get_synced_items(machine_id=machine_id, user_id_filter=user_id)
@@ -2538,6 +2537,7 @@ class WebInterface(object):
"http_password": http_password,
"http_root": plexpy.CONFIG.HTTP_ROOT,
"http_proxy": checked(plexpy.CONFIG.HTTP_PROXY),
"http_plex_admin": checked(plexpy.CONFIG.HTTP_PLEX_ADMIN),
"launch_browser": checked(plexpy.CONFIG.LAUNCH_BROWSER),
"enable_https": checked(plexpy.CONFIG.ENABLE_HTTPS),
"https_create_cert": checked(plexpy.CONFIG.HTTPS_CREATE_CERT),
@@ -2632,7 +2632,7 @@ class WebInterface(object):
"monitor_pms_updates", "monitor_remote_access", "get_file_sizes", "log_blacklist", "http_hash_password",
"allow_guest_access", "cache_images", "http_proxy", "http_basic_auth", "notify_concurrent_by_ip",
"history_table_activity", "plexpy_auto_update",
"themoviedb_lookup", "tvmaze_lookup"
"themoviedb_lookup", "tvmaze_lookup", "http_plex_admin"
]
for checked_config in checked_configs:
if checked_config not in kwargs:
@@ -2673,8 +2673,7 @@ class WebInterface(object):
refresh_users = False

# First run from the setup wizard
if kwargs.get('first_run'):
del kwargs['first_run']
if kwargs.pop('first_run', None):
first_run = True

# If we change any monitoring settings, make sure we reschedule tasks.
@@ -2728,12 +2727,15 @@ class WebInterface(object):
refresh_libraries = True

# If we change the server, make sure we grab the new url and refresh libraries and users lists.
if kwargs.get('server_changed'):
del kwargs['server_changed']
if kwargs.pop('server_changed', None):
server_changed = True
refresh_users = True
refresh_libraries = True

# If we change the authentication settings, make sure we refresh the users lists.
if kwargs.pop('auth_changed', None):
refresh_users = True

plexpy.CONFIG.process_kwargs(kwargs)

# Write the config
@@ -2795,12 +2797,16 @@ class WebInterface(object):
def get_server_update_params(self, **kwargs):
plex_tv = plextv.PlexTV()
plexpass = plex_tv.get_plexpass_status()

update_channel = pmsconnect.PmsConnect().get_server_update_channel()

return {'plexpass': plexpass,
'pms_platform': common.PMS_PLATFORM_NAME_OVERRIDES.get(
plexpy.CONFIG.PMS_PLATFORM, plexpy.CONFIG.PMS_PLATFORM),
'pms_update_channel': plexpy.CONFIG.PMS_UPDATE_CHANNEL,
'pms_update_distro': plexpy.CONFIG.PMS_UPDATE_DISTRO,
'pms_update_distro_build': plexpy.CONFIG.PMS_UPDATE_DISTRO_BUILD}
'pms_update_distro_build': plexpy.CONFIG.PMS_UPDATE_DISTRO_BUILD,
'plex_update_channel': 'plexpass' if update_channel == 'beta' else 'public'}

@cherrypy.expose
@cherrypy.tools.json_out()
@@ -3079,7 +3085,6 @@ class WebInterface(object):

@cherrypy.expose
@requireAuth(member_of("admin"))
@addtoapi("notify")
def send_notification(self, notifier_id=None, subject='Tautulli', body='Test notification', notify_action='', **kwargs):
""" Send a notification using Tautulli.

@@ -3124,8 +3129,7 @@ class WebInterface(object):
@cherrypy.tools.json_out()
@requireAuth(member_of("admin"))
def get_browser_notifications(self, **kwargs):
browser = notifiers.BROWSER()
result = browser.get_notifications()
result = notifiers.get_browser_notifications()

if result:
notifications = result['notifications']
@@ -3544,13 +3548,20 @@ class WebInterface(object):

@cherrypy.expose
@requireAuth(member_of("admin"))
def get_changelog(self, latest_only=False, update_shown=False, **kwargs):
latest_only = True if latest_only == 'true' else False
def get_changelog(self, latest_only=False, since_prev_release=False, update_shown=False, **kwargs):
latest_only = (latest_only == 'true')
since_prev_release = (since_prev_release == 'true')

if since_prev_release and plexpy.PREV_RELEASE == common.VERSION_NUMBER:
latest_only = True
since_prev_release = False

# Set update changelog shown status
if update_shown == 'true':
plexpy.CONFIG.__setattr__('UPDATE_SHOW_CHANGELOG', 0)
plexpy.CONFIG.write()
return versioncheck.read_changelog(latest_only=latest_only)

return versioncheck.read_changelog(latest_only=latest_only, since_prev_release=since_prev_release)

##### Info #####

@@ -4437,11 +4448,12 @@ class WebInterface(object):
if session_key:
return next((s for s in result['sessions'] if s['session_key'] == session_key), {})


counts = {'stream_count_direct_play': 0,
'stream_count_direct_stream': 0,
'stream_count_transcode': 0,
'total_bandwidth': 0}
'total_bandwidth': 0,
'lan_bandwidth': 0,
'wan_bandwidth': 0}

for s in result['sessions']:
if s['transcode_decision'] == 'transcode':
@@ -4452,6 +4464,10 @@ class WebInterface(object):
counts['stream_count_direct_play'] += 1

counts['total_bandwidth'] += helpers.cast_to_int(s['bandwidth'])
if s['location'] == 'lan':
counts['lan_bandwidth'] += helpers.cast_to_int(s['bandwidth'])
else:
counts['wan_bandwidth'] += helpers.cast_to_int(s['bandwidth'])

result.update(counts)
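
The activity endpoint now splits the total bandwidth into LAN and WAN buckets while it counts stream types. A standalone sketch of that accumulation over a made-up session list; the `'copy'` decision standing in for direct stream is an assumption, since that branch sits outside the hunk above.

```python
# Sketch of the per-session tallies; the session dicts are invented examples.
sessions = [
    {'transcode_decision': 'transcode', 'location': 'wan', 'bandwidth': '4000'},
    {'transcode_decision': 'direct play', 'location': 'lan', 'bandwidth': '20000'},
]

counts = {'stream_count_direct_play': 0,
          'stream_count_direct_stream': 0,
          'stream_count_transcode': 0,
          'total_bandwidth': 0,
          'lan_bandwidth': 0,
          'wan_bandwidth': 0}

for s in sessions:
    if s['transcode_decision'] == 'transcode':
        counts['stream_count_transcode'] += 1
    elif s['transcode_decision'] == 'copy':      # assumed direct-stream value
        counts['stream_count_direct_stream'] += 1
    else:
        counts['stream_count_direct_play'] += 1

    bandwidth = int(s['bandwidth'])              # helpers.cast_to_int in Tautulli
    counts['total_bandwidth'] += bandwidth
    if s['location'] == 'lan':
        counts['lan_bandwidth'] += bandwidth
    else:
        counts['wan_bandwidth'] += bandwidth

print(counts['total_bandwidth'], counts['lan_bandwidth'], counts['wan_bandwidth'])
# -> 24000 20000 4000
```
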
@@ -35,7 +35,8 @@ def initialize(options):
if enable_https:
# If either the HTTPS certificate or key do not exist, try to make self-signed ones.
if plexpy.CONFIG.HTTPS_CREATE_CERT and \
(not (https_cert and os.path.exists(https_cert)) or not (https_key and os.path.exists(https_key))):
(not (https_cert and os.path.exists(https_cert)) or
not (https_key and os.path.exists(https_key))):
if not create_https_certificates(https_cert, https_key):
logger.warn(u"Tautulli WebStart :: Unable to create certificate and key. Disabling HTTPS")
enable_https = False
@@ -67,16 +68,21 @@ def initialize(options):
protocol = "http"

if options['http_password']:
logger.info(u"Tautulli WebStart :: Web server authentication is enabled, username is '%s'", options['http_username'])
login_allowed = ["Tautulli admin (username is '%s')" % options['http_username']]
if plexpy.CONFIG.HTTP_PLEX_ADMIN:
login_allowed.append("Plex admin")

logger.info(u"Tautulli WebStart :: Web server authentication is enabled: %s allowed", ' and '.join(login_allowed))

if options['http_basic_auth']:
session_enabled = auth_enabled = False
auth_enabled = False
basic_auth_enabled = True
else:
options_dict['tools.sessions.on'] = session_enabled = auth_enabled = True
auth_enabled = True
basic_auth_enabled = False
cherrypy.tools.auth = cherrypy.Tool('before_handler', webauth.check_auth)
else:
session_enabled = auth_enabled = basic_auth_enabled = False
auth_enabled = basic_auth_enabled = False

if options['http_root'].strip('/'):
plexpy.HTTP_ROOT = options['http_root'] = '/' + options['http_root'].strip('/') + '/'
@@ -93,11 +99,6 @@ def initialize(options):
'tools.gzip.mime_types': ['text/html', 'text/plain', 'text/css',
'text/javascript', 'application/json',
'application/javascript'],
'tools.sessions.on': session_enabled,
'tools.session.name': 'tautulli_session_id-' + plexpy.CONFIG.PMS_UUID,
'tools.sessions.storage_type': 'file',
'tools.sessions.storage_path': plexpy.CONFIG.CACHE_DIR,
'tools.sessions.timeout': 30 * 24 * 60, # 30 days
'tools.auth.on': auth_enabled,
'tools.auth_basic.on': basic_auth_enabled,
'tools.auth_basic.realm': 'Tautulli web server',
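
In the webstart changes above, the JWT-based `webauth.check_auth` handler is registered as a CherryPy tool and switched on via `tools.auth.on`, while HTTP basic auth uses the built-in `auth_basic` tool instead; the two are enabled mutually exclusively. A minimal standalone sketch of wiring a `before_handler` tool the same way; the handler body here is a trivial stand-in, not Tautulli's.

```python
import cherrypy

def check_auth(*args, **kwargs):
    # Stand-in for webauth.check_auth(): reject requests without a header.
    if 'X-Demo-Token' not in cherrypy.request.headers:
        raise cherrypy.HTTPError(401)

# Register the callable as a tool that runs before every handler...
cherrypy.tools.auth = cherrypy.Tool('before_handler', check_auth)

class Root(object):
    @cherrypy.expose
    def index(self):
        return 'hello'

# ...and enable it per application, the same way 'tools.auth.on' is used above.
conf = {'/': {'tools.auth.on': True}}

if __name__ == '__main__':
    cherrypy.quickstart(Root(), '/', conf)
```
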