Compare commits

...

53 Commits

Author SHA1 Message Date
John 81a6f5a5b5 Merge 19fb9675a4 into 2c4c845cd2 2024-04-21 17:45:45 +00:00
John 19fb9675a4 Add CONTENT_SECURITY_POLICY 2024-04-21 20:44:33 +03:00
Frédéric Guillot 2c4c845cd2 http/response: add brotli compression support 2024-04-19 12:16:49 -07:00
bo0tzz 2caabbe939 fix: Use `FORCE_REFRESH_INTERVAL` config for category refresh 2024-04-19 11:58:13 -07:00
Frédéric Guillot 771f9d2b5f reader/fetcher: add brotli content encoding support 2024-04-19 10:50:46 -07:00
Romain de Laage 647c66e70a ui: add tag entries page 2024-04-14 20:08:38 -07:00
jvoisin b205b5aad0 reader/processor: minimize the feed's entries html
Compress the HTML of feed entries before storing it. This should reduce the
size of the database a bit, but more importantly, reduce the amount of data
sent to clients.

minify being [stupidly fast](https://github.com/tdewolff/minify/?tab=readme-ov-file#performance), the performance impact should be lost in the noise.
2024-04-10 19:48:48 -07:00
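The effect of this commit can be sketched with a toy, stdlib-only pass. The real implementation relies on github.com/tdewolff/minify, which is HTML-aware (it preserves `<pre>`, attributes, etc.); `collapseHTML` below is only an illustration of what minification buys before storage:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// collapseHTML is a toy stand-in for a real HTML minifier: it trims the
// markup, removes whitespace between tags, and collapses remaining runs of
// whitespace. A real minifier is HTML-aware; this sketch is illustrative only.
func collapseHTML(html string) string {
	html = strings.TrimSpace(html)
	html = regexp.MustCompile(`>\s+<`).ReplaceAllString(html, "><")
	html = regexp.MustCompile(`\s+`).ReplaceAllString(html, " ")
	return html
}

func main() {
	in := "<p>\n  Hello,   world\n</p>\n\n<p>bye</p>"
	fmt.Println(collapseHTML(in))
}
```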
goodfirm 4ab0d9422d chore: fix function name in comment
Signed-off-by: goodfirm <fanyishang@yeah.net>
2024-04-10 19:36:30 -07:00
Frédéric Guillot 38b80d96ea storage: change GetReadTime() function to use entries_feed_id_hash_key index 2024-04-09 20:37:30 -07:00
Michael Kuhn 35edd8ea92 Fix clicking unread counter
When clicking the unread counter, the following exception occurs:
```
Uncaught TypeError: Cannot read properties of null (reading 'getAttribute')
```

This is due to `onClickMainMenuListItem` not working correctly for the
unread counter `span`s, which return `null` when using `querySelector`.
2024-04-09 20:36:42 -07:00
Alexandros Kosiaris f0cb041885 Add back removed other repo owners in GH docker actions
In cf96ab45c1, support was added for using Docker-related GitHub
Actions in repositories of other owners. This was pretty helpful as it
allowed running modified forks off of main in a nightly fashion before
patches were pushed upstream. This was removed in 6e870cdccc; add it
back.
2024-04-06 11:31:29 -07:00
Frédéric Guillot fdd1b3f18e database: entry URLs can exceed btree index size limit 2024-04-04 20:22:23 -07:00
Frédéric Guillot 6e870cdccc ci: use docker/metadata-action instead of deprecated shell-scripts 2024-04-04 18:04:32 -07:00
Michael Kuhn 194f517be8 Improve Dockerfiles
- Specify Docker registry explicitly (e.g., Podman does not use
  `docker.io` by default)
- Use `make miniflux` instead of duplicating `go build` arguments (this
  leverages Go's PIE build mode)
- Enable cgo to fix ARM containers (we need to make sure to use the same
  OS version for both container stages to avoid libc issues)
2024-04-04 17:36:28 -07:00
dependabot[bot] 11fd1c935e Bump golang.org/x/net from 0.23.0 to 0.24.0
Bumps [golang.org/x/net](https://github.com/golang/net) from 0.23.0 to 0.24.0.
- [Commits](https://github.com/golang/net/compare/v0.23.0...v0.24.0)

---
updated-dependencies:
- dependency-name: golang.org/x/net
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-04 16:37:41 -07:00
dependabot[bot] 47e1111908 Bump golang.org/x/term from 0.18.0 to 0.19.0
Bumps [golang.org/x/term](https://github.com/golang/term) from 0.18.0 to 0.19.0.
- [Commits](https://github.com/golang/term/compare/v0.18.0...v0.19.0)

---
updated-dependencies:
- dependency-name: golang.org/x/term
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-04 16:08:16 -07:00
dependabot[bot] c5b812eb7b Bump golang.org/x/oauth2 from 0.18.0 to 0.19.0
Bumps [golang.org/x/oauth2](https://github.com/golang/oauth2) from 0.18.0 to 0.19.0.
- [Commits](https://github.com/golang/oauth2/compare/v0.18.0...v0.19.0)

---
updated-dependencies:
- dependency-name: golang.org/x/oauth2
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-04 15:46:40 -07:00
dependabot[bot] 53be550e8a Bump github.com/yuin/goldmark from 1.7.0 to 1.7.1
Bumps [github.com/yuin/goldmark](https://github.com/yuin/goldmark) from 1.7.0 to 1.7.1.
- [Release notes](https://github.com/yuin/goldmark/releases)
- [Commits](https://github.com/yuin/goldmark/compare/v1.7.0...v1.7.1)

---
updated-dependencies:
- dependency-name: github.com/yuin/goldmark
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-03 17:30:47 -07:00
dependabot[bot] d0d693a6ef Bump golang.org/x/net from 0.22.0 to 0.23.0
Bumps [golang.org/x/net](https://github.com/golang/net) from 0.22.0 to 0.23.0.
- [Commits](https://github.com/golang/net/compare/v0.22.0...v0.23.0)

---
updated-dependencies:
- dependency-name: golang.org/x/net
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-03 17:30:26 -07:00
Evan Elias Young 1b8c45d162 finder: Find feed from YouTube playlist
The feed from a YouTube playlist page is derived in practically the same way as a feed from a YouTube channel page.
2024-04-01 21:16:32 -07:00
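For context on why the two cases are nearly identical: YouTube exposes both channel and playlist feeds at the same endpoint, differing only in the query parameter name. A minimal sketch (`youtubeFeedURL` is a hypothetical helper, not the finder's actual code):

```go
package main

import (
	"fmt"
	"net/url"
)

// youtubeFeedURL builds the RSS/Atom feed URL that YouTube exposes for a
// channel or a playlist. The only difference between the two cases is the
// query parameter name ("channel_id" vs "playlist_id"), which is why the
// finder can treat playlist pages almost exactly like channel pages.
func youtubeFeedURL(kind, id string) string {
	v := url.Values{}
	v.Set(kind+"_id", id)
	return "https://www.youtube.com/feeds/videos.xml?" + v.Encode()
}

func main() {
	fmt.Println(youtubeFeedURL("playlist", "PL1234"))
}
```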
jvoisin 19ce519836 reader/rewrite: add a rule for oglaf.com
By default, Oglaf shows a disclaimer/warning about its content, and this
doesn't play well with RSS readers, so let's rewrite it to show the actual
comic instead of a placeholder.
2024-04-01 21:05:01 -07:00
Thomas J Faughnan Jr 3e0d5de7a3 api tests: use intSize-agnostic random integers
rand.Intn(math.MaxInt64) causes tests to fail on 32-bit architectures.
Use the simpler rand.Int() instead, which still provides plenty of room
for generating pseudo-random test usernames.
2024-04-01 21:02:48 -07:00
Frédéric Guillot 0336774e8c Update ChangeLog 2024-03-30 14:39:41 -07:00
Jean Khawand 756dd449cc integration/webhook: add category title to request body 2024-03-29 16:37:05 -07:00
Taylan Tatlı a0b4665080 Turkish Translation Update 2024-03-28 19:09:24 -07:00
dependabot[bot] 6592c1ad6b Bump dominikh/staticcheck-action from 1.3.0 to 1.3.1
Bumps [dominikh/staticcheck-action](https://github.com/dominikh/staticcheck-action) from 1.3.0 to 1.3.1.
- [Release notes](https://github.com/dominikh/staticcheck-action/releases)
- [Changelog](https://github.com/dominikh/staticcheck-action/blob/master/CHANGES.md)
- [Commits](https://github.com/dominikh/staticcheck-action/compare/v1.3.0...v1.3.1)

---
updated-dependencies:
- dependency-name: dominikh/staticcheck-action
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-25 17:02:21 -07:00
jvoisin f109e3207c reader/rss: don't add empty tags to RSS items
This commit adds a bunch of checks to prevent reader/rss from adding empty tags
to RSS items, along with some minor refactoring, like simplifying nested
conditions and unrolling loops.
2024-03-24 19:46:56 -07:00
Romain de Laage b54fe66809 fix: do not store empty tags 2024-03-24 14:50:18 -07:00
jvoisin 93c9d43497 http/response: get rid of the X-XSS-Protection header
It's useless at best, dangerous at worst, and shouldn't be used anymore
anywhere. See the following resources for details:

- https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-XSS-Protection
- https://chromestatus.com/feature/5021976655560704
- https://bugzilla.mozilla.org/show_bug.cgi?id=528661
- https://blogs.windows.com/windows-insider/2018/07/25/announcing-windows-10-insider-preview-build-17723-and-build-18204/
2024-03-24 13:45:38 -07:00
Frédéric Guillot e3b3c40c28 timezone: make sure the tests pass when the timezone database is not installed on the host 2024-03-24 13:25:02 -07:00
Frédéric Guillot 068790fc19 integration: fix rssbridge import 2024-03-24 12:42:29 -07:00
Frédéric Guillot 41d99c517f Update GitHub PR template 2024-03-23 14:34:03 -07:00
Frédéric Guillot 3db3f9884f cli: avoid misleading error message when creating an admin user 2024-03-23 14:32:55 -07:00
Frédéric Guillot ad1d349a0c rss: use Channel tags only if there is no Item tags 2024-03-23 13:46:48 -07:00
Jean Khawand 7ee4a731af Update miniflux.1 2024-03-21 19:59:02 -07:00
Jean Khawand 3c822a45ac Update miniflux.1
#2187  #2543
2024-03-21 19:59:02 -07:00
Frédéric Guillot c2311e316c Rename PROXY_* options to MEDIA_PROXY_* 2024-03-20 21:28:28 -07:00
jvoisin ed20771194 Enable trusted-types
This commit adds a policy and makes use of it in the Content-Security-Policy.

I've tested it the best I could, both on a modern browser supporting
trusted-types (Chrome) and on one that doesn't (Firefox).

Thanks to @lweichselbaum for giving me a hand to wrap this up!
2024-03-20 17:50:37 -07:00
jvoisin beb8c80787 Replace a bunch of `let` with `const`
According to https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/const

> Many style guides (including MDN's) recommend using const over let whenever a
variable is not reassigned in its scope. This makes the intent clear that a
variable's type (or value, in the case of a primitive) can never change.
2024-03-20 17:36:01 -07:00
jvoisin fc4bdf3ab0 Inline a one-liner function
No need to expose a symbol for this.
2024-03-20 17:21:30 -07:00
Frédéric Guillot 6bc819e198 man page: sort config options in alphabetical order 2024-03-19 22:22:24 -07:00
Frédéric Guillot 08640b27d5 Ensure enclosure URLs are always absolute 2024-03-19 21:57:46 -07:00
jvoisin 4be993e055 Minor refactoring of internal/reader/atom/atom_10_adapter.go
- Move the population of the feed's entries into a new function, to make
  `BuildFeed` easier to understand/separate concerns/implementation details
- Use `sort+compact` instead of `compact+sort` to remove duplicates
- Change `if !a { a = } if !a {a = }` constructs into `if !a { a = ; if !a {a = }}`.
  This reduces the number of comparisons and improves control-flow readability
  a tad.
2024-03-19 20:41:44 -07:00
jvoisin 9df12177eb Minor idiomatic pass on internal/http/request/context.go 2024-03-19 20:21:23 -07:00
Jean Khawand a78d1c79da Add `FILTER_ENTRY_MAX_AGE_DAYS` config option to limit fetching all feed items 2024-03-20 02:58:53 +00:00
Matt Behrens 1ea3953271 Add keyboard shortcuts for scrolling to top/bottom of the item list 2024-03-19 19:30:38 -07:00
dependabot[bot] fe8b7a907e Bump github.com/coreos/go-oidc/v3 from 3.9.0 to 3.10.0
Bumps [github.com/coreos/go-oidc/v3](https://github.com/coreos/go-oidc) from 3.9.0 to 3.10.0.
- [Release notes](https://github.com/coreos/go-oidc/releases)
- [Commits](https://github.com/coreos/go-oidc/compare/v3.9.0...v3.10.0)

---
updated-dependencies:
- dependency-name: github.com/coreos/go-oidc/v3
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-03-19 19:09:08 -07:00
Frédéric Guillot a15cdb1655 Fix regression in AbsoluteProxifyURL()
Regression introduced in commit 66b8483791

PR #2499
2024-03-18 20:48:20 -07:00
Frédéric Guillot fa9697b972 Remove trailing space in SiteURL and FeedURL 2024-03-18 17:51:06 -07:00
jvoisin 8e28e41b02 Use struct embedding to reduce code duplication 2024-03-18 16:23:44 -07:00
jvoisin e2ee74428a Minor concatenation-related simplifications in internal/storage/
Use plain string concatenation instead of
building an array and then joining it.
2024-03-18 16:20:55 -07:00
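The simplification reads roughly like this; `buildQuery` is a hypothetical example in the spirit of the storage code, not the actual function:

```go
package main

import "fmt"

// Before: collect fragments in a slice, then strings.Join them.
// After: when fragments are appended in a fixed order anyway, plain
// concatenation (or strings.Builder on hot paths) says the same thing with
// less ceremony. Hypothetical query-building example.
func buildQuery(withLimit bool) string {
	query := "SELECT id FROM entries WHERE status=$1"
	if withLimit {
		query += " LIMIT 10"
	}
	return query
}

func main() {
	fmt.Println(buildQuery(true))
}
```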
jvoisin 863a5b3648 Simplify removeDuplicates
Use a sort+compact construct instead of doing it by hand with a hashmap. The
time complexity is now O(nlogn+n) instead of O(n), and space complexity around
O(logn) instead of O(n+uniq(n)), but it shouldn't matter anyway, since
removeDuplicates is only called to deduplicate tags.
2024-03-18 16:13:32 -07:00
jvoisin 91f5522ce0 Minor simplification of internal/reader/media/media.go
- Simplify a switch-case by moving a common condition above it.
- Remove a superfluous error-check: `strconv.ParseInt` returns `0` when passed
  an empty string.
2024-03-18 16:09:32 -07:00
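The removed error-check is safe to drop because `strconv.ParseInt` returns `0` (alongside an error) for an empty string, so when `0` is the fallback value anyway the explicit empty-string guard is redundant. A sketch with a hypothetical `parseDuration` helper:

```go
package main

import (
	"fmt"
	"strconv"
)

// parseDuration ignores the error from strconv.ParseInt because the zero
// value it returns on failure ("" or garbage input) is exactly the fallback
// we want anyway, making a separate empty-string check superfluous.
func parseDuration(value string) int64 {
	n, _ := strconv.ParseInt(value, 10, 64)
	return n
}

func main() {
	fmt.Println(parseDuration(""), parseDuration("90"))
}
```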
101 changed files with 3044 additions and 1578 deletions


@@ -1,7 +1,7 @@
version: '3.8'
services:
app:
image: mcr.microsoft.com/devcontainers/go
image: mcr.microsoft.com/devcontainers/go:1.22
volumes:
- ..:/workspace:cached
command: sleep infinity
@@ -24,7 +24,7 @@ services:
ports:
- 5432:5432
apprise:
image: caronc/apprise:latest
image: caronc/apprise:1.0
restart: unless-stopped
hostname: apprise
volumes:


@@ -3,4 +3,5 @@ Do you follow the guidelines?
- [ ] I have tested my changes
- [ ] There is no breaking changes
- [ ] I really tested my changes and there is no regression
- [ ] Ideally, my commit messages use the same convention as the Go project: https://go.dev/doc/contribute#commit_messages
- [ ] I read this document: https://miniflux.app/faq.html#pull-request


@@ -8,35 +8,8 @@ on:
pull_request:
branches: [ main ]
jobs:
test-docker-images:
if: github.event.pull_request
name: Test Images
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Build Alpine image
uses: docker/build-push-action@v5
with:
context: .
file: ./packaging/docker/alpine/Dockerfile
push: false
tags: ${{ github.repository_owner }}/miniflux:alpine-dev
- name: Test Alpine Docker image
run: docker run --rm ${{ github.repository_owner }}/miniflux:alpine-dev miniflux -i
- name: Build Distroless image
uses: docker/build-push-action@v5
with:
context: .
file: ./packaging/docker/distroless/Dockerfile
push: false
tags: ${{ github.repository_owner }}/miniflux:distroless-dev
- name: Test Distroless Docker image
run: docker run --rm ${{ github.repository_owner }}/miniflux:distroless-dev miniflux -i
publish-docker-images:
if: ${{ ! github.event.pull_request }}
name: Publish Images
docker-images:
name: Docker Images
permissions:
packages: write
runs-on: ubuntu-latest
@@ -46,33 +19,31 @@ jobs:
with:
fetch-depth: 0
- name: Generate Alpine Docker tag
id: docker_alpine_tag
run: |
DOCKER_IMAGE=${{ github.repository_owner }}/miniflux
DOCKER_VERSION=dev
if [ "${{ github.event_name }}" = "schedule" ]; then
DOCKER_VERSION=nightly
TAGS="docker.io/${DOCKER_IMAGE}:${DOCKER_VERSION},ghcr.io/${DOCKER_IMAGE}:${DOCKER_VERSION},quay.io/${DOCKER_IMAGE}:${DOCKER_VERSION}"
elif [[ $GITHUB_REF == refs/tags/* ]]; then
DOCKER_VERSION=${GITHUB_REF#refs/tags/}
TAGS="docker.io/${DOCKER_IMAGE}:${DOCKER_VERSION},ghcr.io/${DOCKER_IMAGE}:${DOCKER_VERSION},quay.io/${DOCKER_IMAGE}:${DOCKER_VERSION},docker.io/${DOCKER_IMAGE}:latest,ghcr.io/${DOCKER_IMAGE}:latest,quay.io/${DOCKER_IMAGE}:latest"
fi
echo ::set-output name=tags::${TAGS}
- name: Generate Alpine Docker tags
id: docker_alpine_tags
uses: docker/metadata-action@v5
with:
images: |
docker.io/${{ github.repository_owner }}/miniflux
ghcr.io/${{ github.repository_owner }}/miniflux
quay.io/${{ github.repository_owner }}/miniflux
tags: |
type=ref,event=pr
type=schedule,pattern=nightly
type=semver,pattern={{raw}}
- name: Generate Distroless Docker tag
id: docker_distroless_tag
run: |
DOCKER_IMAGE=${{ github.repository_owner }}/miniflux
DOCKER_VERSION=dev-distroless
if [ "${{ github.event_name }}" = "schedule" ]; then
DOCKER_VERSION=nightly-distroless
TAGS="docker.io/${DOCKER_IMAGE}:${DOCKER_VERSION},ghcr.io/${DOCKER_IMAGE}:${DOCKER_VERSION},quay.io/${DOCKER_IMAGE}:${DOCKER_VERSION}"
elif [[ $GITHUB_REF == refs/tags/* ]]; then
DOCKER_VERSION=${GITHUB_REF#refs/tags/}-distroless
TAGS="docker.io/${DOCKER_IMAGE}:${DOCKER_VERSION},ghcr.io/${DOCKER_IMAGE}:${DOCKER_VERSION},quay.io/${DOCKER_IMAGE}:${DOCKER_VERSION},docker.io/${DOCKER_IMAGE}:latest-distroless,ghcr.io/${DOCKER_IMAGE}:latest-distroless,quay.io/${DOCKER_IMAGE}:latest-distroless"
fi
echo ::set-output name=tags::${TAGS}
- name: Generate Distroless Docker tags
id: docker_distroless_tags
uses: docker/metadata-action@v5
with:
images: |
docker.io/${{ github.repository_owner }}/miniflux
ghcr.io/${{ github.repository_owner }}/miniflux
quay.io/${{ github.repository_owner }}/miniflux
tags: |
type=ref,event=pr,suffix=-distroless
type=schedule,pattern=nightly,suffix=-distroless
type=semver,pattern={{raw}},suffix=-distroless
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
@@ -81,12 +52,14 @@ jobs:
uses: docker/setup-buildx-action@v3
- name: Login to DockerHub
if: github.event_name != 'pull_request'
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to GitHub Container Registry
if: github.event_name != 'pull_request'
uses: docker/login-action@v3
with:
registry: ghcr.io
@@ -94,6 +67,7 @@ jobs:
password: ${{ secrets.GITHUB_TOKEN }}
- name: Login to Quay Container Registry
if: github.event_name != 'pull_request'
uses: docker/login-action@v3
with:
registry: quay.io
@@ -106,8 +80,8 @@ jobs:
context: .
file: ./packaging/docker/alpine/Dockerfile
platforms: linux/amd64,linux/arm/v6,linux/arm/v7,linux/arm64
push: true
tags: ${{ steps.docker_alpine_tag.outputs.tags }}
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.docker_alpine_tags.outputs.tags }}
- name: Build and Push Distroless images
uses: docker/build-push-action@v5
@@ -115,5 +89,5 @@ jobs:
context: .
file: ./packaging/docker/distroless/Dockerfile
platforms: linux/amd64,linux/arm64
push: true
tags: ${{ steps.docker_distroless_tag.outputs.tags }}
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.docker_distroless_tags.outputs.tags }}


@@ -33,7 +33,7 @@ jobs:
- uses: golangci/golangci-lint-action@v4
with:
args: --timeout 10m --skip-dirs tests --disable errcheck --enable sqlclosecheck --enable misspell --enable gofmt --enable goimports --enable whitespace --enable gocritic
- uses: dominikh/staticcheck-action@v1.3.0
- uses: dominikh/staticcheck-action@v1.3.1
with:
version: "2023.1.7"
install-go: false


@@ -1,3 +1,84 @@
Version 2.1.2 (March 30, 2024)
------------------------------
* `api`: rewrite API integration tests without build tags
* `ci`: add basic ESLinter checks
* `ci`: enable go-critic linter and fix various issues detected
* `ci`: fix JavaScript linter path in GitHub Actions
* `cli`: avoid misleading error message when creating an admin user automatically
* `config`: add `FILTER_ENTRY_MAX_AGE_DAYS` option
* `config`: bump the number of simultaneous workers
* `config`: rename `PROXY_*` options to `MEDIA_PROXY_*`
* `config`: use `crypto.GenerateRandomBytes` instead of doing it by hand
* `http/request`: refactor conditions to be more idiomatic
* `http/response`: remove legacy `X-XSS-Protection` header
* `integration/rssbridge`: fix rssbridge import
* `integration/shaarli`: factorize the header+payload concatenation as data
* `integration/shaarli`: no need to base64-encode then remove the padding when we can simply encode without padding
* `integration/shaarli`: the JWT token was declared as using HS256 as algorithm, but was using HS512
* `integration/webhook`: add category title to request body
* `locale`: update Turkish translations
* `man page`: sort config options in alphabetical order
* `mediaproxy`: reduce the internal indentation of `ProxifiedUrl` by inverting some conditions
* `mediaproxy`: simplify and refactor the package
* `model`: replace `Optional{Int,Int64,Float64}` with a generic function `OptionalNumber()`
* `model`: use struct embedding for `FeedCreationRequestFromSubscriptionDiscovery` to reduce code duplication
* `reader/atom`: avoid debug message when the date is empty
* `reader/atom`: change `if !a { a = } if !a {a = }` constructs into `if !a { a = ; if !a {a = }}` to reduce the number of comparisons and improve readability
* `reader/atom`: move the population of the feed's entries into a new function, to make `BuildFeed` easier to understand/separate concerns/implementation details
* `reader/atom`: refactor Atom parser to use an adapter
* `reader/atom`: use `sort+compact` instead of `compact+sort` to remove duplicates
* `reader/atom`: when detecting the format, detect its version as well
* `reader/encoding`: inline a one-liner function
* `reader/handler`: fix force refresh feature
* `reader/json`: refactor JSON Feed parser to use an adapter
* `reader/media`: remove a superfluous error-check: `strconv.ParseInt` returns `0` when passed an empty string
* `reader/media`: simplify switch-case by moving a common condition above it
* `reader/processor`: compile block/keep regex only once per feed
* `reader/rdf`: refactor RDF parser to use an adapter
* `reader/rewrite`: inline some one-line functions
* `reader/rewrite`: simplify `removeClickbait`
* `reader/rewrite`: transform a free-standing function into a method
* `reader/rewrite`: use a proper constant instead of a magic number in `applyFuncOnTextContent`
* `reader/rss`: add support for `<media:category>` element
* `reader/rss`: don't add empty tags to RSS items
* `reader/rss`: refactor RSS parser to use a default namespace to avoid some limitations of the Go XML parser
* `reader/rss`: refactor RSS Parser to use an adapter
* `reader/rss`: remove some duplicated code in RSS parser
* `reader`: ensure that enclosure URLs are always absolute
* `reader`: move iTunes and GooglePlay XML definitions to their own packages
* `reader`: parse podcast categories
* `reader`: remove trailing space in `SiteURL` and `FeedURL`
* `storage`: do not store empty tags
* `storage`: simplify `removeDuplicates()` to use a `sort`+`compact` construct instead of doing it by hand with a hashmap
* `storage`: use plain string concatenation instead of building an array and then joining it
* `timezone`: make sure the tests pass when the timezone database is not installed on the host
* `ui/css`: align `min-width` with the other `min-width` values
* `ui/css`: fix regression: "Add to Home Screen" button is unreadable
* `ui/js`: don't use lambdas to return a function, use directly the function instead
* `ui/js`: enable trusted-types
* `ui/js`: fix download button loading label
* `ui/js`: fix JavaScript error on the login page when the user is not authenticated
* `ui/js`: inline one-line functions
* `ui/js`: inline some `querySelectorAll` calls
* `ui/js`: reduce the scope of some variables
* `ui/js`: remove a hack for "Chrome 67 and earlier" since it was released in 2018
* `ui/js`: replace `DomHelper.findParent` with `.closest`
* `ui/js`: replace `let` with `const`
* `ui/js`: simplify `DomHelper.getVisibleElements` by using a `filter` instead of a loop with an index
* `ui/js`: use a `Set` instead of an array in a `KeyboardHandler`'s member
* `ui/js`: use some ternaries where it makes sense
* `ui/static`: make use of `HashFromBytes` everywhere
* `ui/static`: set minifier ECMAScript version
* `ui`: add keyboard shortcuts for scrolling to top/bottom of the item list
* `ui`: add media player control playback speed
* `ui`: remove unused variables and improve JSON decoding in `saveEnclosureProgression()`
* `validator`: display an error message on edit feed page when the feed URL is not unique
* Bump `github.com/coreos/go-oidc/v3` from `3.9.0` to `3.10.0`
* Bump `github.com/go-webauthn/webauthn` from `0.10.1` to `0.10.2`
* Bump `github.com/tdewolff/minify/v2` from `2.20.18` to `2.20.19`
* Bump `google.golang.org/protobuf` from `1.32.0` to `1.33.0`
Version 2.1.1 (March 10, 2024)
-----------------------------


@@ -101,7 +101,7 @@ windows-x86:
@ GOOS=windows GOARCH=386 go build -ldflags=$(LD_FLAGS) -o $(APP)-$@.exe main.go
run:
@ LOG_DATE_TIME=1 DEBUG=1 RUN_MIGRATIONS=1 CREATE_ADMIN=1 ADMIN_USERNAME=admin ADMIN_PASSWORD=test123 go run main.go
@ LOG_DATE_TIME=1 LOG_LEVEL=debug RUN_MIGRATIONS=1 CREATE_ADMIN=1 ADMIN_USERNAME=admin ADMIN_PASSWORD=test123 go run main.go
clean:
@ rm -f $(APP)-* $(APP) $(APP)*.rpm $(APP)*.deb $(APP)*.exe

go.mod

@@ -5,17 +5,19 @@ module miniflux.app/v2
require (
github.com/PuerkitoBio/goquery v1.9.1
github.com/abadojack/whatlanggo v1.0.1
github.com/coreos/go-oidc/v3 v3.9.0
github.com/andybalholm/brotli v1.1.0
github.com/coreos/go-oidc/v3 v3.10.0
github.com/go-webauthn/webauthn v0.10.2
github.com/gorilla/mux v1.8.1
github.com/lib/pq v1.10.9
github.com/prometheus/client_golang v1.19.0
github.com/tdewolff/minify/v2 v2.20.19
github.com/yuin/goldmark v1.7.0
golang.org/x/crypto v0.21.0
golang.org/x/net v0.22.0
golang.org/x/oauth2 v0.18.0
golang.org/x/term v0.18.0
github.com/yuin/goldmark v1.7.1
golang.org/x/crypto v0.22.0
golang.org/x/net v0.24.0
golang.org/x/oauth2 v0.19.0
golang.org/x/term v0.19.0
golang.org/x/text v0.14.0
mvdan.cc/xurls/v2 v2.5.0
)
@@ -30,8 +32,7 @@ require (
github.com/beorn7/perks v1.0.1 // indirect
github.com/cespare/xxhash/v2 v2.2.0 // indirect
github.com/fxamacker/cbor/v2 v2.6.0 // indirect
github.com/go-jose/go-jose/v3 v3.0.3 // indirect
github.com/golang/protobuf v1.5.3 // indirect
github.com/go-jose/go-jose/v4 v4.0.1 // indirect
github.com/google/uuid v1.6.0 // indirect
github.com/mitchellh/mapstructure v1.5.0 // indirect
github.com/prometheus/client_model v0.5.0 // indirect
@@ -39,9 +40,7 @@ require (
github.com/prometheus/procfs v0.12.0 // indirect
github.com/tdewolff/parse/v2 v2.7.12 // indirect
github.com/x448/float16 v0.8.4 // indirect
golang.org/x/sys v0.18.0 // indirect
golang.org/x/text v0.14.0 // indirect
google.golang.org/appengine v1.6.8 // indirect
golang.org/x/sys v0.19.0 // indirect
google.golang.org/protobuf v1.33.0 // indirect
)

go.sum

@@ -2,33 +2,28 @@ github.com/PuerkitoBio/goquery v1.9.1 h1:mTL6XjbJTZdpfL+Gwl5U2h1l9yEkJjhmlTeV9VP
github.com/PuerkitoBio/goquery v1.9.1/go.mod h1:cW1n6TmIMDoORQU5IU/P1T3tGFunOeXEpGP2WHRwkbY=
github.com/abadojack/whatlanggo v1.0.1 h1:19N6YogDnf71CTHm3Mp2qhYfkRdyvbgwWdd2EPxJRG4=
github.com/abadojack/whatlanggo v1.0.1/go.mod h1:66WiQbSbJBIlOZMsvbKe5m6pzQovxCH9B/K8tQB2uoc=
github.com/andybalholm/brotli v1.1.0 h1:eLKJA0d02Lf0mVpIDgYnqXcUn0GqVmEFny3VuID1U3M=
github.com/andybalholm/brotli v1.1.0/go.mod h1:sms7XGricyQI9K10gOSf56VKKWS4oLer58Q+mhRPtnY=
github.com/andybalholm/cascadia v1.3.2 h1:3Xi6Dw5lHF15JtdcmAHD3i1+T8plmv7BQ/nsViSLyss=
github.com/andybalholm/cascadia v1.3.2/go.mod h1:7gtRlve5FxPPgIgX36uWBX58OdBsSS6lUvCFb+h7KvU=
github.com/beorn7/perks v1.0.1 h1:VlbKKnNfV8bJzeqoa4cOKqO6bYr3WgKZxO8Z16+hsOM=
github.com/beorn7/perks v1.0.1/go.mod h1:G2ZrVWU2WbWT9wwq4/hrbKbnv/1ERSJQ0ibhJ6rlkpw=
github.com/cespare/xxhash/v2 v2.2.0 h1:DC2CZ1Ep5Y4k3ZQ899DldepgrayRUGE6BBZ/cd9Cj44=
github.com/cespare/xxhash/v2 v2.2.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/coreos/go-oidc/v3 v3.9.0 h1:0J/ogVOd4y8P0f0xUh8l9t07xRP/d8tccvjHl2dcsSo=
github.com/coreos/go-oidc/v3 v3.9.0/go.mod h1:rTKz2PYwftcrtoCzV5g5kvfJoWcm0Mk8AF8y1iAQro4=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/coreos/go-oidc/v3 v3.10.0 h1:tDnXHnLyiTVyT/2zLDGj09pFPkhND8Gl8lnTRhoEaJU=
github.com/coreos/go-oidc/v3 v3.10.0/go.mod h1:5j11xcw0D3+SGxn6Z/WFADsgcWVMyNAlSQupk0KK3ac=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/fxamacker/cbor/v2 v2.6.0 h1:sU6J2usfADwWlYDAFhZBQ6TnLFBHxgesMrQfQgk1tWA=
github.com/fxamacker/cbor/v2 v2.6.0/go.mod h1:pxXPTn3joSm21Gbwsv0w9OSA2y1HFR9qXEeXQVeNoDQ=
github.com/go-jose/go-jose/v3 v3.0.3 h1:fFKWeig/irsp7XD2zBxvnmA/XaRWp5V3CBsZXJF7G7k=
github.com/go-jose/go-jose/v3 v3.0.3/go.mod h1:5b+7YgP7ZICgJDBdfjZaIt+H/9L9T/YQrVfLAMboGkQ=
github.com/go-jose/go-jose/v4 v4.0.1 h1:QVEPDE3OluqXBQZDcnNvQrInro2h0e4eqNbnZSWqS6U=
github.com/go-jose/go-jose/v4 v4.0.1/go.mod h1:WVf9LFMHh/QVrmqrOfqun0C45tMe3RoiKJMPvgWwLfY=
github.com/go-webauthn/webauthn v0.10.2 h1:OG7B+DyuTytrEPFmTX503K77fqs3HDK/0Iv+z8UYbq4=
github.com/go-webauthn/webauthn v0.10.2/go.mod h1:Gd1IDsGAybuvK1NkwUTLbGmeksxuRJjVN2PE/xsPxHs=
github.com/go-webauthn/x v0.1.9 h1:v1oeLmoaa+gPOaZqUdDentu6Rl7HkSSsmOT6gxEQHhE=
github.com/go-webauthn/x v0.1.9/go.mod h1:pJNMlIMP1SU7cN8HNlKJpLEnFHCygLCvaLZ8a1xeoQA=
github.com/golang-jwt/jwt/v5 v5.2.1 h1:OuVbFODueb089Lh128TAcimifWaLhJwVflnrgM17wHk=
github.com/golang-jwt/jwt/v5 v5.2.1/go.mod h1:pqrtFR0X4osieyHYxtmOUWsAWrfe1Q5UVIyoH402zdk=
github.com/golang/protobuf v1.5.0/go.mod h1:FsONVRAS9T7sI+LIUmWTfcYkHO4aIWwzhcaSAoJOfIk=
github.com/golang/protobuf v1.5.2/go.mod h1:XVQd3VNwM+JqD3oG2Ue2ip4fOMUkwXdXDdiuN0vRsmY=
github.com/golang/protobuf v1.5.3 h1:KhyjKVUg7Usr/dYsdSqoFveMYd5ko72D+zANwlG1mmg=
github.com/golang/protobuf v1.5.3/go.mod h1:XVQd3VNwM+JqD3oG2Ue2ip4fOMUkwXdXDdiuN0vRsmY=
github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.9/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-cmp v0.6.0 h1:ofyhxvXcZhMsU5ulbFiLKl/XBFqE1GSq7atu8tAmTRI=
github.com/google/go-cmp v0.6.0/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-tpm v0.9.0 h1:sQF6YqWMi+SCXpsmS3fd21oPy/vSddwZry4JnmltHVk=
@@ -51,8 +46,6 @@ github.com/prometheus/common v0.48.0 h1:QO8U2CdOzSn1BBsmXJXduaaW+dY/5QLjfB8svtSz
github.com/prometheus/common v0.48.0/go.mod h1:0/KsvlIEfPQCQ5I2iNSAWKPZziNCvRs5EC6ILDTlAPc=
github.com/prometheus/procfs v0.12.0 h1:jluTpSng7V9hY0O2R9DzzJHYb2xULk9VTR1V1R/k6Bo=
github.com/prometheus/procfs v0.12.0/go.mod h1:pcuDEFsWDnvcgNzo4EEweacyhjeA9Zk3cnaOZAZEfOo=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.9.0 h1:HtqpIVDClZ4nwg75+f6Lvsy/wHu+3BoSGCbBAcpTsTg=
github.com/stretchr/testify v1.9.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
github.com/tdewolff/minify/v2 v2.20.19 h1:tX0SR0LUrIqGoLjXnkIzRSIbKJ7PaNnSENLD4CyH6Xo=
@@ -65,13 +58,12 @@ github.com/tdewolff/test v1.0.11-0.20240106005702-7de5f7df4739/go.mod h1:XPuWBzv
github.com/x448/float16 v0.8.4 h1:qLwI1I70+NjRFUR3zs1JPUCgaCXSh3SW62uAKT1mSBM=
github.com/x448/float16 v0.8.4/go.mod h1:14CWIYCyZA/cWjXOioeEpHeN/83MdbZDRQHoFcYsOfg=
github.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=
github.com/yuin/goldmark v1.7.0 h1:EfOIvIMZIzHdB/R/zVrikYLPPwJlfMcNczJFMs1m6sA=
github.com/yuin/goldmark v1.7.0/go.mod h1:uzxRWxtg69N339t3louHJ7+O03ezfj6PlliRlaOzY1E=
github.com/yuin/goldmark v1.7.1 h1:3bajkSilaCbjdKVsKdZjZCLBNPL9pYzrCakKaf4U49U=
github.com/yuin/goldmark v1.7.1/go.mod h1:uzxRWxtg69N339t3louHJ7+O03ezfj6PlliRlaOzY1E=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.19.0/go.mod h1:Iy9bg/ha4yyC70EfRS8jz+B6ybOBKMaSxLj6P6oBDfU=
golang.org/x/crypto v0.21.0 h1:X31++rzVUdKhX5sWmSOFZxx8UW/ldWx55cbf08iNAMA=
golang.org/x/crypto v0.21.0/go.mod h1:0BP7YvVV9gBbVKyeTG0Gyn+gZm94bibOW5BjDEYAOMs=
golang.org/x/crypto v0.22.0 h1:g1v0xeRhjcugydODzvb3mEM9SQ0HGp9s/nh3COQ/C30=
golang.org/x/crypto v0.22.0/go.mod h1:vr6Su+7cTlO45qkww3VDJlzDn0ctJvRgYbC2NvXHt+M=
golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
golang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
@@ -79,11 +71,10 @@ golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v
golang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=
golang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
golang.org/x/net v0.9.0/go.mod h1:d48xBJpPfHeWQsugry2m+kC02ZBRGRgulfHnEXEuWns=
golang.org/x/net v0.10.0/go.mod h1:0qNGK6F8kojg2nk9dLZ2mShWaEBan6FAoqfSigmmuDg=
golang.org/x/net v0.22.0 h1:9sGLhx7iRIHEiX0oAJ3MRZMUCElJgy7Br1nO+AMN3Tc=
golang.org/x/net v0.22.0/go.mod h1:JKghWKKOSdJwpW2GEx0Ja7fmaKnMsbu+MWVZTokSYmg=
golang.org/x/oauth2 v0.18.0 h1:09qnuIAgzdx1XplqJvW6CQqMCtGZykZWcXzPMPUusvI=
golang.org/x/oauth2 v0.18.0/go.mod h1:Wf7knwG0MPoWIMMBgFlEaSUDaKskp0dCfrlJRJXbBi8=
golang.org/x/net v0.24.0 h1:1PcaxkF854Fu3+lvBIx5SYn9wRlBzzcnHZSiaFFAb0w=
golang.org/x/net v0.24.0/go.mod h1:2Q7sJY5mzlzWjKtYUEXSlBWCdyaioyXzRB2RtU8KVE8=
golang.org/x/oauth2 v0.19.0 h1:9+E/EZBCbTLNrbN35fHv/a/d/mOBatymz1zbtQrXpIg=
golang.org/x/oauth2 v0.19.0/go.mod h1:vYi7skDa1x015PmRRYZ7+s1cWyPgrPiSYRe4rnsexc8=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
@ -94,22 +85,17 @@ golang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBc
golang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.7.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.8.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.17.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.18.0 h1:DBdB3niSjOA/O0blCZBqDefyWNYveAYMNF1Wum0DYQ4=
golang.org/x/sys v0.18.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.19.0 h1:q5f1RH2jigJ1MoAWp2KTp3gm5zAGFUTarQZ5U386+4o=
golang.org/x/sys v0.19.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
golang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=
golang.org/x/term v0.7.0/go.mod h1:P32HKFT3hSsZrRxla30E9HqToFYAQPCMs/zFMBUFqPY=
golang.org/x/term v0.8.0/go.mod h1:xPskH00ivmX89bAKVGSKKtLOWNx2+17Eiy94tnKShWo=
golang.org/x/term v0.17.0/go.mod h1:lLRBjIVuehSbZlaOtGMbcMncT+aqLLLmKrsjNrUguwk=
golang.org/x/term v0.18.0 h1:FcHjZXDMxI8mM3nwhX9HlKop4C0YQvCVCdwYl2wOtE8=
golang.org/x/term v0.18.0/go.mod h1:ILwASektA3OnRv7amZ1xhE/KTR+u50pbXfZ03+6Nx58=
golang.org/x/term v0.19.0 h1:+ThwsDv+tYfnJFhF4L8jITxu1tdTWRTZpdsWgEgjL6Q=
golang.org/x/term v0.19.0/go.mod h1:2CuTdWZ7KHSQwUzKva0cbMg6q2DMI3Mmxp+gKJbskEk=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
golang.org/x/text v0.3.8/go.mod h1:E6s5w1FMmriuDzIBO73fBruAKo1PCIq6d2Q6DHfQ8WQ=
golang.org/x/text v0.7.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
golang.org/x/text v0.9.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=
golang.org/x/text v0.14.0 h1:ScX5w1eTa3QqT8oi6+ziP7dTV1S2+ALU0bI+0zXKWiQ=
@ -119,15 +105,8 @@ golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtn
golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
golang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
google.golang.org/appengine v1.6.8 h1:IhEN5q69dyKagZPYMSdIjS2HqprW324FRQZJcGqPAsM=
google.golang.org/appengine v1.6.8/go.mod h1:1jJ3jBArFh5pcgW8gCtRJnepW8FzD1V44FJffLiz/Ds=
google.golang.org/protobuf v1.26.0-rc.1/go.mod h1:jlhhOSvTdKEhbULTjvd4ARK9grFBp09yW+WbY/TyQbw=
google.golang.org/protobuf v1.26.0/go.mod h1:9q0QmTI4eRPtz6boOQmLYwt+qCgq0jsYwAQnmE0givc=
google.golang.org/protobuf v1.33.0 h1:uNO2rsAINq/JlFpSdYEKIZ0uKD/R9cpdv0T+yoGwGmI=
google.golang.org/protobuf v1.33.0/go.mod h1:c6P6GXX6sHbq/GpV6MGZEdwhWPcYBgnhAHhKbcUYpos=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
mvdan.cc/xurls/v2 v2.5.0 h1:lyBNOm8Wo71UknhUs4QTFUNNMyxy2JEIaKKo0RWOh+8=


@ -8,7 +8,6 @@ import (
"errors"
"fmt"
"io"
"math"
"math/rand"
"os"
"strings"
@ -58,7 +57,7 @@ func (c *integrationTestConfig) isConfigured() bool {
}
func (c *integrationTestConfig) genRandomUsername() string {
return fmt.Sprintf("%s_%10d", c.testRegularUsername, rand.Intn(math.MaxInt64))
return fmt.Sprintf("%s_%10d", c.testRegularUsername, rand.Int())
}
func TestIncorrectEndpoint(t *testing.T) {


@ -15,8 +15,8 @@ import (
"miniflux.app/v2/internal/http/request"
"miniflux.app/v2/internal/http/response/json"
"miniflux.app/v2/internal/integration"
"miniflux.app/v2/internal/mediaproxy"
"miniflux.app/v2/internal/model"
"miniflux.app/v2/internal/proxy"
"miniflux.app/v2/internal/reader/processor"
"miniflux.app/v2/internal/reader/readingtime"
"miniflux.app/v2/internal/storage"
@ -36,14 +36,14 @@ func (h *handler) getEntryFromBuilder(w http.ResponseWriter, r *http.Request, b
return
}
entry.Content = proxy.AbsoluteProxyRewriter(h.router, r.Host, entry.Content)
proxyOption := config.Opts.ProxyOption()
entry.Content = mediaproxy.RewriteDocumentWithAbsoluteProxyURL(h.router, r.Host, entry.Content)
proxyOption := config.Opts.MediaProxyMode()
for i := range entry.Enclosures {
if proxyOption == "all" || proxyOption != "none" && !urllib.IsHTTPS(entry.Enclosures[i].URL) {
for _, mediaType := range config.Opts.ProxyMediaTypes() {
for _, mediaType := range config.Opts.MediaProxyResourceTypes() {
if strings.HasPrefix(entry.Enclosures[i].MimeType, mediaType+"/") {
entry.Enclosures[i].URL = proxy.AbsoluteProxifyURL(h.router, r.Host, entry.Enclosures[i].URL)
entry.Enclosures[i].URL = mediaproxy.ProxifyAbsoluteURL(h.router, r.Host, entry.Enclosures[i].URL)
break
}
}
@ -164,7 +164,7 @@ func (h *handler) findEntries(w http.ResponseWriter, r *http.Request, feedID int
}
for i := range entries {
entries[i].Content = proxy.AbsoluteProxyRewriter(h.router, r.Host, entries[i].Content)
entries[i].Content = mediaproxy.RewriteDocumentWithAbsoluteProxyURL(h.router, r.Host, entries[i].Content)
}
json.OK(w, r, &entriesResponse{Total: count, Entries: entries})
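The enclosure-rewriting loop in the handler above decides, per enclosure, whether its URL goes through the media proxy. A minimal self-contained sketch of that decision follows; `shouldProxify` is a hypothetical name and the HTTPS check is simplified to a string-prefix test, so this is an illustration of the condition, not the actual handler code:

```go
package main

import (
	"fmt"
	"strings"
)

// shouldProxify restates the condition used in the handler: proxify when
// the mode is "all", or when the mode is not "none" and the enclosure URL
// is plain HTTP -- and only for MIME types whose top-level type is in the
// configured list.
func shouldProxify(mode, mimeType, rawURL string, mediaTypes []string) bool {
	isHTTPS := strings.HasPrefix(rawURL, "https://")
	if mode != "all" && (mode == "none" || isHTTPS) {
		return false
	}
	for _, mediaType := range mediaTypes {
		if strings.HasPrefix(mimeType, mediaType+"/") {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(shouldProxify("http-only", "image/png", "http://example.org/a.png", []string{"image"}))  // true
	fmt.Println(shouldProxify("http-only", "image/png", "https://example.org/a.png", []string{"image"})) // false
}
```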


@ -16,7 +16,7 @@ func askCredentials() (string, string) {
fd := int(os.Stdin.Fd())
if !term.IsTerminal(fd) {
printErrorAndExit(fmt.Errorf("this is not a terminal, exiting"))
printErrorAndExit(fmt.Errorf("this is not an interactive terminal, exiting"))
}
fmt.Print("Enter Username: ")


@ -23,7 +23,7 @@ const (
flagVersionHelp = "Show application version"
flagMigrateHelp = "Run SQL migrations"
flagFlushSessionsHelp = "Flush all sessions (disconnect users)"
flagCreateAdminHelp = "Create admin user"
flagCreateAdminHelp = "Create an admin user from an interactive terminal"
flagResetPasswordHelp = "Reset user password"
flagResetFeedErrorsHelp = "Clear all feed errors for all users"
flagDebugModeHelp = "Show debug logs"
@ -191,7 +191,7 @@ func Parse() {
}
if flagCreateAdmin {
createAdmin(store)
createAdminUserFromInteractiveTerminal(store)
return
}
@ -211,9 +211,8 @@ func Parse() {
printErrorAndExit(err)
}
// Create admin user and start the daemon.
if config.Opts.CreateAdmin() {
createAdmin(store)
createAdminUserFromEnvironmentVariables(store)
}
if flagRefreshFeeds {


@ -12,15 +12,20 @@ import (
"miniflux.app/v2/internal/validator"
)
func createAdmin(store *storage.Storage) {
userCreationRequest := &model.UserCreationRequest{
Username: config.Opts.AdminUsername(),
Password: config.Opts.AdminPassword(),
IsAdmin: true,
}
func createAdminUserFromEnvironmentVariables(store *storage.Storage) {
createAdminUser(store, config.Opts.AdminUsername(), config.Opts.AdminPassword())
}
if userCreationRequest.Username == "" || userCreationRequest.Password == "" {
userCreationRequest.Username, userCreationRequest.Password = askCredentials()
func createAdminUserFromInteractiveTerminal(store *storage.Storage) {
username, password := askCredentials()
createAdminUser(store, username, password)
}
func createAdminUser(store *storage.Storage, username, password string) {
userCreationRequest := &model.UserCreationRequest{
Username: username,
Password: password,
IsAdmin: true,
}
if store.UserExists(userCreationRequest.Username) {
@ -34,7 +39,12 @@ func createAdmin(store *storage.Storage) {
printErrorAndExit(validationErr.Error())
}
if _, err := store.CreateUser(userCreationRequest); err != nil {
if user, err := store.CreateUser(userCreationRequest); err != nil {
printErrorAndExit(err)
} else {
slog.Info("Created new admin user",
slog.String("username", user.Username),
slog.Int64("user_id", user.ID),
)
}
}


@ -4,6 +4,7 @@
package config // import "miniflux.app/v2/internal/config"
import (
"bytes"
"os"
"testing"
)
@ -1442,9 +1443,9 @@ func TestPocketConsumerKeyFromUserPrefs(t *testing.T) {
}
}
func TestProxyOption(t *testing.T) {
func TestMediaProxyMode(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_OPTION", "all")
os.Setenv("MEDIA_PROXY_MODE", "all")
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
@ -1453,14 +1454,14 @@ func TestProxyOption(t *testing.T) {
}
expected := "all"
result := opts.ProxyOption()
result := opts.MediaProxyMode()
if result != expected {
t.Fatalf(`Unexpected PROXY_OPTION value, got %q instead of %q`, result, expected)
t.Fatalf(`Unexpected MEDIA_PROXY_MODE value, got %q instead of %q`, result, expected)
}
}
func TestDefaultProxyOptionValue(t *testing.T) {
func TestDefaultMediaProxyModeValue(t *testing.T) {
os.Clearenv()
parser := NewParser()
@ -1469,17 +1470,17 @@ func TestDefaultProxyOptionValue(t *testing.T) {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := defaultProxyOption
result := opts.ProxyOption()
expected := defaultMediaProxyMode
result := opts.MediaProxyMode()
if result != expected {
t.Fatalf(`Unexpected PROXY_OPTION value, got %q instead of %q`, result, expected)
t.Fatalf(`Unexpected MEDIA_PROXY_MODE value, got %q instead of %q`, result, expected)
}
}
func TestProxyMediaTypes(t *testing.T) {
func TestMediaProxyResourceTypes(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_MEDIA_TYPES", "image,audio")
os.Setenv("MEDIA_PROXY_RESOURCE_TYPES", "image,audio")
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
@ -1489,25 +1490,25 @@ func TestProxyMediaTypes(t *testing.T) {
expected := []string{"audio", "image"}
if len(expected) != len(opts.ProxyMediaTypes()) {
t.Fatalf(`Unexpected PROXY_MEDIA_TYPES value, got %v instead of %v`, opts.ProxyMediaTypes(), expected)
if len(expected) != len(opts.MediaProxyResourceTypes()) {
t.Fatalf(`Unexpected MEDIA_PROXY_RESOURCE_TYPES value, got %v instead of %v`, opts.MediaProxyResourceTypes(), expected)
}
resultMap := make(map[string]bool)
for _, mediaType := range opts.ProxyMediaTypes() {
for _, mediaType := range opts.MediaProxyResourceTypes() {
resultMap[mediaType] = true
}
for _, mediaType := range expected {
if !resultMap[mediaType] {
t.Fatalf(`Unexpected PROXY_MEDIA_TYPES value, got %v instead of %v`, opts.ProxyMediaTypes(), expected)
t.Fatalf(`Unexpected MEDIA_PROXY_RESOURCE_TYPES value, got %v instead of %v`, opts.MediaProxyResourceTypes(), expected)
}
}
}
func TestProxyMediaTypesWithDuplicatedValues(t *testing.T) {
func TestMediaProxyResourceTypesWithDuplicatedValues(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_MEDIA_TYPES", "image,audio, image")
os.Setenv("MEDIA_PROXY_RESOURCE_TYPES", "image,audio, image")
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
@ -1516,23 +1517,119 @@ func TestProxyMediaTypesWithDuplicatedValues(t *testing.T) {
}
expected := []string{"audio", "image"}
if len(expected) != len(opts.ProxyMediaTypes()) {
t.Fatalf(`Unexpected PROXY_MEDIA_TYPES value, got %v instead of %v`, opts.ProxyMediaTypes(), expected)
if len(expected) != len(opts.MediaProxyResourceTypes()) {
t.Fatalf(`Unexpected MEDIA_PROXY_RESOURCE_TYPES value, got %v instead of %v`, opts.MediaProxyResourceTypes(), expected)
}
resultMap := make(map[string]bool)
for _, mediaType := range opts.ProxyMediaTypes() {
for _, mediaType := range opts.MediaProxyResourceTypes() {
resultMap[mediaType] = true
}
for _, mediaType := range expected {
if !resultMap[mediaType] {
t.Fatalf(`Unexpected PROXY_MEDIA_TYPES value, got %v instead of %v`, opts.ProxyMediaTypes(), expected)
t.Fatalf(`Unexpected MEDIA_PROXY_RESOURCE_TYPES value, got %v instead of %v`, opts.MediaProxyResourceTypes(), expected)
}
}
}
func TestProxyImagesOptionBackwardCompatibility(t *testing.T) {
func TestDefaultMediaProxyResourceTypes(t *testing.T) {
os.Clearenv()
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := []string{"image"}
if len(expected) != len(opts.MediaProxyResourceTypes()) {
t.Fatalf(`Unexpected MEDIA_PROXY_RESOURCE_TYPES value, got %v instead of %v`, opts.MediaProxyResourceTypes(), expected)
}
resultMap := make(map[string]bool)
for _, mediaType := range opts.MediaProxyResourceTypes() {
resultMap[mediaType] = true
}
for _, mediaType := range expected {
if !resultMap[mediaType] {
t.Fatalf(`Unexpected MEDIA_PROXY_RESOURCE_TYPES value, got %v instead of %v`, opts.MediaProxyResourceTypes(), expected)
}
}
}
func TestMediaProxyHTTPClientTimeout(t *testing.T) {
os.Clearenv()
os.Setenv("MEDIA_PROXY_HTTP_CLIENT_TIMEOUT", "24")
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := 24
result := opts.MediaProxyHTTPClientTimeout()
if result != expected {
t.Fatalf(`Unexpected MEDIA_PROXY_HTTP_CLIENT_TIMEOUT value, got %d instead of %d`, result, expected)
}
}
func TestDefaultMediaProxyHTTPClientTimeoutValue(t *testing.T) {
os.Clearenv()
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := defaultMediaProxyHTTPClientTimeout
result := opts.MediaProxyHTTPClientTimeout()
if result != expected {
t.Fatalf(`Unexpected MEDIA_PROXY_HTTP_CLIENT_TIMEOUT value, got %d instead of %d`, result, expected)
}
}
func TestMediaProxyCustomURL(t *testing.T) {
os.Clearenv()
os.Setenv("MEDIA_PROXY_CUSTOM_URL", "http://example.org/proxy")
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := "http://example.org/proxy"
result := opts.MediaCustomProxyURL()
if result != expected {
t.Fatalf(`Unexpected MEDIA_PROXY_CUSTOM_URL value, got %q instead of %q`, result, expected)
}
}
func TestMediaProxyPrivateKey(t *testing.T) {
os.Clearenv()
os.Setenv("MEDIA_PROXY_PRIVATE_KEY", "foobar")
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := []byte("foobar")
result := opts.MediaProxyPrivateKey()
if !bytes.Equal(result, expected) {
t.Fatalf(`Unexpected MEDIA_PROXY_PRIVATE_KEY value, got %q instead of %q`, result, expected)
}
}
func TestProxyImagesOptionForBackwardCompatibility(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_IMAGES", "all")
@ -1543,30 +1640,31 @@ func TestProxyImagesOptionBackwardCompatibility(t *testing.T) {
}
expected := []string{"image"}
if len(expected) != len(opts.ProxyMediaTypes()) {
t.Fatalf(`Unexpected PROXY_MEDIA_TYPES value, got %v instead of %v`, opts.ProxyMediaTypes(), expected)
if len(expected) != len(opts.MediaProxyResourceTypes()) {
t.Fatalf(`Unexpected PROXY_IMAGES value, got %v instead of %v`, opts.MediaProxyResourceTypes(), expected)
}
resultMap := make(map[string]bool)
for _, mediaType := range opts.ProxyMediaTypes() {
for _, mediaType := range opts.MediaProxyResourceTypes() {
resultMap[mediaType] = true
}
for _, mediaType := range expected {
if !resultMap[mediaType] {
t.Fatalf(`Unexpected PROXY_MEDIA_TYPES value, got %v instead of %v`, opts.ProxyMediaTypes(), expected)
t.Fatalf(`Unexpected PROXY_IMAGES value, got %v instead of %v`, opts.MediaProxyResourceTypes(), expected)
}
}
expectedProxyOption := "all"
result := opts.ProxyOption()
result := opts.MediaProxyMode()
if result != expectedProxyOption {
t.Fatalf(`Unexpected PROXY_OPTION value, got %q instead of %q`, result, expectedProxyOption)
}
}
func TestDefaultProxyMediaTypes(t *testing.T) {
func TestProxyImageURLForBackwardCompatibility(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_IMAGE_URL", "http://example.org/proxy")
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
@ -1574,25 +1672,73 @@ func TestDefaultProxyMediaTypes(t *testing.T) {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := []string{"image"}
expected := "http://example.org/proxy"
result := opts.MediaCustomProxyURL()
if result != expected {
t.Fatalf(`Unexpected PROXY_IMAGE_URL value, got %q instead of %q`, result, expected)
}
}
if len(expected) != len(opts.ProxyMediaTypes()) {
t.Fatalf(`Unexpected PROXY_MEDIA_TYPES value, got %v instead of %v`, opts.ProxyMediaTypes(), expected)
func TestProxyURLOptionForBackwardCompatibility(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_URL", "http://example.org/proxy")
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := "http://example.org/proxy"
result := opts.MediaCustomProxyURL()
if result != expected {
t.Fatalf(`Unexpected PROXY_URL value, got %q instead of %q`, result, expected)
}
}
func TestProxyMediaTypesOptionForBackwardCompatibility(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_MEDIA_TYPES", "image,audio")
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := []string{"audio", "image"}
if len(expected) != len(opts.MediaProxyResourceTypes()) {
t.Fatalf(`Unexpected PROXY_MEDIA_TYPES value, got %v instead of %v`, opts.MediaProxyResourceTypes(), expected)
}
resultMap := make(map[string]bool)
for _, mediaType := range opts.ProxyMediaTypes() {
for _, mediaType := range opts.MediaProxyResourceTypes() {
resultMap[mediaType] = true
}
for _, mediaType := range expected {
if !resultMap[mediaType] {
t.Fatalf(`Unexpected PROXY_MEDIA_TYPES value, got %v instead of %v`, opts.ProxyMediaTypes(), expected)
t.Fatalf(`Unexpected PROXY_MEDIA_TYPES value, got %v instead of %v`, opts.MediaProxyResourceTypes(), expected)
}
}
}
func TestProxyHTTPClientTimeout(t *testing.T) {
func TestProxyOptionForBackwardCompatibility(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_OPTION", "all")
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := "all"
result := opts.MediaProxyMode()
if result != expected {
t.Fatalf(`Unexpected PROXY_OPTION value, got %q instead of %q`, result, expected)
}
}
func TestProxyHTTPClientTimeoutOptionForBackwardCompatibility(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_HTTP_CLIENT_TIMEOUT", "24")
@ -1601,29 +1747,26 @@ func TestProxyHTTPClientTimeout(t *testing.T) {
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := 24
result := opts.ProxyHTTPClientTimeout()
result := opts.MediaProxyHTTPClientTimeout()
if result != expected {
t.Fatalf(`Unexpected PROXY_HTTP_CLIENT_TIMEOUT value, got %d instead of %d`, result, expected)
}
}
func TestDefaultProxyHTTPClientTimeoutValue(t *testing.T) {
func TestProxyPrivateKeyOptionForBackwardCompatibility(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_PRIVATE_KEY", "foobar")
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := defaultProxyHTTPClientTimeout
result := opts.ProxyHTTPClientTimeout()
if result != expected {
t.Fatalf(`Unexpected PROXY_HTTP_CLIENT_TIMEOUT value, got %d instead of %d`, result, expected)
expected := []byte("foobar")
result := opts.MediaProxyPrivateKey()
if !bytes.Equal(result, expected) {
t.Fatalf(`Unexpected PROXY_PRIVATE_KEY value, got %q instead of %q`, result, expected)
}
}
@ -1966,3 +2109,21 @@ func TestParseConfigDumpOutput(t *testing.T) {
t.Fatal(err)
}
}
func TestContentSecurityPolicy(t *testing.T) {
os.Clearenv()
os.Setenv("CONTENT_SECURITY_POLICY", "default-src 'self' fonts.googleapis.com fonts.gstatic.com; img-src * data:; media-src *; frame-src *; style-src 'self' fonts.googleapis.com fonts.gstatic.com 'nonce-%s'")
parser := NewParser()
opts, err := parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
expected := "default-src 'self' fonts.googleapis.com fonts.gstatic.com; img-src * data:; media-src *; frame-src *; style-src 'self' fonts.googleapis.com fonts.gstatic.com 'nonce-%s'"
result := opts.ContentSecurityPolicy()
if result != expected {
t.Fatalf(`Unexpected CONTENT_SECURITY_POLICY value, got %v instead of %v`, result, expected)
}
}
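The `CONTENT_SECURITY_POLICY` value tested above contains a `%s` placeholder; presumably it is filled with a per-request nonce when the page is rendered. A minimal sketch of that substitution, assuming a plain `fmt.Sprintf` expansion; `renderCSP` is a hypothetical helper, not Miniflux's actual rendering code:

```go
package main

import "fmt"

// renderCSP fills the %s placeholder in the configured policy template
// with a per-request nonce before the policy is emitted.
func renderCSP(policyTemplate, nonce string) string {
	return fmt.Sprintf(policyTemplate, nonce)
}

func main() {
	policy := "default-src 'self'; style-src 'self' 'nonce-%s'"
	fmt.Println(renderCSP(policy, "abc123")) // default-src 'self'; style-src 'self' 'nonce-abc123'
}
```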


@ -51,10 +51,11 @@ const (
defaultCleanupArchiveUnreadDays = 180
defaultCleanupArchiveBatchSize = 10000
defaultCleanupRemoveSessionsDays = 30
defaultProxyHTTPClientTimeout = 120
defaultProxyOption = "http-only"
defaultProxyMediaTypes = "image"
defaultProxyUrl = ""
defaultMediaProxyHTTPClientTimeout = 120
defaultMediaProxyMode = "http-only"
defaultMediaResourceTypes = "image"
defaultMediaProxyURL = ""
defaultFilterEntryMaxAgeDays = 0
defaultFetchOdyseeWatchTime = false
defaultFetchYouTubeWatchTime = false
defaultYouTubeEmbedUrlOverride = "https://www.youtube-nocookie.com/embed/"
@ -84,6 +85,7 @@ const (
defaultWatchdog = true
defaultInvidiousInstance = "yewtu.be"
defaultWebAuthn = false
defaultContentSecurityPolicy = "default-src 'self'; img-src * data:; media-src *; frame-src *; style-src 'self' 'nonce-%s'; require-trusted-types-for 'script'; trusted-types ttpolicy;"
)
var defaultHTTPClientUserAgent = "Mozilla/5.0 (compatible; Miniflux/" + version.Version + "; +https://miniflux.app)"
@ -135,12 +137,13 @@ type Options struct {
createAdmin bool
adminUsername string
adminPassword string
proxyHTTPClientTimeout int
proxyOption string
proxyMediaTypes []string
proxyUrl string
mediaProxyHTTPClientTimeout int
mediaProxyMode string
mediaProxyResourceTypes []string
mediaProxyCustomURL string
fetchOdyseeWatchTime bool
fetchYouTubeWatchTime bool
filterEntryMaxAgeDays int
youTubeEmbedUrlOverride string
oauth2UserCreationAllowed bool
oauth2ClientID string
@ -165,8 +168,9 @@ type Options struct {
metricsPassword string
watchdog bool
invidiousInstance string
proxyPrivateKey []byte
mediaProxyPrivateKey []byte
webAuthn bool
contentSecurityPolicy string
}
// NewOptions returns Options with default values.
@ -209,10 +213,11 @@ func NewOptions() *Options {
pollingParsingErrorLimit: defaultPollingParsingErrorLimit,
workerPoolSize: defaultWorkerPoolSize,
createAdmin: defaultCreateAdmin,
proxyHTTPClientTimeout: defaultProxyHTTPClientTimeout,
proxyOption: defaultProxyOption,
proxyMediaTypes: []string{defaultProxyMediaTypes},
proxyUrl: defaultProxyUrl,
mediaProxyHTTPClientTimeout: defaultMediaProxyHTTPClientTimeout,
mediaProxyMode: defaultMediaProxyMode,
mediaProxyResourceTypes: []string{defaultMediaResourceTypes},
mediaProxyCustomURL: defaultMediaProxyURL,
filterEntryMaxAgeDays: defaultFilterEntryMaxAgeDays,
fetchOdyseeWatchTime: defaultFetchOdyseeWatchTime,
fetchYouTubeWatchTime: defaultFetchYouTubeWatchTime,
youTubeEmbedUrlOverride: defaultYouTubeEmbedUrlOverride,
@ -239,8 +244,9 @@ func NewOptions() *Options {
metricsPassword: defaultMetricsPassword,
watchdog: defaultWatchdog,
invidiousInstance: defaultInvidiousInstance,
proxyPrivateKey: crypto.GenerateRandomBytes(16),
mediaProxyPrivateKey: crypto.GenerateRandomBytes(16),
webAuthn: defaultWebAuthn,
contentSecurityPolicy: defaultContentSecurityPolicy,
}
}
@ -489,24 +495,29 @@ func (o *Options) FetchOdyseeWatchTime() bool {
return o.fetchOdyseeWatchTime
}
// ProxyOption returns "none" to never proxy, "http-only" to proxy non-HTTPS, "all" to always proxy.
func (o *Options) ProxyOption() string {
return o.proxyOption
// MediaProxyMode returns "none" to never proxy, "http-only" to proxy non-HTTPS, "all" to always proxy.
func (o *Options) MediaProxyMode() string {
return o.mediaProxyMode
}
// ProxyMediaTypes returns a slice of media types to proxy.
func (o *Options) ProxyMediaTypes() []string {
return o.proxyMediaTypes
// MediaProxyResourceTypes returns a slice of resource types to proxy.
func (o *Options) MediaProxyResourceTypes() []string {
return o.mediaProxyResourceTypes
}
// ProxyUrl returns a string of a URL to use to proxy image requests
func (o *Options) ProxyUrl() string {
return o.proxyUrl
// MediaCustomProxyURL returns the custom proxy URL for media.
func (o *Options) MediaCustomProxyURL() string {
return o.mediaProxyCustomURL
}
// ProxyHTTPClientTimeout returns the time limit in seconds before the proxy HTTP client cancel the request.
func (o *Options) ProxyHTTPClientTimeout() int {
return o.proxyHTTPClientTimeout
// MediaProxyHTTPClientTimeout returns the time limit in seconds before the proxy HTTP client cancels the request.
func (o *Options) MediaProxyHTTPClientTimeout() int {
return o.mediaProxyHTTPClientTimeout
}
// MediaProxyPrivateKey returns the private key used by the media proxy.
func (o *Options) MediaProxyPrivateKey() []byte {
return o.mediaProxyPrivateKey
}
// HasHTTPService returns true if the HTTP service is enabled.
@ -602,16 +613,21 @@ func (o *Options) InvidiousInstance() string {
return o.invidiousInstance
}
// ProxyPrivateKey returns the private key used by the media proxy
func (o *Options) ProxyPrivateKey() []byte {
return o.proxyPrivateKey
}
// WebAuthn returns true if WebAuthn logins are supported
func (o *Options) WebAuthn() bool {
return o.webAuthn
}
// FilterEntryMaxAgeDays returns the maximum age of entries in days; older entries are filtered out.
func (o *Options) FilterEntryMaxAgeDays() int {
return o.filterEntryMaxAgeDays
}
// ContentSecurityPolicy returns the value for the Content-Security-Policy meta tag.
func (o *Options) ContentSecurityPolicy() string {
return o.contentSecurityPolicy
}
// SortedOptions returns options as a list of key value pairs, sorted by keys.
func (o *Options) SortedOptions(redactSecret bool) []*Option {
var keyValues = map[string]interface{}{
@ -637,6 +653,7 @@ func (o *Options) SortedOptions(redactSecret bool) []*Option {
"DISABLE_HSTS": !o.hsts,
"DISABLE_HTTP_SERVICE": !o.httpService,
"DISABLE_SCHEDULER_SERVICE": !o.schedulerService,
"FILTER_ENTRY_MAX_AGE_DAYS": o.filterEntryMaxAgeDays,
"FETCH_YOUTUBE_WATCH_TIME": o.fetchYouTubeWatchTime,
"FETCH_ODYSEE_WATCH_TIME": o.fetchOdyseeWatchTime,
"HTTPS": o.HTTPS,
@ -671,11 +688,11 @@ func (o *Options) SortedOptions(redactSecret bool) []*Option {
"FORCE_REFRESH_INTERVAL": o.forceRefreshInterval,
"POLLING_PARSING_ERROR_LIMIT": o.pollingParsingErrorLimit,
"POLLING_SCHEDULER": o.pollingScheduler,
"PROXY_HTTP_CLIENT_TIMEOUT": o.proxyHTTPClientTimeout,
"PROXY_MEDIA_TYPES": o.proxyMediaTypes,
"PROXY_OPTION": o.proxyOption,
"PROXY_PRIVATE_KEY": redactSecretValue(string(o.proxyPrivateKey), redactSecret),
"PROXY_URL": o.proxyUrl,
"MEDIA_PROXY_HTTP_CLIENT_TIMEOUT": o.mediaProxyHTTPClientTimeout,
"MEDIA_PROXY_RESOURCE_TYPES": o.mediaProxyResourceTypes,
"MEDIA_PROXY_MODE": o.mediaProxyMode,
"MEDIA_PROXY_PRIVATE_KEY": redactSecretValue(string(o.mediaProxyPrivateKey), redactSecret),
"MEDIA_PROXY_CUSTOM_URL": o.mediaProxyCustomURL,
"ROOT_URL": o.rootURL,
"RUN_MIGRATIONS": o.runMigrations,
"SCHEDULER_ENTRY_FREQUENCY_MAX_INTERVAL": o.schedulerEntryFrequencyMaxInterval,
@ -688,6 +705,7 @@ func (o *Options) SortedOptions(redactSecret bool) []*Option {
"WORKER_POOL_SIZE": o.workerPoolSize,
"YOUTUBE_EMBED_URL_OVERRIDE": o.youTubeEmbedUrlOverride,
"WEBAUTHN": o.webAuthn,
"CONTENT_SECURITY_POLICY": o.contentSecurityPolicy,
}
keys := make([]string, 0, len(keyValues))


@ -10,6 +10,7 @@ import (
"errors"
"fmt"
"io"
"log/slog"
"net/url"
"os"
"strconv"
@ -87,6 +88,7 @@ func (p *Parser) parseLines(lines []string) (err error) {
p.opts.logFormat = parsedValue
}
case "DEBUG":
slog.Warn("The DEBUG environment variable is deprecated, use LOG_LEVEL instead")
parsedValue := parseBool(value, defaultDebug)
if parsedValue {
p.opts.logLevel = "debug"
@ -112,6 +114,8 @@ func (p *Parser) parseLines(lines []string) (err error) {
p.opts.databaseMinConns = parseInt(value, defaultDatabaseMinConns)
case "DATABASE_CONNECTION_LIFETIME":
p.opts.databaseConnectionLifetime = parseInt(value, defaultDatabaseConnectionLifetime)
case "FILTER_ENTRY_MAX_AGE_DAYS":
p.opts.filterEntryMaxAgeDays = parseInt(value, defaultFilterEntryMaxAgeDays)
case "RUN_MIGRATIONS":
p.opts.runMigrations = parseBool(value, defaultRunMigrations)
case "DISABLE_HSTS":
@ -158,20 +162,41 @@ func (p *Parser) parseLines(lines []string) (err error) {
p.opts.schedulerRoundRobinMinInterval = parseInt(value, defaultSchedulerRoundRobinMinInterval)
case "POLLING_PARSING_ERROR_LIMIT":
p.opts.pollingParsingErrorLimit = parseInt(value, defaultPollingParsingErrorLimit)
// kept for backward compatibility
case "PROXY_IMAGES":
p.opts.proxyOption = parseString(value, defaultProxyOption)
slog.Warn("The PROXY_IMAGES environment variable is deprecated, use MEDIA_PROXY_MODE instead")
p.opts.mediaProxyMode = parseString(value, defaultMediaProxyMode)
case "PROXY_HTTP_CLIENT_TIMEOUT":
p.opts.proxyHTTPClientTimeout = parseInt(value, defaultProxyHTTPClientTimeout)
slog.Warn("The PROXY_HTTP_CLIENT_TIMEOUT environment variable is deprecated, use MEDIA_PROXY_HTTP_CLIENT_TIMEOUT instead")
p.opts.mediaProxyHTTPClientTimeout = parseInt(value, defaultMediaProxyHTTPClientTimeout)
case "MEDIA_PROXY_HTTP_CLIENT_TIMEOUT":
p.opts.mediaProxyHTTPClientTimeout = parseInt(value, defaultMediaProxyHTTPClientTimeout)
case "PROXY_OPTION":
p.opts.proxyOption = parseString(value, defaultProxyOption)
slog.Warn("The PROXY_OPTION environment variable is deprecated, use MEDIA_PROXY_MODE instead")
p.opts.mediaProxyMode = parseString(value, defaultMediaProxyMode)
case "MEDIA_PROXY_MODE":
p.opts.mediaProxyMode = parseString(value, defaultMediaProxyMode)
case "PROXY_MEDIA_TYPES":
p.opts.proxyMediaTypes = parseStringList(value, []string{defaultProxyMediaTypes})
// kept for backward compatibility
slog.Warn("The PROXY_MEDIA_TYPES environment variable is deprecated, use MEDIA_PROXY_RESOURCE_TYPES instead")
p.opts.mediaProxyResourceTypes = parseStringList(value, []string{defaultMediaResourceTypes})
case "MEDIA_PROXY_RESOURCE_TYPES":
p.opts.mediaProxyResourceTypes = parseStringList(value, []string{defaultMediaResourceTypes})
case "PROXY_IMAGE_URL":
p.opts.proxyUrl = parseString(value, defaultProxyUrl)
slog.Warn("The PROXY_IMAGE_URL environment variable is deprecated, use MEDIA_PROXY_CUSTOM_URL instead")
p.opts.mediaProxyCustomURL = parseString(value, defaultMediaProxyURL)
case "PROXY_URL":
p.opts.proxyUrl = parseString(value, defaultProxyUrl)
slog.Warn("The PROXY_URL environment variable is deprecated, use MEDIA_PROXY_CUSTOM_URL instead")
p.opts.mediaProxyCustomURL = parseString(value, defaultMediaProxyURL)
case "PROXY_PRIVATE_KEY":
slog.Warn("The PROXY_PRIVATE_KEY environment variable is deprecated, use MEDIA_PROXY_PRIVATE_KEY instead")
randomKey := make([]byte, 16)
rand.Read(randomKey)
p.opts.mediaProxyPrivateKey = parseBytes(value, randomKey)
case "MEDIA_PROXY_PRIVATE_KEY":
randomKey := make([]byte, 16)
rand.Read(randomKey)
p.opts.mediaProxyPrivateKey = parseBytes(value, randomKey)
case "MEDIA_PROXY_CUSTOM_URL":
p.opts.mediaProxyCustomURL = parseString(value, defaultMediaProxyURL)
case "CREATE_ADMIN":
p.opts.createAdmin = parseBool(value, defaultCreateAdmin)
case "ADMIN_USERNAME":
@ -244,12 +269,10 @@ func (p *Parser) parseLines(lines []string) (err error) {
p.opts.watchdog = parseBool(value, defaultWatchdog)
case "INVIDIOUS_INSTANCE":
p.opts.invidiousInstance = parseString(value, defaultInvidiousInstance)
case "PROXY_PRIVATE_KEY":
randomKey := make([]byte, 16)
rand.Read(randomKey)
p.opts.proxyPrivateKey = parseBytes(value, randomKey)
case "WEBAUTHN":
p.opts.webAuthn = parseBool(value, defaultWebAuthn)
case "CONTENT_SECURITY_POLICY":
p.opts.contentSecurityPolicy = parseString(value, defaultContentSecurityPolicy)
}
}
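Each deprecated `PROXY_*` variable in the parser above is remapped onto its `MEDIA_PROXY_*` successor with a deprecation warning. The mapping can be summarized as a lookup table; `deprecatedAliases` and `canonicalName` are hypothetical illustrations of the pattern, not part of the parser:

```go
package main

import "fmt"

// deprecatedAliases maps each deprecated PROXY_* environment variable
// onto its MEDIA_PROXY_* successor, mirroring the switch cases above.
var deprecatedAliases = map[string]string{
	"PROXY_IMAGES":              "MEDIA_PROXY_MODE",
	"PROXY_OPTION":              "MEDIA_PROXY_MODE",
	"PROXY_HTTP_CLIENT_TIMEOUT": "MEDIA_PROXY_HTTP_CLIENT_TIMEOUT",
	"PROXY_MEDIA_TYPES":         "MEDIA_PROXY_RESOURCE_TYPES",
	"PROXY_IMAGE_URL":           "MEDIA_PROXY_CUSTOM_URL",
	"PROXY_URL":                 "MEDIA_PROXY_CUSTOM_URL",
	"PROXY_PRIVATE_KEY":         "MEDIA_PROXY_PRIVATE_KEY",
}

// canonicalName returns the current name for an environment variable,
// warning when a deprecated alias is used.
func canonicalName(name string) string {
	if replacement, ok := deprecatedAliases[name]; ok {
		fmt.Printf("The %s environment variable is deprecated, use %s instead\n", name, replacement)
		return replacement
	}
	return name
}

func main() {
	fmt.Println(canonicalName("PROXY_OPTION"))
}
```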


@ -876,4 +876,16 @@ var migrations = []func(tx *sql.Tx) error{
_, err = tx.Exec(sql)
return err
},
func(tx *sql.Tx) (err error) {
// The WHERE clause speeds up the query considerably
sql := `UPDATE entries SET tags = array_remove(tags, '') WHERE '' = ANY(tags);`
_, err = tx.Exec(sql)
return err
},
func(tx *sql.Tx) (err error) {
// Entry URLs can exceed the btree maximum size
// Entry existence checks now use the entries_feed_id_status_hash_idx index
_, err = tx.Exec(`DROP INDEX entries_feed_url_idx`)
return err
},
}

View File

@ -13,8 +13,8 @@ import (
"miniflux.app/v2/internal/http/request"
"miniflux.app/v2/internal/http/response/json"
"miniflux.app/v2/internal/integration"
"miniflux.app/v2/internal/mediaproxy"
"miniflux.app/v2/internal/model"
"miniflux.app/v2/internal/proxy"
"miniflux.app/v2/internal/storage"
"github.com/gorilla/mux"
@ -324,7 +324,7 @@ func (h *handler) handleItems(w http.ResponseWriter, r *http.Request) {
FeedID: entry.FeedID,
Title: entry.Title,
Author: entry.Author,
HTML: proxy.AbsoluteProxyRewriter(h.router, r.Host, entry.Content),
HTML: mediaproxy.RewriteDocumentWithAbsoluteProxyURL(h.router, r.Host, entry.Content),
URL: entry.URL,
IsSaved: isSaved,
IsRead: isRead,

View File

@ -18,8 +18,8 @@ import (
"miniflux.app/v2/internal/http/response/json"
"miniflux.app/v2/internal/http/route"
"miniflux.app/v2/internal/integration"
"miniflux.app/v2/internal/mediaproxy"
"miniflux.app/v2/internal/model"
"miniflux.app/v2/internal/proxy"
"miniflux.app/v2/internal/reader/fetcher"
mff "miniflux.app/v2/internal/reader/handler"
mfs "miniflux.app/v2/internal/reader/subscription"
@ -766,7 +766,7 @@ func subscribe(newFeed Stream, category Stream, title string, store *storage.Sto
}
created, localizedError := mff.CreateFeed(store, userID, &feedRequest)
if err != nil {
if localizedError != nil {
return nil, localizedError.Error()
}
@ -1003,14 +1003,14 @@ func (h *handler) streamItemContentsHandler(w http.ResponseWriter, r *http.Reque
categories = append(categories, userStarred)
}
entry.Content = proxy.AbsoluteProxyRewriter(h.router, r.Host, entry.Content)
proxyOption := config.Opts.ProxyOption()
entry.Content = mediaproxy.RewriteDocumentWithAbsoluteProxyURL(h.router, r.Host, entry.Content)
proxyOption := config.Opts.MediaProxyMode()
for i := range entry.Enclosures {
if proxyOption == "all" || proxyOption != "none" && !urllib.IsHTTPS(entry.Enclosures[i].URL) {
for _, mediaType := range config.Opts.ProxyMediaTypes() {
for _, mediaType := range config.Opts.MediaProxyResourceTypes() {
if strings.HasPrefix(entry.Enclosures[i].MimeType, mediaType+"/") {
entry.Enclosures[i].URL = proxy.AbsoluteProxifyURL(h.router, r.Host, entry.Enclosures[i].URL)
entry.Enclosures[i].URL = mediaproxy.ProxifyAbsoluteURL(h.router, r.Host, entry.Enclosures[i].URL)
break
}
}

View File

@ -37,14 +37,10 @@ const (
func WebAuthnSessionData(r *http.Request) *model.WebAuthnSession {
if v := r.Context().Value(WebAuthnDataContextKey); v != nil {
value, valid := v.(model.WebAuthnSession)
if !valid {
return nil
if value, valid := v.(model.WebAuthnSession); valid {
return &value
}
return &value
}
return nil
}
@ -151,39 +147,27 @@ func ClientIP(r *http.Request) string {
func getContextStringValue(r *http.Request, key ContextKey) string {
if v := r.Context().Value(key); v != nil {
value, valid := v.(string)
if !valid {
return ""
if value, valid := v.(string); valid {
return value
}
return value
}
return ""
}
func getContextBoolValue(r *http.Request, key ContextKey) bool {
if v := r.Context().Value(key); v != nil {
value, valid := v.(bool)
if !valid {
return false
if value, valid := v.(bool); valid {
return value
}
return value
}
return false
}
func getContextInt64Value(r *http.Request, key ContextKey) int64 {
if v := r.Context().Value(key); v != nil {
value, valid := v.(int64)
if !valid {
return 0
if value, valid := v.(int64); valid {
return value
}
return value
}
return 0
}

View File

@ -12,6 +12,8 @@ import (
"net/http"
"strings"
"time"
"github.com/andybalholm/brotli"
)
const compressionThreshold = 1024
@ -96,7 +98,6 @@ func (b *Builder) Write() {
}
func (b *Builder) writeHeaders() {
b.headers["X-XSS-Protection"] = "1; mode=block"
b.headers["X-Content-Type-Options"] = "nosniff"
b.headers["X-Frame-Options"] = "DENY"
b.headers["Referrer-Policy"] = "no-referrer"
@ -111,8 +112,15 @@ func (b *Builder) writeHeaders() {
func (b *Builder) compress(data []byte) {
if b.enableCompression && len(data) > compressionThreshold {
acceptEncoding := b.r.Header.Get("Accept-Encoding")
switch {
case strings.Contains(acceptEncoding, "br"):
b.headers["Content-Encoding"] = "br"
b.writeHeaders()
brotliWriter := brotli.NewWriterV2(b.w, brotli.DefaultCompression)
defer brotliWriter.Close()
brotliWriter.Write(data)
return
case strings.Contains(acceptEncoding, "gzip"):
b.headers["Content-Encoding"] = "gzip"
b.writeHeaders()

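The `switch` above prefers brotli over gzip simply because its case is listed first, and only bodies larger than `compressionThreshold` (1024 bytes) are compressed at all. A stdlib-only sketch of both ideas (the brotli writer itself comes from the third-party `github.com/andybalholm/brotli` package, so it is omitted here; like the original, this ignores `Accept-Encoding` q-values):

```go
package main

import (
	"bytes"
	"compress/gzip"
	"fmt"
	"strings"
)

// negotiateEncoding picks the first supported encoding advertised by the
// client, preferring brotli over gzip, as in the builder's switch.
func negotiateEncoding(acceptEncoding string) string {
	switch {
	case strings.Contains(acceptEncoding, "br"):
		return "br"
	case strings.Contains(acceptEncoding, "gzip"):
		return "gzip"
	}
	return ""
}

func main() {
	fmt.Println(negotiateEncoding("gzip, deflate, br")) // br
	fmt.Println(negotiateEncoding("gzip, deflate"))     // gzip

	// Compressing tiny payloads is wasted work: gzip framing alone adds
	// bytes, which is why the builder applies a size threshold.
	var buf bytes.Buffer
	w := gzip.NewWriter(&buf)
	w.Write([]byte("tiny"))
	w.Close()
	fmt.Println(buf.Len() > len("tiny")) // true: output larger than input
}
```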
View File

@ -28,7 +28,6 @@ func TestResponseHasCommonHeaders(t *testing.T) {
resp := w.Result()
headers := map[string]string{
"X-XSS-Protection": "1; mode=block",
"X-Content-Type-Options": "nosniff",
"X-Frame-Options": "DENY",
}
@ -229,7 +228,7 @@ func TestBuildResponseWithCachingAndEtag(t *testing.T) {
}
}
func TestBuildResponseWithGzipCompression(t *testing.T) {
func TestBuildResponseWithBrotliCompression(t *testing.T) {
body := strings.Repeat("a", compressionThreshold+1)
r, err := http.NewRequest("GET", "/", nil)
r.Header.Set("Accept-Encoding", "gzip, deflate, br")
@ -246,6 +245,30 @@ func TestBuildResponseWithGzipCompression(t *testing.T) {
handler.ServeHTTP(w, r)
resp := w.Result()
expected := "br"
actual := resp.Header.Get("Content-Encoding")
if actual != expected {
t.Fatalf(`Unexpected header value, got %q instead of %q`, actual, expected)
}
}
func TestBuildResponseWithGzipCompression(t *testing.T) {
body := strings.Repeat("a", compressionThreshold+1)
r, err := http.NewRequest("GET", "/", nil)
r.Header.Set("Accept-Encoding", "gzip, deflate")
if err != nil {
t.Fatal(err)
}
w := httptest.NewRecorder()
handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
New(w, r).WithBody(body).Write()
})
handler.ServeHTTP(w, r)
resp := w.Result()
expected := "gzip"
actual := resp.Header.Get("Content-Encoding")
if actual != expected {

View File

@ -10,7 +10,7 @@ import (
"miniflux.app/v2/internal/model"
)
// PushEntry pushes entries to matrix chat using integration settings provided
// PushEntries pushes entries to matrix chat using integration settings provided
func PushEntries(feed *model.Feed, entries model.Entries, matrixBaseURL, matrixUsername, matrixPassword, matrixRoomID string) error {
client := NewClient(matrixBaseURL)
discovery, err := client.DiscoverEndpoints()

View File

@ -1,7 +1,7 @@
// SPDX-FileCopyrightText: Copyright The Miniflux Authors. All rights reserved.
// SPDX-License-Identifier: Apache-2.0
package rssbridge // import "miniflux.app/integration/rssbridge"
package rssbridge // import "miniflux.app/v2/internal/integration/rssbridge"
import (
"encoding/json"

View File

@ -57,6 +57,7 @@ func (c *Client) SendSaveEntryWebhookEvent(entry *model.Entry) error {
ID: entry.Feed.ID,
UserID: entry.Feed.UserID,
CategoryID: entry.Feed.Category.ID,
Category: &WebhookCategory{ID: entry.Feed.Category.ID, Title: entry.Feed.Category.Title},
FeedURL: entry.Feed.FeedURL,
SiteURL: entry.Feed.SiteURL,
Title: entry.Feed.Title,
@ -94,13 +95,13 @@ func (c *Client) SendNewEntriesWebhookEvent(feed *model.Feed, entries model.Entr
Tags: entry.Tags,
})
}
return c.makeRequest(NewEntriesEventType, &WebhookNewEntriesEvent{
EventType: NewEntriesEventType,
Feed: &WebhookFeed{
ID: feed.ID,
UserID: feed.UserID,
CategoryID: feed.Category.ID,
Category: &WebhookCategory{ID: feed.Category.ID, Title: feed.Category.Title},
FeedURL: feed.FeedURL,
SiteURL: feed.SiteURL,
Title: feed.Title,
@ -145,13 +146,19 @@ func (c *Client) makeRequest(eventType string, payload any) error {
}
type WebhookFeed struct {
ID int64 `json:"id"`
UserID int64 `json:"user_id"`
CategoryID int64 `json:"category_id"`
FeedURL string `json:"feed_url"`
SiteURL string `json:"site_url"`
Title string `json:"title"`
CheckedAt time.Time `json:"checked_at"`
ID int64 `json:"id"`
UserID int64 `json:"user_id"`
CategoryID int64 `json:"category_id"`
Category *WebhookCategory `json:"category,omitempty"`
FeedURL string `json:"feed_url"`
SiteURL string `json:"site_url"`
Title string `json:"title"`
CheckedAt time.Time `json:"checked_at"`
}
type WebhookCategory struct {
ID int64 `json:"id"`
Title string `json:"title"`
}
type WebhookEntry struct {

View File

@ -178,6 +178,8 @@
"page.keyboard_shortcuts.go_to_feed": "Zum Abonnement gehen",
"page.keyboard_shortcuts.go_to_previous_page": "Zur vorherigen Seite gehen",
"page.keyboard_shortcuts.go_to_next_page": "Zur nächsten Seite gehen",
"page.keyboard_shortcuts.go_to_bottom_item": "Gehen Sie zum untersten Element",
"page.keyboard_shortcuts.go_to_top_item": "Zum obersten Artikel gehen",
"page.keyboard_shortcuts.open_item": "Gewählten Artikel öffnen",
"page.keyboard_shortcuts.open_original": "Original-Artikel öffnen",
"page.keyboard_shortcuts.open_original_same_window": "Öffne den Original-Link in der aktuellen Registerkarte",
@ -256,6 +258,7 @@
"alert.no_bookmark": "Es existiert derzeit kein Lesezeichen.",
"alert.no_category": "Es ist keine Kategorie vorhanden.",
"alert.no_category_entry": "Es befindet sich kein Artikel in dieser Kategorie.",
"alert.no_tag_entry": "Es gibt keine Artikel, die diesem Tag entsprechen.",
"alert.no_feed_entry": "Es existiert kein Artikel für dieses Abonnement.",
"alert.no_feed": "Es sind keine Abonnements vorhanden.",
"alert.no_feed_in_category": "Für diese Kategorie gibt es kein Abonnement.",

View File

@ -178,6 +178,8 @@
"page.keyboard_shortcuts.go_to_feed": "Πηγαίνετε στη ροή",
"page.keyboard_shortcuts.go_to_previous_page": "Μετάβαση στην προηγούμενη σελίδα",
"page.keyboard_shortcuts.go_to_next_page": "Μετάβαση στην επόμενη σελίδα",
"page.keyboard_shortcuts.go_to_bottom_item": "Μετάβαση στο κάτω στοιχείο",
"page.keyboard_shortcuts.go_to_top_item": "Μετάβαση στο επάνω στοιχείο",
"page.keyboard_shortcuts.open_item": "Άνοιγμα επιλεγμένου στοιχείου",
"page.keyboard_shortcuts.open_original": "Άνοιγμα αρχικού συνδέσμου",
"page.keyboard_shortcuts.open_original_same_window": "Άνοιγμα αρχικού συνδέσμου στην τρέχουσα καρτέλα",
@ -256,6 +258,7 @@
"alert.no_bookmark": "Δεν υπάρχει σελιδοδείκτης αυτή τη στιγμή.",
"alert.no_category": "Δεν υπάρχει κατηγορία.",
"alert.no_category_entry": "Δεν υπάρχουν άρθρα σε αυτήν την κατηγορία.",
"alert.no_tag_entry": "Δεν υπάρχουν αντικείμενα που να ταιριάζουν με αυτή την ετικέτα.",
"alert.no_feed_entry": "Δεν υπάρχουν άρθρα για αυτήν τη ροή.",
"alert.no_feed": "Δεν έχετε συνδρομές.",
"alert.no_feed_in_category": "Δεν υπάρχει συνδρομή για αυτήν την κατηγορία.",

View File

@ -176,6 +176,8 @@
"page.keyboard_shortcuts.go_to_previous_item": "Go to previous item",
"page.keyboard_shortcuts.go_to_next_item": "Go to next item",
"page.keyboard_shortcuts.go_to_feed": "Go to feed",
"page.keyboard_shortcuts.go_to_top_item": "Go to top item",
"page.keyboard_shortcuts.go_to_bottom_item": "Go to bottom item",
"page.keyboard_shortcuts.go_to_previous_page": "Go to previous page",
"page.keyboard_shortcuts.go_to_next_page": "Go to next page",
"page.keyboard_shortcuts.open_item": "Open selected item",
@ -215,7 +217,7 @@
"page.settings.webauthn.last_seen_on": "Last Used",
"page.settings.webauthn.register": "Register passkey",
"page.settings.webauthn.register.error": "Unable to register passkey",
"page.settings.webauthn.delete" : [
"page.settings.webauthn.delete": [
"Remove %d passkey",
"Remove %d passkeys"
],
@ -256,6 +258,7 @@
"alert.no_bookmark": "There are no starred entries.",
"alert.no_category": "There is no category.",
"alert.no_category_entry": "There are no entries in this category.",
"alert.no_tag_entry": "There are no entries matching this tag.",
"alert.no_feed_entry": "There are no entries for this feed.",
"alert.no_feed": "You don't have any feeds.",
"alert.no_feed_in_category": "There is no feed for this category.",

View File

@ -178,6 +178,8 @@
"page.keyboard_shortcuts.go_to_feed": "Ir a la fuente",
"page.keyboard_shortcuts.go_to_previous_page": "Ir al página anterior",
"page.keyboard_shortcuts.go_to_next_page": "Ir al página siguiente",
"page.keyboard_shortcuts.go_to_bottom_item": "Ir al elemento inferior",
"page.keyboard_shortcuts.go_to_top_item": "Ir al elemento superior",
"page.keyboard_shortcuts.open_item": "Abrir el elemento seleccionado",
"page.keyboard_shortcuts.open_original": "Abrir el enlace original",
"page.keyboard_shortcuts.open_original_same_window": "Abrir enlace original en la pestaña actual",
@ -256,6 +258,7 @@
"alert.no_bookmark": "No hay marcador en este momento.",
"alert.no_category": "No hay categoría.",
"alert.no_category_entry": "No hay artículos en esta categoría.",
"alert.no_tag_entry": "No hay artículos con esta etiqueta.",
"alert.no_feed_entry": "No hay artículos para esta fuente.",
"alert.no_feed": "No tienes fuentes.",
"alert.no_feed_in_category": "No hay fuentes para esta categoría.",

View File

@ -178,6 +178,8 @@
"page.keyboard_shortcuts.go_to_feed": "Siirry syötteeseen",
"page.keyboard_shortcuts.go_to_previous_page": "Siirry edelliselle sivulle",
"page.keyboard_shortcuts.go_to_next_page": "Siirry seuraavalle sivulle",
"page.keyboard_shortcuts.go_to_bottom_item": "Siirry alimpaan kohtaan",
"page.keyboard_shortcuts.go_to_top_item": "Siirry alkuun",
"page.keyboard_shortcuts.open_item": "Avaa valittu kohde",
"page.keyboard_shortcuts.open_original": "Avaa alkuperäinen linkki",
"page.keyboard_shortcuts.open_original_same_window": "Avaa alkuperäinen linkki nykyisessä välilehdessä",
@ -256,6 +258,7 @@
"alert.no_bookmark": "Tällä hetkellä ei ole kirjanmerkkiä.",
"alert.no_category": "Ei ole kategoriaa.",
"alert.no_category_entry": "Tässä kategoriassa ei ole artikkeleita.",
"alert.no_tag_entry": "Tätä tunnistetta vastaavia merkintöjä ei ole.",
"alert.no_feed_entry": "Tässä syötteessä ei ole artikkeleita.",
"alert.no_feed": "Sinulla ei ole tilauksia.",
"alert.no_feed_in_category": "Tälle kategorialle ei ole tilausta.",

View File

@ -178,6 +178,8 @@
"page.keyboard_shortcuts.go_to_feed": "Voir abonnement",
"page.keyboard_shortcuts.go_to_previous_page": "Page précédente",
"page.keyboard_shortcuts.go_to_next_page": "Page suivante",
"page.keyboard_shortcuts.go_to_bottom_item": "Aller à l'élément du bas",
"page.keyboard_shortcuts.go_to_top_item": "Aller à l'élément supérieur",
"page.keyboard_shortcuts.open_item": "Ouvrir élément sélectionné",
"page.keyboard_shortcuts.open_original": "Ouvrir le lien original",
"page.keyboard_shortcuts.open_original_same_window": "Ouvrir le lien original dans l'onglet en cours",
@ -215,7 +217,7 @@
"page.settings.webauthn.last_seen_on": "Dernière utilisation",
"page.settings.webauthn.register": "Enregistrer une nouvelle clé d'accès",
"page.settings.webauthn.register.error": "Impossible d'enregistrer la clé d'accès",
"page.settings.webauthn.delete" : [
"page.settings.webauthn.delete": [
"Supprimer %d clé d'accès",
"Supprimer %d clés d'accès"
],
@ -256,6 +258,7 @@
"alert.no_bookmark": "Il n'y a aucun favoris pour le moment.",
"alert.no_category": "Il n'y a aucune catégorie.",
"alert.no_category_entry": "Il n'y a aucun article dans cette catégorie.",
"alert.no_tag_entry": "Il n'y a aucun article correspondant à ce tag.",
"alert.no_feed_entry": "Il n'y a aucun article pour cet abonnement.",
"alert.no_feed": "Vous n'avez aucun abonnement.",
"alert.no_feed_in_category": "Il n'y a pas d'abonnement pour cette catégorie.",

View File

@ -178,6 +178,8 @@
"page.keyboard_shortcuts.go_to_feed": "फ़ीड पर जाएं",
"page.keyboard_shortcuts.go_to_previous_page": "पिछले पृष्ठ पर जाएं",
"page.keyboard_shortcuts.go_to_next_page": "अगले पेज पर जाएं",
"page.keyboard_shortcuts.go_to_bottom_item": "निचले आइटम पर जाएँ",
"page.keyboard_shortcuts.go_to_top_item": "शीर्ष आइटम पर जाएँ",
"page.keyboard_shortcuts.open_item": "चयनित आइटम खोलें",
"page.keyboard_shortcuts.open_original": "मूल लिंक खोलें",
"page.keyboard_shortcuts.open_original_same_window": "वर्तमान टैब में मूल लिंक खोलें",
@ -256,6 +258,7 @@
"alert.no_bookmark": "इस समय कोई बुकमार्क नहीं है",
"alert.no_category": "कोई श्रेणी नहीं है।",
"alert.no_category_entry": "इस श्रेणी में कोई विषय-वस्तु नहीं है।",
"alert.no_tag_entry": "इस टैग से मेल खाती कोई प्रविष्टियाँ नहीं हैं।",
"alert.no_feed_entry": "इस फ़ीड के लिए कोई विषय-वस्तु नहीं है।",
"alert.no_feed": "आपके पास कोई सदस्यता नहीं है।",
"alert.no_feed_in_category": "इस श्रेणी के लिए कोई सदस्यता नहीं है।",

View File

@ -169,6 +169,8 @@
"page.keyboard_shortcuts.go_to_feed": "Ke umpan",
"page.keyboard_shortcuts.go_to_previous_page": "Ke halaman sebelumnya",
"page.keyboard_shortcuts.go_to_next_page": "Ke halaman berikutnya",
"page.keyboard_shortcuts.go_to_bottom_item": "Pergi ke item paling bawah",
"page.keyboard_shortcuts.go_to_top_item": "Pergi ke item teratas",
"page.keyboard_shortcuts.open_item": "Buka entri yang dipilih",
"page.keyboard_shortcuts.open_original": "Buka tautan asli",
"page.keyboard_shortcuts.open_original_same_window": "Buka tautan asli di bilah saat ini",
@ -246,6 +248,7 @@
"alert.no_bookmark": "Tidak ada markah.",
"alert.no_category": "Tidak ada kategori.",
"alert.no_category_entry": "Tidak ada artikel di kategori ini.",
"alert.no_tag_entry": "Tidak ada entri yang cocok dengan tag ini.",
"alert.no_feed_entry": "Tidak ada artikel di umpan ini.",
"alert.no_feed": "Anda tidak memiliki langganan.",
"alert.no_feed_in_category": "Tidak ada langganan untuk kategori ini.",

View File

@ -178,6 +178,8 @@
"page.keyboard_shortcuts.go_to_feed": "Mostra il feed",
"page.keyboard_shortcuts.go_to_previous_page": "Mostra la pagina precedente",
"page.keyboard_shortcuts.go_to_next_page": "Mostra la pagina successiva",
"page.keyboard_shortcuts.go_to_bottom_item": "Vai all'elemento in fondo",
"page.keyboard_shortcuts.go_to_top_item": "Vai all'elemento principale",
"page.keyboard_shortcuts.open_item": "Apri l'articolo selezionato",
"page.keyboard_shortcuts.open_original": "Apri la pagina web originale",
"page.keyboard_shortcuts.open_original_same_window": "Apri il link originale nella scheda corrente",
@ -256,6 +258,7 @@
"alert.no_bookmark": "Nessun preferito disponibile.",
"alert.no_category": "Nessuna categoria disponibile.",
"alert.no_category_entry": "Questa categoria non contiene alcun articolo.",
"alert.no_tag_entry": "Non ci sono voci corrispondenti a questo tag.",
"alert.no_feed_entry": "Questo feed non contiene alcun articolo.",
"alert.no_feed": "Nessun feed disponibile.",
"alert.no_feed_in_category": "Non esiste un abbonamento per questa categoria.",

View File

@ -169,6 +169,8 @@
"page.keyboard_shortcuts.go_to_feed": "フィード",
"page.keyboard_shortcuts.go_to_previous_page": "前のページ",
"page.keyboard_shortcuts.go_to_next_page": "次のページ",
"page.keyboard_shortcuts.go_to_bottom_item": "一番下の項目に移動",
"page.keyboard_shortcuts.go_to_top_item": "先頭の項目に移動",
"page.keyboard_shortcuts.open_item": "選択されたアイテムを開く",
"page.keyboard_shortcuts.open_original": "オリジナルのリンクを開く",
"page.keyboard_shortcuts.open_original_same_window": "現在のタブでオリジナルのリンクを開く",
@ -246,6 +248,7 @@
"alert.no_bookmark": "現在星付きはありません。",
"alert.no_category": "カテゴリが存在しません。",
"alert.no_category_entry": "このカテゴリには記事がありません。",
"alert.no_tag_entry": "このタグに一致するエントリーはありません。",
"alert.no_feed_entry": "このフィードには記事がありません。",
"alert.no_feed": "何も購読していません。",
"alert.no_feed_in_category": "このカテゴリには購読中のフィードがありません。",

View File

@ -179,6 +179,8 @@
"page.keyboard_shortcuts.go_to_feed": "Ga naar feed",
"page.keyboard_shortcuts.go_to_previous_page": "Vorige pagina",
"page.keyboard_shortcuts.go_to_next_page": "Volgende pagina",
"page.keyboard_shortcuts.go_to_bottom_item": "Ga naar het onderste item",
"page.keyboard_shortcuts.go_to_top_item": "Ga naar het bovenste item",
"page.keyboard_shortcuts.open_item": "Open geselecteerde link",
"page.keyboard_shortcuts.open_original": "Open originele link",
"page.keyboard_shortcuts.open_original_same_window": "Oorspronkelijke koppeling op huidig tabblad openen",
@ -256,6 +258,7 @@
"alert.no_bookmark": "Er zijn op dit moment geen favorieten.",
"alert.no_category": "Er zijn geen categorieën.",
"alert.no_category_entry": "Deze categorie bevat geen feeds.",
"alert.no_tag_entry": "Er zijn geen items die overeenkomen met deze tag.",
"alert.no_feed_entry": "Er zijn geen artikelen in deze feed.",
"alert.no_feed": "Je hebt nog geen feeds geabonneerd staan.",
"alert.no_feed_in_category": "Er is geen abonnement voor deze categorie.",

View File

@ -187,6 +187,8 @@
"page.keyboard_shortcuts.go_to_feed": "Przejdź do subskrypcji",
"page.keyboard_shortcuts.go_to_previous_page": "Przejdź do poprzedniej strony",
"page.keyboard_shortcuts.go_to_next_page": "Przejdź do następnej strony",
"page.keyboard_shortcuts.go_to_bottom_item": "Przejdź do dolnego elementu",
"page.keyboard_shortcuts.go_to_top_item": "Przejdź do najwyższego elementu",
"page.keyboard_shortcuts.open_item": "Otwórz zaznaczony artykuł",
"page.keyboard_shortcuts.open_original": "Otwórz oryginalny artykuł",
"page.keyboard_shortcuts.open_original_same_window": "Otwórz oryginalny link w bieżącej karcie",
@ -266,6 +268,7 @@
"alert.no_bookmark": "Obecnie nie ma żadnych zakładek.",
"alert.no_category": "Nie ma żadnej kategorii!",
"alert.no_category_entry": "W tej kategorii nie ma żadnych artykułów",
"alert.no_tag_entry": "Nie ma wpisów pasujących do tego tagu.",
"alert.no_feed_entry": "Nie ma artykułu dla tego kanału.",
"alert.no_feed": "Nie masz żadnej subskrypcji.",
"alert.no_feed_in_category": "Nie ma subskrypcji dla tej kategorii.",

View File

@ -178,6 +178,8 @@
"page.keyboard_shortcuts.go_to_feed": "Ir a fonte",
"page.keyboard_shortcuts.go_to_previous_page": "Ir a página anterior",
"page.keyboard_shortcuts.go_to_next_page": "Ir a página seguinte",
"page.keyboard_shortcuts.go_to_bottom_item": "Ir para o item inferior",
"page.keyboard_shortcuts.go_to_top_item": "Ir para o item superior",
"page.keyboard_shortcuts.open_item": "Abrir o item selecionado",
"page.keyboard_shortcuts.open_original": "Abrir o conteúdo original",
"page.keyboard_shortcuts.open_original_same_window": "Abrir o conteúdo original na janela atual",
@ -256,6 +258,7 @@
"alert.no_bookmark": "Não há favorito neste momento.",
"alert.no_category": "Não há categoria.",
"alert.no_category_entry": "Não há itens nesta categoria.",
"alert.no_tag_entry": "Não há itens que correspondam a esta etiqueta.",
"alert.no_feed_entry": "Não há itens nessa fonte.",
"alert.no_feed": "Não há inscrições.",
"alert.no_feed_in_category": "Não há inscrições nessa categoria.",

View File

@ -187,6 +187,8 @@
"page.keyboard_shortcuts.go_to_feed": "Перейти к подписке",
"page.keyboard_shortcuts.go_to_previous_page": "Перейти к предыдущей странице",
"page.keyboard_shortcuts.go_to_next_page": "Перейти к следующей странице",
"page.keyboard_shortcuts.go_to_bottom_item": "Перейти к нижнему элементу",
"page.keyboard_shortcuts.go_to_top_item": "Перейти к верхнему элементу",
"page.keyboard_shortcuts.open_item": "Открыть выбранный элемент",
"page.keyboard_shortcuts.open_original_same_window": "Открыть оригинальную ссылку в текущей вкладке",
"page.keyboard_shortcuts.open_original": "Открыть оригинальную ссылку",
@ -266,6 +268,7 @@
"alert.no_bookmark": "Избранное отсутствует.",
"alert.no_category": "Категории отсутствуют.",
"alert.no_category_entry": "В этой категории нет статей.",
"alert.no_tag_entry": "Нет записей, соответствующих этому тегу.",
"alert.no_feed_entry": "В этой подписке отсутствуют статьи.",
"alert.no_feed": "У вас нет ни одной подписки.",
"alert.no_feed_in_category": "Для этой категории нет подписки.",

File diff suppressed because it is too large

View File

@ -187,6 +187,8 @@
"page.keyboard_shortcuts.go_to_feed": "Перейти до стрічки",
"page.keyboard_shortcuts.go_to_previous_page": "Перейти до попередньої сторінки",
"page.keyboard_shortcuts.go_to_next_page": "Перейти до наступної сторінки",
"page.keyboard_shortcuts.go_to_bottom_item": "Перейти до нижнього пункту",
"page.keyboard_shortcuts.go_to_top_item": "Перейти до верхнього пункту",
"page.keyboard_shortcuts.open_item": "Відкрити виділений запис",
"page.keyboard_shortcuts.open_original": "Відкрити оригінальне посилання",
"page.keyboard_shortcuts.open_original_same_window": "Відкрити оригінальне посилання в поточній вкладці",
@ -266,6 +268,7 @@
"alert.no_bookmark": "Наразі закладки відсутні.",
"alert.no_category": "Немає категорії.",
"alert.no_category_entry": "У цій категорії немає записів.",
"alert.no_tag_entry": "Немає записів, що відповідають цьому тегу.",
"alert.no_feed_entry": "У цій стрічці немає записів.",
"alert.no_feed": "У вас немає підписок.",
"alert.no_feed_in_category": "У цій категорії немає підписок.",

View File

@ -169,6 +169,8 @@
"page.keyboard_shortcuts.go_to_feed": "转到源页面",
"page.keyboard_shortcuts.go_to_previous_page": "上一页",
"page.keyboard_shortcuts.go_to_next_page": "下一页",
"page.keyboard_shortcuts.go_to_bottom_item": "转到底部项目",
"page.keyboard_shortcuts.go_to_top_item": "转到顶部项目",
"page.keyboard_shortcuts.open_item": "打开选定的文章",
"page.keyboard_shortcuts.open_original": "打开原始链接",
"page.keyboard_shortcuts.open_original_same_window": "在当前标签页中打开原始链接",
@ -246,6 +248,7 @@
"alert.no_bookmark": "目前没有收藏",
"alert.no_category": "目前没有分类",
"alert.no_category_entry": "该分类下没有文章",
"alert.no_tag_entry": "没有与此标签匹配的条目。",
"alert.no_feed_entry": "该源中没有文章",
"alert.no_feed": "目前没有源",
"alert.no_history": "目前没有历史",
@ -406,9 +409,9 @@
"form.integration.omnivore_activate": "保存文章到 Omnivore",
"form.integration.omnivore_url": "Omnivore API 端点",
"form.integration.omnivore_api_key": "Omnivore API 密钥",
"form.integration.espial_activate": "保存文章到 Espial",
"form.integration.espial_endpoint": "Espial API 端点",
"form.integration.espial_api_key": "Espial API 密钥",
"form.integration.espial_activate": "保存文章到 Espial",
"form.integration.espial_endpoint": "Espial API 端点",
"form.integration.espial_api_key": "Espial API 密钥",
"form.integration.espial_tags": "Espial 标签",
"form.integration.readwise_activate": "保存文章到 Readwise Reader",
"form.integration.readwise_api_key": "Readwise Reader Access Token",

View File

@ -169,6 +169,8 @@
"page.keyboard_shortcuts.go_to_feed": "轉到Feed頁面",
"page.keyboard_shortcuts.go_to_previous_page": "上一頁",
"page.keyboard_shortcuts.go_to_next_page": "下一頁",
"page.keyboard_shortcuts.go_to_bottom_item": "轉到底部項目",
"page.keyboard_shortcuts.go_to_top_item": "轉到頂部項目",
"page.keyboard_shortcuts.open_item": "開啟選定的文章",
"page.keyboard_shortcuts.open_original": "開啟原始連結",
"page.keyboard_shortcuts.open_original_same_window": "在當前標籤頁中開啟原始連結",
@ -246,6 +248,7 @@
"alert.no_bookmark": "目前沒有收藏",
"alert.no_category": "目前沒有分類",
"alert.no_category_entry": "該分類下沒有文章",
"alert.no_tag_entry": "沒有與此標籤相符的條目。",
"alert.no_feed_entry": "該Feed中沒有文章",
"alert.no_feed": "目前沒有Feed",
"alert.no_history": "目前沒有歷史",

View File

@ -1,7 +1,7 @@
// SPDX-FileCopyrightText: Copyright The Miniflux Authors. All rights reserved.
// SPDX-License-Identifier: Apache-2.0
package proxy // import "miniflux.app/v2/internal/proxy"
package mediaproxy // import "miniflux.app/v2/internal/mediaproxy"
import (
"net/http"
@ -29,11 +29,11 @@ func TestProxyFilterWithHttpDefault(t *testing.T) {
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="http://website/folder/image.png" alt="Test"/></p>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
expected := `<p><img src="/proxy/okK5PsdNY8F082UMQEAbLPeUFfbe2WnNfInNmR9T4WA=/aHR0cDovL3dlYnNpdGUvZm9sZGVyL2ltYWdlLnBuZw==" alt="Test"/></p>`
if expected != output {
t.Errorf(`Not expected output: got "%s" instead of "%s"`, output, expected)
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
@ -53,11 +53,11 @@ func TestProxyFilterWithHttpsDefault(t *testing.T) {
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="https://website/folder/image.png" alt="Test"/></p>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
expected := `<p><img src="https://website/folder/image.png" alt="Test"/></p>`
if expected != output {
t.Errorf(`Not expected output: got "%s" instead of "%s"`, output, expected)
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
@ -76,11 +76,11 @@ func TestProxyFilterWithHttpNever(t *testing.T) {
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="http://website/folder/image.png" alt="Test"/></p>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
expected := input
if expected != output {
t.Errorf(`Not expected output: got "%s" instead of "%s"`, output, expected)
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
@ -99,11 +99,11 @@ func TestProxyFilterWithHttpsNever(t *testing.T) {
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="https://website/folder/image.png" alt="Test"/></p>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
expected := input
if expected != output {
t.Errorf(`Not expected output: got "%s" instead of "%s"`, output, expected)
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
@ -124,11 +124,11 @@ func TestProxyFilterWithHttpAlways(t *testing.T) {
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="http://website/folder/image.png" alt="Test"/></p>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
expected := `<p><img src="/proxy/okK5PsdNY8F082UMQEAbLPeUFfbe2WnNfInNmR9T4WA=/aHR0cDovL3dlYnNpdGUvZm9sZGVyL2ltYWdlLnBuZw==" alt="Test"/></p>`
if expected != output {
t.Errorf(`Not expected output: got "%s" instead of "%s"`, output, expected)
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
@ -149,11 +149,87 @@ func TestProxyFilterWithHttpsAlways(t *testing.T) {
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="https://website/folder/image.png" alt="Test"/></p>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
expected := `<p><img src="/proxy/LdPNR1GBDigeeNp2ArUQRyZsVqT_PWLfHGjYFrrWWIY=/aHR0cHM6Ly93ZWJzaXRlL2ZvbGRlci9pbWFnZS5wbmc=" alt="Test"/></p>`
if expected != output {
t.Errorf(`Not expected output: got "%s" instead of "%s"`, output, expected)
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
func TestAbsoluteProxyFilterWithHttpsAlways(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_OPTION", "all")
os.Setenv("PROXY_MEDIA_TYPES", "image")
os.Setenv("PROXY_PRIVATE_KEY", "test")
var err error
parser := config.NewParser()
config.Opts, err = parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
r := mux.NewRouter()
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="https://website/folder/image.png" alt="Test"/></p>`
output := RewriteDocumentWithAbsoluteProxyURL(r, "localhost", input)
expected := `<p><img src="http://localhost/proxy/LdPNR1GBDigeeNp2ArUQRyZsVqT_PWLfHGjYFrrWWIY=/aHR0cHM6Ly93ZWJzaXRlL2ZvbGRlci9pbWFnZS5wbmc=" alt="Test"/></p>`
if expected != output {
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
func TestAbsoluteProxyFilterWithHttpsScheme(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_OPTION", "all")
os.Setenv("PROXY_MEDIA_TYPES", "image")
os.Setenv("PROXY_PRIVATE_KEY", "test")
os.Setenv("HTTPS", "1")
var err error
parser := config.NewParser()
config.Opts, err = parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
r := mux.NewRouter()
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="https://website/folder/image.png" alt="Test"/></p>`
output := RewriteDocumentWithAbsoluteProxyURL(r, "localhost", input)
expected := `<p><img src="https://localhost/proxy/LdPNR1GBDigeeNp2ArUQRyZsVqT_PWLfHGjYFrrWWIY=/aHR0cHM6Ly93ZWJzaXRlL2ZvbGRlci9pbWFnZS5wbmc=" alt="Test"/></p>`
if expected != output {
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
func TestAbsoluteProxyFilterWithHttpsAlwaysAndAudioTag(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_OPTION", "all")
os.Setenv("PROXY_MEDIA_TYPES", "audio")
os.Setenv("PROXY_PRIVATE_KEY", "test")
var err error
parser := config.NewParser()
config.Opts, err = parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
r := mux.NewRouter()
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<audio src="https://website/folder/audio.mp3"></audio>`
output := RewriteDocumentWithAbsoluteProxyURL(r, "localhost", input)
expected := `<audio src="http://localhost/proxy/EmBTvmU5B17wGuONkeknkptYopW_Tl6Y6_W8oYbN_Xs=/aHR0cHM6Ly93ZWJzaXRlL2ZvbGRlci9hdWRpby5tcDM="></audio>`
if expected != output {
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
@@ -174,11 +250,61 @@ func TestProxyFilterWithHttpsAlwaysAndCustomProxyServer(t *testing.T) {
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="https://website/folder/image.png" alt="Test"/></p>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
expected := `<p><img src="https://proxy-example/proxy/aHR0cHM6Ly93ZWJzaXRlL2ZvbGRlci9pbWFnZS5wbmc=" alt="Test"/></p>`
if expected != output {
t.Errorf(`Not expected output: got "%s" instead of "%s"`, output, expected)
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
func TestProxyFilterWithHttpsAlwaysAndIncorrectCustomProxyServer(t *testing.T) {
os.Clearenv()
os.Setenv("PROXY_OPTION", "all")
os.Setenv("PROXY_MEDIA_TYPES", "image")
os.Setenv("PROXY_URL", "http://:8080example.com")
var err error
parser := config.NewParser()
config.Opts, err = parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
r := mux.NewRouter()
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="https://website/folder/image.png" alt="Test"/></p>`
output := RewriteDocumentWithRelativeProxyURL(r, input)
expected := `<p><img src="https://website/folder/image.png" alt="Test"/></p>`
if expected != output {
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
func TestAbsoluteProxyFilterWithHttpsAlwaysAndCustomProxyServer(t *testing.T) {
os.Clearenv()
os.Setenv("MEDIA_PROXY_MODE", "all")
os.Setenv("MEDIA_PROXY_RESOURCE_TYPES", "image")
os.Setenv("MEDIA_PROXY_CUSTOM_URL", "https://proxy-example/proxy")
var err error
parser := config.NewParser()
config.Opts, err = parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
r := mux.NewRouter()
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="https://website/folder/image.png" alt="Test"/></p>`
output := RewriteDocumentWithAbsoluteProxyURL(r, "localhost", input)
expected := `<p><img src="https://proxy-example/proxy/aHR0cHM6Ly93ZWJzaXRlL2ZvbGRlci9pbWFnZS5wbmc=" alt="Test"/></p>`
if expected != output {
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
@@ -198,11 +324,11 @@ func TestProxyFilterWithHttpInvalid(t *testing.T) {
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="http://website/folder/image.png" alt="Test"/></p>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
expected := `<p><img src="/proxy/okK5PsdNY8F082UMQEAbLPeUFfbe2WnNfInNmR9T4WA=/aHR0cDovL3dlYnNpdGUvZm9sZGVyL2ltYWdlLnBuZw==" alt="Test"/></p>`
if expected != output {
t.Errorf(`Not expected output: got "%s" instead of "%s"`, output, expected)
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
@@ -222,11 +348,11 @@ func TestProxyFilterWithHttpsInvalid(t *testing.T) {
r.HandleFunc("/proxy/{encodedDigest}/{encodedURL}", func(w http.ResponseWriter, r *http.Request) {}).Name("proxy")
input := `<p><img src="https://website/folder/image.png" alt="Test"/></p>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
expected := `<p><img src="https://website/folder/image.png" alt="Test"/></p>`
if expected != output {
t.Errorf(`Not expected output: got "%s" instead of "%s"`, output, expected)
t.Errorf(`Not expected output: got %q instead of %q`, output, expected)
}
}
@@ -248,7 +374,7 @@ func TestProxyFilterWithSrcset(t *testing.T) {
input := `<p><img src="http://website/folder/image.png" srcset="http://website/folder/image2.png 656w, http://website/folder/image3.png 360w" alt="test"></p>`
expected := `<p><img src="/proxy/okK5PsdNY8F082UMQEAbLPeUFfbe2WnNfInNmR9T4WA=/aHR0cDovL3dlYnNpdGUvZm9sZGVyL2ltYWdlLnBuZw==" srcset="/proxy/aY5Hb4urDnUCly2vTJ7ExQeeaVS-52O7kjUr2v9VrAs=/aHR0cDovL3dlYnNpdGUvZm9sZGVyL2ltYWdlMi5wbmc= 656w, /proxy/QgAmrJWiAud_nNAsz3F8OTxaIofwAiO36EDzH_YfMzo=/aHR0cDovL3dlYnNpdGUvZm9sZGVyL2ltYWdlMy5wbmc= 360w" alt="test"/></p>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
if expected != output {
t.Errorf(`Not expected output: got %s`, output)
@@ -273,7 +399,7 @@ func TestProxyFilterWithEmptySrcset(t *testing.T) {
input := `<p><img src="http://website/folder/image.png" srcset="" alt="test"></p>`
expected := `<p><img src="/proxy/okK5PsdNY8F082UMQEAbLPeUFfbe2WnNfInNmR9T4WA=/aHR0cDovL3dlYnNpdGUvZm9sZGVyL2ltYWdlLnBuZw==" srcset="" alt="test"/></p>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
if expected != output {
t.Errorf(`Not expected output: got %s`, output)
@@ -298,7 +424,7 @@ func TestProxyFilterWithPictureSource(t *testing.T) {
input := `<picture><source srcset="http://website/folder/image2.png 656w, http://website/folder/image3.png 360w, https://website/some,image.png 2x"></picture>`
expected := `<picture><source srcset="/proxy/aY5Hb4urDnUCly2vTJ7ExQeeaVS-52O7kjUr2v9VrAs=/aHR0cDovL3dlYnNpdGUvZm9sZGVyL2ltYWdlMi5wbmc= 656w, /proxy/QgAmrJWiAud_nNAsz3F8OTxaIofwAiO36EDzH_YfMzo=/aHR0cDovL3dlYnNpdGUvZm9sZGVyL2ltYWdlMy5wbmc= 360w, /proxy/ZIw0hv8WhSTls5aSqhnFaCXlUrKIqTnBRaY0-NaLnds=/aHR0cHM6Ly93ZWJzaXRlL3NvbWUsaW1hZ2UucG5n 2x"/></picture>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
if expected != output {
t.Errorf(`Not expected output: got %s`, output)
@@ -323,7 +449,7 @@ func TestProxyFilterOnlyNonHTTPWithPictureSource(t *testing.T) {
input := `<picture><source srcset="http://website/folder/image2.png 656w, https://website/some,image.png 2x"></picture>`
expected := `<picture><source srcset="/proxy/aY5Hb4urDnUCly2vTJ7ExQeeaVS-52O7kjUr2v9VrAs=/aHR0cDovL3dlYnNpdGUvZm9sZGVyL2ltYWdlMi5wbmc= 656w, https://website/some,image.png 2x"/></picture>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
if expected != output {
t.Errorf(`Not expected output: got %s`, output)
@@ -347,7 +473,7 @@ func TestProxyWithImageDataURL(t *testing.T) {
input := `<img src="data:image/gif;base64,test">`
expected := `<img src="data:image/gif;base64,test"/>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
if expected != output {
t.Errorf(`Not expected output: got %s`, output)
@@ -371,7 +497,7 @@ func TestProxyWithImageSourceDataURL(t *testing.T) {
input := `<picture><source srcset="data:image/gif;base64,test"/></picture>`
expected := `<picture><source srcset="data:image/gif;base64,test"/></picture>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
if expected != output {
t.Errorf(`Not expected output: got %s`, output)
@@ -396,7 +522,7 @@ func TestProxyFilterWithVideo(t *testing.T) {
input := `<video poster="https://example.com/img.png" src="https://example.com/video.mp4"></video>`
expected := `<video poster="/proxy/aDFfroYL57q5XsojIzATT6OYUCkuVSPXYJQAVrotnLw=/aHR0cHM6Ly9leGFtcGxlLmNvbS9pbWcucG5n" src="/proxy/0y3LR8zlx8S8qJkj1qWFOO6x3a-5yf2gLWjGIJV5yyc=/aHR0cHM6Ly9leGFtcGxlLmNvbS92aWRlby5tcDQ="></video>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
if expected != output {
t.Errorf(`Not expected output: got %s`, output)
@@ -421,7 +547,7 @@ func TestProxyFilterVideoPoster(t *testing.T) {
input := `<video poster="https://example.com/img.png" src="https://example.com/video.mp4"></video>`
expected := `<video poster="/proxy/aDFfroYL57q5XsojIzATT6OYUCkuVSPXYJQAVrotnLw=/aHR0cHM6Ly9leGFtcGxlLmNvbS9pbWcucG5n" src="https://example.com/video.mp4"></video>`
output := ProxyRewriter(r, input)
output := RewriteDocumentWithRelativeProxyURL(r, input)
if expected != output {
t.Errorf(`Not expected output: got %s`, output)


@@ -1,7 +1,7 @@
// SPDX-FileCopyrightText: Copyright The Miniflux Authors. All rights reserved.
// SPDX-License-Identifier: Apache-2.0
package proxy // import "miniflux.app/v2/internal/proxy"
package mediaproxy // import "miniflux.app/v2/internal/mediaproxy"
import (
"strings"
@@ -16,31 +16,29 @@ import (
type urlProxyRewriter func(router *mux.Router, url string) string
// ProxyRewriter replaces media URLs with internal proxy URLs.
func ProxyRewriter(router *mux.Router, data string) string {
return genericProxyRewriter(router, ProxifyURL, data)
func RewriteDocumentWithRelativeProxyURL(router *mux.Router, htmlDocument string) string {
return genericProxyRewriter(router, ProxifyRelativeURL, htmlDocument)
}
// AbsoluteProxyRewriter does the same as ProxyRewriter except it uses absolute URLs.
func AbsoluteProxyRewriter(router *mux.Router, host, data string) string {
func RewriteDocumentWithAbsoluteProxyURL(router *mux.Router, host, htmlDocument string) string {
proxifyFunction := func(router *mux.Router, url string) string {
return AbsoluteProxifyURL(router, host, url)
return ProxifyAbsoluteURL(router, host, url)
}
return genericProxyRewriter(router, proxifyFunction, data)
return genericProxyRewriter(router, proxifyFunction, htmlDocument)
}
func genericProxyRewriter(router *mux.Router, proxifyFunction urlProxyRewriter, data string) string {
proxyOption := config.Opts.ProxyOption()
func genericProxyRewriter(router *mux.Router, proxifyFunction urlProxyRewriter, htmlDocument string) string {
proxyOption := config.Opts.MediaProxyMode()
if proxyOption == "none" {
return data
return htmlDocument
}
doc, err := goquery.NewDocumentFromReader(strings.NewReader(data))
doc, err := goquery.NewDocumentFromReader(strings.NewReader(htmlDocument))
if err != nil {
return data
return htmlDocument
}
for _, mediaType := range config.Opts.ProxyMediaTypes() {
for _, mediaType := range config.Opts.MediaProxyResourceTypes() {
switch mediaType {
case "image":
doc.Find("img, picture source").Each(func(i int, img *goquery.Selection) {
@@ -91,7 +89,7 @@ func genericProxyRewriter(router *mux.Router, proxifyFunction urlProxyRewriter,
output, err := doc.Find("body").First().Html()
if err != nil {
return data
return htmlDocument
}
return output


@@ -0,0 +1,70 @@
// SPDX-FileCopyrightText: Copyright The Miniflux Authors. All rights reserved.
// SPDX-License-Identifier: Apache-2.0
package mediaproxy // import "miniflux.app/v2/internal/mediaproxy"
import (
"crypto/hmac"
"crypto/sha256"
"encoding/base64"
"log/slog"
"net/url"
"path"
"miniflux.app/v2/internal/http/route"
"github.com/gorilla/mux"
"miniflux.app/v2/internal/config"
)
func ProxifyRelativeURL(router *mux.Router, mediaURL string) string {
if mediaURL == "" {
return ""
}
if customProxyURL := config.Opts.MediaCustomProxyURL(); customProxyURL != "" {
return proxifyURLWithCustomProxy(mediaURL, customProxyURL)
}
mac := hmac.New(sha256.New, config.Opts.MediaProxyPrivateKey())
mac.Write([]byte(mediaURL))
digest := mac.Sum(nil)
return route.Path(router, "proxy", "encodedDigest", base64.URLEncoding.EncodeToString(digest), "encodedURL", base64.URLEncoding.EncodeToString([]byte(mediaURL)))
}
func ProxifyAbsoluteURL(router *mux.Router, host, mediaURL string) string {
if mediaURL == "" {
return ""
}
if customProxyURL := config.Opts.MediaCustomProxyURL(); customProxyURL != "" {
return proxifyURLWithCustomProxy(mediaURL, customProxyURL)
}
proxifiedUrl := ProxifyRelativeURL(router, mediaURL)
scheme := "http"
if config.Opts.HTTPS {
scheme = "https"
}
return scheme + "://" + host + proxifiedUrl
}
func proxifyURLWithCustomProxy(mediaURL, customProxyURL string) string {
if customProxyURL == "" {
return mediaURL
}
proxyUrl, err := url.Parse(customProxyURL)
if err != nil {
slog.Error("Incorrect custom media proxy URL",
slog.String("custom_proxy_url", customProxyURL),
slog.Any("error", err),
)
return mediaURL
}
proxyUrl.Path = path.Join(proxyUrl.Path, base64.URLEncoding.EncodeToString([]byte(mediaURL)))
return proxyUrl.String()
}
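The digest scheme used by `ProxifyRelativeURL` above can be reproduced in isolation. This is a minimal sketch, assuming the `PROXY_PRIVATE_KEY` value (`test` in the tests above) is used verbatim as the HMAC key; `buildProxyPath` is a hypothetical helper, not part of the Miniflux API:

```go
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/base64"
	"fmt"
)

// buildProxyPath mirrors the scheme in ProxifyRelativeURL: an HMAC-SHA256
// digest of the media URL keyed with the proxy private key, followed by the
// base64url-encoded URL itself, both as path segments of the "proxy" route.
func buildProxyPath(privateKey []byte, mediaURL string) string {
	mac := hmac.New(sha256.New, privateKey)
	mac.Write([]byte(mediaURL))
	digest := base64.URLEncoding.EncodeToString(mac.Sum(nil))
	encodedURL := base64.URLEncoding.EncodeToString([]byte(mediaURL))
	return "/proxy/" + digest + "/" + encodedURL
}

func main() {
	// Matches the expected value in TestProxyFilterWithHttpsAlways above.
	fmt.Println(buildProxyPath([]byte("test"), "https://website/folder/image.png"))
}
```

The server side can recompute the HMAC from the decoded URL and compare digests in constant time, which is what makes the proxy route safe to expose without an open-redirect risk.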


@@ -159,25 +159,7 @@ type FeedCreationRequestFromSubscriptionDiscovery struct {
ETag string
LastModified string
FeedURL string `json:"feed_url"`
CategoryID int64 `json:"category_id"`
UserAgent string `json:"user_agent"`
Cookie string `json:"cookie"`
Username string `json:"username"`
Password string `json:"password"`
Crawler bool `json:"crawler"`
Disabled bool `json:"disabled"`
NoMediaPlayer bool `json:"no_media_player"`
IgnoreHTTPCache bool `json:"ignore_http_cache"`
AllowSelfSignedCertificates bool `json:"allow_self_signed_certificates"`
FetchViaProxy bool `json:"fetch_via_proxy"`
ScraperRules string `json:"scraper_rules"`
RewriteRules string `json:"rewrite_rules"`
BlocklistRules string `json:"blocklist_rules"`
KeeplistRules string `json:"keeplist_rules"`
HideGlobally bool `json:"hide_globally"`
UrlRewriteRules string `json:"urlrewrite_rules"`
DisableHTTP2 bool `json:"disable_http2"`
FeedCreationRequest
}
// FeedModificationRequest represents the request to update a feed.


@@ -1,52 +0,0 @@
// SPDX-FileCopyrightText: Copyright The Miniflux Authors. All rights reserved.
// SPDX-License-Identifier: Apache-2.0
package proxy // import "miniflux.app/v2/internal/proxy"
import (
"crypto/hmac"
"crypto/sha256"
"encoding/base64"
"net/url"
"path"
"miniflux.app/v2/internal/http/route"
"github.com/gorilla/mux"
"miniflux.app/v2/internal/config"
)
// ProxifyURL generates a relative URL for a proxified resource.
func ProxifyURL(router *mux.Router, link string) string {
if link == "" {
return ""
}
if proxyImageUrl := config.Opts.ProxyUrl(); proxyImageUrl != "" {
proxyUrl, err := url.Parse(proxyImageUrl)
if err != nil {
return ""
}
proxyUrl.Path = path.Join(proxyUrl.Path, base64.URLEncoding.EncodeToString([]byte(link)))
return proxyUrl.String()
}
mac := hmac.New(sha256.New, config.Opts.ProxyPrivateKey())
mac.Write([]byte(link))
digest := mac.Sum(nil)
return route.Path(router, "proxy", "encodedDigest", base64.URLEncoding.EncodeToString(digest), "encodedURL", base64.URLEncoding.EncodeToString([]byte(link)))
}
// AbsoluteProxifyURL generates an absolute URL for a proxified resource.
func AbsoluteProxifyURL(router *mux.Router, host, link string) string {
proxifiedUrl := ProxifyURL(router, link)
if config.Opts.ProxyUrl() == "" {
return proxifiedUrl
}
if config.Opts.HTTPS {
return "https://" + host + proxifiedUrl
}
return "http://" + host + proxifiedUrl
}


@@ -65,6 +65,12 @@ func (a *Atom10Adapter) BuildFeed(baseURL string) *model.Feed {
feed.IconURL = absoluteLogoURL
}
}
feed.Entries = a.populateEntries(feed.SiteURL)
return feed
}
func (a *Atom10Adapter) populateEntries(siteURL string) model.Entries {
entries := make(model.Entries, 0, len(a.atomFeed.Entries))
for _, atomEntry := range a.atomFeed.Entries {
entry := model.NewEntry()
@@ -72,7 +78,7 @@ func (a *Atom10Adapter) BuildFeed(baseURL string) *model.Feed {
// Populate the entry URL.
entry.URL = atomEntry.Links.OriginalLink()
if entry.URL != "" {
if absoluteEntryURL, err := urllib.AbsoluteURL(feed.SiteURL, entry.URL); err == nil {
if absoluteEntryURL, err := urllib.AbsoluteURL(siteURL, entry.URL); err == nil {
entry.URL = absoluteEntryURL
}
}
@@ -81,27 +87,27 @@ func (a *Atom10Adapter) BuildFeed(baseURL string) *model.Feed {
entry.Content = atomEntry.Content.Body()
if entry.Content == "" {
entry.Content = atomEntry.Summary.Body()
}
if entry.Content == "" {
entry.Content = atomEntry.FirstMediaDescription()
if entry.Content == "" {
entry.Content = atomEntry.FirstMediaDescription()
}
}
// Populate the entry title.
entry.Title = atomEntry.Title.Title()
if entry.Title == "" {
entry.Title = sanitizer.TruncateHTML(entry.Content, 100)
}
if entry.Title == "" {
entry.Title = entry.URL
if entry.Title == "" {
entry.Title = entry.URL
}
}
// Populate the entry author.
authors := atomEntry.Authors.PersonNames()
if len(authors) == 0 {
authors = append(authors, a.atomFeed.Authors.PersonNames()...)
authors = a.atomFeed.Authors.PersonNames()
}
authors = slices.Compact(authors)
sort.Strings(authors)
authors = slices.Compact(authors)
entry.Author = strings.Join(authors, ", ")
// Populate the entry date.
@@ -126,13 +132,10 @@ func (a *Atom10Adapter) BuildFeed(baseURL string) *model.Feed {
// Populate categories.
categories := atomEntry.Categories.CategoryNames()
if len(categories) == 0 {
categories = append(categories, a.atomFeed.Categories.CategoryNames()...)
}
if len(categories) > 0 {
categories = slices.Compact(categories)
sort.Strings(categories)
entry.Tags = categories
categories = a.atomFeed.Categories.CategoryNames()
}
sort.Strings(categories)
entry.Tags = slices.Compact(categories)
// Populate the commentsURL if defined.
// See https://tools.ietf.org/html/rfc4685#section-4
@@ -155,27 +158,42 @@ func (a *Atom10Adapter) BuildFeed(baseURL string) *model.Feed {
uniqueEnclosuresMap := make(map[string]bool)
for _, mediaThumbnail := range atomEntry.AllMediaThumbnails() {
if _, found := uniqueEnclosuresMap[mediaThumbnail.URL]; !found {
uniqueEnclosuresMap[mediaThumbnail.URL] = true
entry.Enclosures = append(entry.Enclosures, &model.Enclosure{
URL: mediaThumbnail.URL,
MimeType: mediaThumbnail.MimeType(),
Size: mediaThumbnail.Size(),
})
mediaURL := strings.TrimSpace(mediaThumbnail.URL)
if mediaURL == "" {
continue
}
if _, found := uniqueEnclosuresMap[mediaURL]; !found {
if mediaAbsoluteURL, err := urllib.AbsoluteURL(siteURL, mediaURL); err != nil {
slog.Debug("Unable to build absolute URL for media thumbnail",
slog.String("url", mediaThumbnail.URL),
slog.String("site_url", siteURL),
slog.Any("error", err),
)
} else {
uniqueEnclosuresMap[mediaAbsoluteURL] = true
entry.Enclosures = append(entry.Enclosures, &model.Enclosure{
URL: mediaAbsoluteURL,
MimeType: mediaThumbnail.MimeType(),
Size: mediaThumbnail.Size(),
})
}
}
}
for _, link := range atomEntry.Links {
if strings.EqualFold(link.Rel, "enclosure") {
if link.Href == "" {
continue
}
if _, found := uniqueEnclosuresMap[link.Href]; !found {
uniqueEnclosuresMap[link.Href] = true
for _, link := range atomEntry.Links.findAllLinksWithRelation("enclosure") {
absoluteEnclosureURL, err := urllib.AbsoluteURL(siteURL, link.Href)
if err != nil {
slog.Debug("Unable to resolve absolute URL for enclosure",
slog.String("enclosure_url", link.Href),
slog.String("entry_url", entry.URL),
slog.Any("error", err),
)
} else {
if _, found := uniqueEnclosuresMap[absoluteEnclosureURL]; !found {
uniqueEnclosuresMap[absoluteEnclosureURL] = true
length, _ := strconv.ParseInt(link.Length, 10, 0)
entry.Enclosures = append(entry.Enclosures, &model.Enclosure{
URL: link.Href,
URL: absoluteEnclosureURL,
MimeType: link.Type,
Size: length,
})
@@ -184,29 +202,53 @@ func (a *Atom10Adapter) BuildFeed(baseURL string) *model.Feed {
}
for _, mediaContent := range atomEntry.AllMediaContents() {
if _, found := uniqueEnclosuresMap[mediaContent.URL]; !found {
uniqueEnclosuresMap[mediaContent.URL] = true
entry.Enclosures = append(entry.Enclosures, &model.Enclosure{
URL: mediaContent.URL,
MimeType: mediaContent.MimeType(),
Size: mediaContent.Size(),
})
mediaURL := strings.TrimSpace(mediaContent.URL)
if mediaURL == "" {
continue
}
if mediaAbsoluteURL, err := urllib.AbsoluteURL(siteURL, mediaURL); err != nil {
slog.Debug("Unable to build absolute URL for media content",
slog.String("url", mediaContent.URL),
slog.String("site_url", siteURL),
slog.Any("error", err),
)
} else {
if _, found := uniqueEnclosuresMap[mediaAbsoluteURL]; !found {
uniqueEnclosuresMap[mediaAbsoluteURL] = true
entry.Enclosures = append(entry.Enclosures, &model.Enclosure{
URL: mediaAbsoluteURL,
MimeType: mediaContent.MimeType(),
Size: mediaContent.Size(),
})
}
}
}
for _, mediaPeerLink := range atomEntry.AllMediaPeerLinks() {
if _, found := uniqueEnclosuresMap[mediaPeerLink.URL]; !found {
uniqueEnclosuresMap[mediaPeerLink.URL] = true
entry.Enclosures = append(entry.Enclosures, &model.Enclosure{
URL: mediaPeerLink.URL,
MimeType: mediaPeerLink.MimeType(),
Size: mediaPeerLink.Size(),
})
mediaURL := strings.TrimSpace(mediaPeerLink.URL)
if mediaURL == "" {
continue
}
if mediaAbsoluteURL, err := urllib.AbsoluteURL(siteURL, mediaURL); err != nil {
slog.Debug("Unable to build absolute URL for media peer link",
slog.String("url", mediaPeerLink.URL),
slog.String("site_url", siteURL),
slog.Any("error", err),
)
} else {
if _, found := uniqueEnclosuresMap[mediaAbsoluteURL]; !found {
uniqueEnclosuresMap[mediaAbsoluteURL] = true
entry.Enclosures = append(entry.Enclosures, &model.Enclosure{
URL: mediaAbsoluteURL,
MimeType: mediaPeerLink.MimeType(),
Size: mediaPeerLink.Size(),
})
}
}
}
feed.Entries = append(feed.Entries, entry)
entries = append(entries, entry)
}
return feed
return entries
}
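The repeated `urllib.AbsoluteURL(siteURL, …)` calls above resolve relative enclosure and media URLs against the feed's site URL. A simplified, self-contained stand-in built on `net/url` (the whitespace trimming mirrors the `strings.TrimSpace` calls in the adapter, not `urllib` itself; `absoluteURL` is an illustrative helper, not the real `urllib.AbsoluteURL`):

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// absoluteURL trims stray whitespace (as the new trailing-space tests above
// exercise) and, when the input is relative, resolves it against the base.
func absoluteURL(baseURL, input string) (string, error) {
	input = strings.TrimSpace(input)
	u, err := url.Parse(input)
	if err != nil {
		return "", err
	}
	if u.IsAbs() {
		return u.String(), nil
	}
	base, err := url.Parse(baseURL)
	if err != nil {
		return "", err
	}
	return base.ResolveReference(u).String(), nil
}

func main() {
	out, _ := absoluteURL("https://example.org/", " /myaudiofile.mp3 ")
	fmt.Println(out) // https://example.org/myaudiofile.mp3
}
```

Resolving before deduplicating is what lets `uniqueEnclosuresMap` treat ` /file.mp3 ` and `https://example.org/file.mp3` as the same enclosure.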


@@ -217,12 +217,31 @@ func TestParseFeedURL(t *testing.T) {
}
}
func TestParseFeedWithRelativeURL(t *testing.T) {
func TestParseFeedWithRelativeFeedURL(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title>Example Feed</title>
<link rel="alternate" type="text/html" href="https://example.org/"/>
<link rel="self" type="application/atom+xml" href="/feed"/>
<updated>2003-12-13T18:30:02Z</updated>
</feed>`
feed, err := Parse("https://example.org/", bytes.NewReader([]byte(data)), "10")
if err != nil {
t.Fatal(err)
}
if feed.FeedURL != "https://example.org/feed" {
t.Errorf("Incorrect feed URL, got: %s", feed.FeedURL)
}
}
func TestParseFeedWithRelativeSiteURL(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title>Example Feed</title>
<link href="/blog/atom.xml" rel="self" type="application/atom+xml"/>
<link href="/blog"/>
<link href="/blog "/>
<entry>
<title>Test</title>
@@ -241,15 +260,47 @@ func TestParseFeedWithRelativeURL(t *testing.T) {
}
if feed.FeedURL != "https://example.org/blog/atom.xml" {
t.Errorf("Incorrect feed URL, got: %s", feed.FeedURL)
t.Errorf("Incorrect feed URL, got: %q", feed.FeedURL)
}
if feed.SiteURL != "https://example.org/blog" {
t.Errorf("Incorrect site URL, got: %s", feed.SiteURL)
t.Errorf("Incorrect site URL, got: %q", feed.SiteURL)
}
if feed.Entries[0].URL != "https://example.org/blog/article.html" {
t.Errorf("Incorrect entry URL, got: %s", feed.Entries[0].URL)
t.Errorf("Incorrect entry URL, got: %q", feed.Entries[0].URL)
}
}
func TestParseFeedSiteURLWithTrailingSpace(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<link href="http://example.org "/>
</feed>`
feed, err := Parse("https://example.org/", bytes.NewReader([]byte(data)), "10")
if err != nil {
t.Fatal(err)
}
if feed.SiteURL != "http://example.org" {
t.Errorf("Incorrect site URL, got: %q", feed.SiteURL)
}
}
func TestParseFeedWithFeedURLWithTrailingSpace(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<link href="/blog/atom.xml " rel="self" type="application/atom+xml"/>
</feed>`
feed, err := Parse("https://example.org/", bytes.NewReader([]byte(data)), "10")
if err != nil {
t.Fatal(err)
}
if feed.FeedURL != "https://example.org/blog/atom.xml" {
t.Errorf("Incorrect site URL, got: %q", feed.FeedURL)
}
}
@@ -1054,7 +1105,7 @@ func TestParseEntryWithEnclosures(t *testing.T) {
}
if len(feed.Entries) != 1 {
t.Errorf("Incorrect number of entries, got: %d", len(feed.Entries))
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if feed.Entries[0].URL != "http://www.example.org/entries/1" {
@@ -1089,6 +1140,89 @@ func TestParseEntryWithEnclosures(t *testing.T) {
}
}
func TestParseEntryWithRelativeEnclosureURL(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<id>https://www.example.org/myfeed</id>
<title>My Podcast Feed</title>
<link href="https://example.org" />
<link rel="self" href="https://example.org/myfeed" />
<entry>
<id>https://www.example.org/entries/1</id>
<title>Atom 1.0</title>
<updated>2005-07-15T12:00:00Z</updated>
<link href="https://www.example.org/entries/1" />
<link rel="enclosure"
type="audio/mpeg"
title="MP3"
href=" /myaudiofile.mp3 "
length="1234" />
</content>
</entry>
</feed>`
feed, err := Parse("https://example.org/", bytes.NewReader([]byte(data)), "10")
if err != nil {
t.Fatal(err)
}
if len(feed.Entries) != 1 {
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if len(feed.Entries[0].Enclosures) != 1 {
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
if feed.Entries[0].Enclosures[0].URL != "https://example.org/myaudiofile.mp3" {
t.Errorf("Incorrect enclosure URL, got: %q", feed.Entries[0].Enclosures[0].URL)
}
}
func TestParseEntryWithDuplicateEnclosureURL(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<id>http://www.example.org/myfeed</id>
<title>My Podcast Feed</title>
<link href="http://example.org" />
<link rel="self" href="http://example.org/myfeed" />
<entry>
<id>http://www.example.org/entries/1</id>
<title>Atom 1.0</title>
<updated>2005-07-15T12:00:00Z</updated>
<link href="http://www.example.org/entries/1" />
<link rel="enclosure"
type="audio/mpeg"
title="MP3"
href="http://www.example.org/myaudiofile.mp3"
length="1234" />
<link rel="enclosure"
type="audio/mpeg"
title="MP3"
href=" http://www.example.org/myaudiofile.mp3 "
length="1234" />
</content>
</entry>
</feed>`
feed, err := Parse("https://example.org/", bytes.NewReader([]byte(data)), "10")
if err != nil {
t.Fatal(err)
}
if len(feed.Entries) != 1 {
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if len(feed.Entries[0].Enclosures) != 1 {
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
if feed.Entries[0].Enclosures[0].URL != "http://www.example.org/myaudiofile.mp3" {
t.Errorf("Incorrect enclosure URL, got: %q", feed.Entries[0].Enclosures[0].URL)
}
}
func TestParseEntryWithoutEnclosureURL(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
@@ -1283,20 +1417,25 @@ func TestParseWithInvalidCharacterEntity(t *testing.T) {
func TestParseMediaGroup(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/">
<id>http://www.example.org/myfeed</id>
<id>https://www.example.org/myfeed</id>
<title>My Video Feed</title>
<updated>2005-07-15T12:00:00Z</updated>
<link href="http://example.org" />
<link rel="self" href="http://example.org/myfeed" />
<link href="https://example.org" />
<link rel="self" href="https://example.org/myfeed" />
<entry>
<id>http://www.example.org/entries/1</id>
<id>https://www.example.org/entries/1</id>
<title>Some Video</title>
<updated>2005-07-15T12:00:00Z</updated>
<link href="http://www.example.org/entries/1" />
<link href="https://www.example.org/entries/1" />
<media:group>
<media:title>Another title</media:title>
<media:content url="https://www.youtube.com/v/abcd" type="application/x-shockwave-flash" width="640" height="390"/>
<media:thumbnail url="https://example.org/thumbnail.jpg" width="480" height="360"/>
<media:content url=" /v/efg " type="application/x-shockwave-flash" width="640" height="390"/>
<media:content url=" " type="application/x-shockwave-flash" width="640" height="390"/>
<media:thumbnail url="https://www.example.org/duplicate-thumbnail.jpg" width="480" height="360"/>
<media:thumbnail url="https://www.example.org/duplicate-thumbnail.jpg" width="480" height="360"/>
<media:thumbnail url=" /thumbnail2.jpg " width="480" height="360"/>
<media:thumbnail url=" " width="480" height="360"/>
<media:description>Some description
A website: http://example.org/</media:description>
</media:group>
@@ -1309,18 +1448,10 @@ A website: http://example.org/</media:description>
}
if len(feed.Entries) != 1 {
t.Errorf("Incorrect number of entries, got: %d", len(feed.Entries))
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if feed.Entries[0].URL != "http://www.example.org/entries/1" {
t.Errorf("Incorrect entry URL, got: %s", feed.Entries[0].URL)
}
if feed.Entries[0].Content != `Some description<br>A website: <a href="http://example.org/">http://example.org/</a>` {
t.Errorf("Incorrect entry content, got: %q", feed.Entries[0].Content)
}
if len(feed.Entries[0].Enclosures) != 2 {
if len(feed.Entries[0].Enclosures) != 4 {
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
@@ -1329,8 +1460,10 @@ A website: http://example.org/</media:description>
mimeType string
size int64
}{
{"https://example.org/thumbnail.jpg", "image/*", 0},
{"https://www.example.org/duplicate-thumbnail.jpg", "image/*", 0},
{"https://example.org/thumbnail2.jpg", "image/*", 0},
{"https://www.youtube.com/v/abcd", "application/x-shockwave-flash", 0},
{"https://example.org/v/efg", "application/x-shockwave-flash", 0},
}
for index, enclosure := range feed.Entries[0].Enclosures {
@@ -1351,19 +1484,26 @@ A website: http://example.org/</media:description>
func TestParseMediaElements(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/">
<id>http://www.example.org/myfeed</id>
<id>https://www.example.org/myfeed</id>
<title>My Video Feed</title>
<updated>2005-07-15T12:00:00Z</updated>
<link href="http://example.org" />
<link rel="self" href="http://example.org/myfeed" />
<link href="https://example.org" />
<link rel="self" href="https://example.org/myfeed" />
<entry>
<id>http://www.example.org/entries/1</id>
<id>https://www.example.org/entries/1</id>
<title>Some Video</title>
<updated>2005-07-15T12:00:00Z</updated>
<link href="http://www.example.org/entries/1" />
<link href="https://www.example.org/entries/1" />
<media:title>Another title</media:title>
<media:content url="https://www.youtube.com/v/abcd" type="application/x-shockwave-flash" width="640" height="390"/>
<media:thumbnail url="https://example.org/thumbnail.jpg" width="480" height="360"/>
<media:content url=" /relative/media.mp4 " type="application/x-shockwave-flash" width="640" height="390"/>
<media:content url=" " type="application/x-shockwave-flash" width="640" height="390"/>
<media:thumbnail url="https://example.org/duplicated-thumbnail.jpg" width="480" height="360"/>
<media:thumbnail url=" https://example.org/duplicated-thumbnail.jpg " width="480" height="360"/>
<media:thumbnail url=" " width="480" height="360"/>
<media:peerLink type="application/x-bittorrent" href=" http://www.example.org/sampleFile.torrent " />
<media:peerLink type="application/x-bittorrent" href=" /sampleFile2.torrent" />
<media:peerLink type="application/x-bittorrent" href=" " />
<media:description>Some description
A website: http://example.org/</media:description>
</entry>
@@ -1375,18 +1515,10 @@ A website: http://example.org/</media:description>
}
if len(feed.Entries) != 1 {
t.Errorf("Incorrect number of entries, got: %d", len(feed.Entries))
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if feed.Entries[0].URL != "http://www.example.org/entries/1" {
t.Errorf("Incorrect entry URL, got: %s", feed.Entries[0].URL)
}
if feed.Entries[0].Content != `Some description<br>A website: <a href="http://example.org/">http://example.org/</a>` {
t.Errorf("Incorrect entry content, got: %q", feed.Entries[0].Content)
}
if len(feed.Entries[0].Enclosures) != 2 {
if len(feed.Entries[0].Enclosures) != 5 {
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
@ -1395,8 +1527,11 @@ A website: http://example.org/</media:description>
mimeType string
size int64
}{
{"https://example.org/thumbnail.jpg", "image/*", 0},
{"https://example.org/duplicated-thumbnail.jpg", "image/*", 0},
{"https://www.youtube.com/v/abcd", "application/x-shockwave-flash", 0},
{"https://example.org/relative/media.mp4", "application/x-shockwave-flash", 0},
{"http://www.example.org/sampleFile.torrent", "application/x-bittorrent", 0},
{"https://example.org/sampleFile2.torrent", "application/x-bittorrent", 0},
}
for index, enclosure := range feed.Entries[0].Enclosures {


@ -96,6 +96,21 @@ func (a AtomLinks) firstLinkWithRelationAndType(relation string, contentTypes ..
return ""
}
func (a AtomLinks) findAllLinksWithRelation(relation string) []*AtomLink {
var links []*AtomLink
for _, link := range a {
if strings.EqualFold(link.Rel, relation) {
link.Href = strings.TrimSpace(link.Href)
if link.Href != "" {
links = append(links, link)
}
}
}
return links
}
// The "atom:category" element conveys information about a category
// associated with an entry or feed. This specification assigns no
// meaning to the content (if any) of this element.


@ -35,8 +35,3 @@ func CharsetReader(charsetLabel string, input io.Reader) (io.Reader, error) {
// Transform document to UTF-8 from the specified encoding in XML prolog.
return charset.NewReaderLabel(charsetLabel, r)
}
// CharsetReaderFromContentType is used when the encoding is not specified for the input document.
func CharsetReaderFromContentType(contentType string, input io.Reader) (io.Reader, error) {
return charset.NewReader(input, contentType)
}


@ -0,0 +1,55 @@
package fetcher
import (
"compress/gzip"
"io"
"github.com/andybalholm/brotli"
)
type brotliReadCloser struct {
body io.ReadCloser
brotliReader io.Reader
}
func NewBrotliReadCloser(body io.ReadCloser) *brotliReadCloser {
return &brotliReadCloser{
body: body,
brotliReader: brotli.NewReader(body),
}
}
func (b *brotliReadCloser) Read(p []byte) (n int, err error) {
return b.brotliReader.Read(p)
}
func (b *brotliReadCloser) Close() error {
return b.body.Close()
}
type gzipReadCloser struct {
body io.ReadCloser
gzipReader io.Reader
gzipErr error
}
func NewGzipReadCloser(body io.ReadCloser) *gzipReadCloser {
return &gzipReadCloser{body: body}
}
func (gz *gzipReadCloser) Read(p []byte) (n int, err error) {
if gz.gzipReader == nil {
if gz.gzipErr == nil {
gz.gzipReader, gz.gzipErr = gzip.NewReader(gz.body)
}
if gz.gzipErr != nil {
return 0, gz.gzipErr
}
}
return gz.gzipReader.Read(p)
}
func (gz *gzipReadCloser) Close() error {
return gz.body.Close()
}


@ -169,6 +169,7 @@ func (r *RequestBuilder) ExecuteRequest(requestURL string) (*http.Response, erro
}
req.Header = r.headers
req.Header.Set("Accept-Encoding", "br, gzip")
req.Header.Set("Accept", defaultAcceptHeader)
req.Header.Set("Connection", "close")


@ -8,10 +8,12 @@ import (
"errors"
"fmt"
"io"
"log/slog"
"net"
"net/http"
"net/url"
"os"
"strings"
"miniflux.app/v2/internal/locale"
)
@ -71,12 +73,31 @@ func (r *ResponseHandler) Close() {
}
}
func (r *ResponseHandler) getReader(maxBodySize int64) io.ReadCloser {
contentEncoding := strings.ToLower(r.httpResponse.Header.Get("Content-Encoding"))
slog.Debug("Request response",
slog.String("effective_url", r.EffectiveURL()),
slog.String("content_length", r.httpResponse.Header.Get("Content-Length")),
slog.String("content_encoding", contentEncoding),
slog.String("content_type", r.httpResponse.Header.Get("Content-Type")),
)
reader := r.httpResponse.Body
switch contentEncoding {
case "br":
reader = NewBrotliReadCloser(r.httpResponse.Body)
case "gzip":
reader = NewGzipReadCloser(r.httpResponse.Body)
}
return http.MaxBytesReader(nil, reader, maxBodySize)
}
func (r *ResponseHandler) Body(maxBodySize int64) io.ReadCloser {
return http.MaxBytesReader(nil, r.httpResponse.Body, maxBodySize)
return r.getReader(maxBodySize)
}
func (r *ResponseHandler) ReadBody(maxBodySize int64) ([]byte, *locale.LocalizedErrorWrapper) {
limitedReader := http.MaxBytesReader(nil, r.httpResponse.Body, maxBodySize)
limitedReader := r.getReader(maxBodySize)
buffer, err := io.ReadAll(limitedReader)
if err != nil && err != io.EOF {


@ -15,11 +15,11 @@ import (
"miniflux.app/v2/internal/config"
"miniflux.app/v2/internal/crypto"
"miniflux.app/v2/internal/model"
"miniflux.app/v2/internal/reader/encoding"
"miniflux.app/v2/internal/reader/fetcher"
"miniflux.app/v2/internal/urllib"
"github.com/PuerkitoBio/goquery"
"golang.org/x/net/html/charset"
)
type IconFinder struct {
@ -191,7 +191,7 @@ func findIconURLsFromHTMLDocument(body io.Reader, contentType string) ([]string,
"link[rel='apple-touch-icon-precomposed.png']",
}
htmlDocumentReader, err := encoding.CharsetReaderFromContentType(contentType, body)
htmlDocumentReader, err := charset.NewReader(body, contentType)
if err != nil {
return nil, fmt.Errorf("icon: unable to create charset reader: %w", err)
}


@ -24,15 +24,15 @@ func NewJSONAdapter(jsonFeed *JSONFeed) *JSONAdapter {
return &JSONAdapter{jsonFeed}
}
func (j *JSONAdapter) BuildFeed(feedURL string) *model.Feed {
func (j *JSONAdapter) BuildFeed(baseURL string) *model.Feed {
feed := &model.Feed{
Title: strings.TrimSpace(j.jsonFeed.Title),
FeedURL: j.jsonFeed.FeedURL,
SiteURL: j.jsonFeed.HomePageURL,
FeedURL: strings.TrimSpace(j.jsonFeed.FeedURL),
SiteURL: strings.TrimSpace(j.jsonFeed.HomePageURL),
}
if feed.FeedURL == "" {
feed.FeedURL = feedURL
feed.FeedURL = strings.TrimSpace(baseURL)
}
// Fallback to the feed URL if the site URL is empty.
@ -40,11 +40,11 @@ func (j *JSONAdapter) BuildFeed(feedURL string) *model.Feed {
feed.SiteURL = feed.FeedURL
}
if feedURL, err := urllib.AbsoluteURL(feedURL, j.jsonFeed.FeedURL); err == nil {
if feedURL, err := urllib.AbsoluteURL(baseURL, feed.FeedURL); err == nil {
feed.FeedURL = feedURL
}
if siteURL, err := urllib.AbsoluteURL(feedURL, j.jsonFeed.HomePageURL); err == nil {
if siteURL, err := urllib.AbsoluteURL(baseURL, feed.SiteURL); err == nil {
feed.SiteURL = siteURL
}


@ -177,6 +177,82 @@ func TestParsePodcast(t *testing.T) {
}
}
func TestParseFeedWithFeedURLWithTrailingSpace(t *testing.T) {
data := `{
"version": "https://jsonfeed.org/version/1",
"title": "My Example Feed",
"home_page_url": "https://example.org/",
"feed_url": "https://example.org/feed.json ",
"items": []
}`
feed, err := Parse("https://example.org/feed.json", bytes.NewBufferString(data))
if err != nil {
t.Fatal(err)
}
if feed.FeedURL != "https://example.org/feed.json" {
t.Errorf("Incorrect feed URL, got: %s", feed.FeedURL)
}
}
func TestParseFeedWithRelativeFeedURL(t *testing.T) {
data := `{
"version": "https://jsonfeed.org/version/1",
"title": "My Example Feed",
"home_page_url": "https://example.org/",
"feed_url": "/feed.json",
"items": []
}`
feed, err := Parse("https://example.org/feed.json", bytes.NewBufferString(data))
if err != nil {
t.Fatal(err)
}
if feed.FeedURL != "https://example.org/feed.json" {
t.Errorf("Incorrect feed URL, got: %s", feed.FeedURL)
}
}
func TestParseFeedSiteURLWithTrailingSpace(t *testing.T) {
data := `{
"version": "https://jsonfeed.org/version/1",
"title": "My Example Feed",
"home_page_url": "https://example.org/ ",
"feed_url": "https://example.org/feed.json",
"items": []
}`
feed, err := Parse("https://example.org/feed.json", bytes.NewBufferString(data))
if err != nil {
t.Fatal(err)
}
if feed.SiteURL != "https://example.org/" {
t.Errorf("Incorrect site URL, got: %s", feed.SiteURL)
}
}
func TestParseFeedWithRelativeSiteURL(t *testing.T) {
data := `{
"version": "https://jsonfeed.org/version/1",
"title": "My Example Feed",
"home_page_url": "/home ",
"feed_url": "https://example.org/feed.json",
"items": []
}`
feed, err := Parse("https://example.org/feed.json", bytes.NewBufferString(data))
if err != nil {
t.Fatal(err)
}
if feed.SiteURL != "https://example.org/home" {
t.Errorf("Incorrect site URL, got: %s", feed.SiteURL)
}
}
func TestParseFeedWithoutTitle(t *testing.T) {
data := `{
"version": "https://jsonfeed.org/version/1",
@ -772,6 +848,42 @@ func TestParseFeedIcon(t *testing.T) {
}
}
func TestParseFeedWithRelativeAttachmentURL(t *testing.T) {
data := `{
"version": "https://jsonfeed.org/version/1",
"title": "My Example Feed",
"home_page_url": "https://example.org/",
"feed_url": "https://example.org/feed.json",
"items": [
{
"id": "2",
"content_text": "This is a second item.",
"url": "https://example.org/second-item",
"attachments": [
{
"url": " /attachment.mp3 ",
"mime_type": "audio/mpeg",
"size_in_bytes": 123456
}
]
}
]
}`
feed, err := Parse("https://example.org/feed.json", bytes.NewBufferString(data))
if err != nil {
t.Fatal(err)
}
if len(feed.Entries[0].Enclosures) != 1 {
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
if feed.Entries[0].Enclosures[0].URL != "https://example.org/attachment.mp3" {
t.Errorf("Incorrect enclosure URL, got: %q", feed.Entries[0].Enclosures[0].URL)
}
}
func TestParseInvalidJSON(t *testing.T) {
data := `garbage`
_, err := Parse("https://example.org/feed.json", bytes.NewBufferString(data))


@ -86,15 +86,17 @@ type Content struct {
// MimeType returns the attachment mime type.
func (mc *Content) MimeType() string {
switch {
case mc.Type == "" && mc.Medium == "image":
return "image/*"
case mc.Type == "" && mc.Medium == "video":
return "video/*"
case mc.Type == "" && mc.Medium == "audio":
return "audio/*"
case mc.Type != "":
if mc.Type != "" {
return mc.Type
}
switch mc.Medium {
case "image":
return "image/*"
case "video":
return "video/*"
case "audio":
return "audio/*"
default:
return "application/octet-stream"
}
@ -102,9 +104,6 @@ func (mc *Content) MimeType() string {
// Size returns the attachment size.
func (mc *Content) Size() int64 {
if mc.FileSize == "" {
return 0
}
size, _ := strconv.ParseInt(mc.FileSize, 10, 0)
return size
}


@ -23,6 +23,8 @@ import (
"miniflux.app/v2/internal/storage"
"github.com/PuerkitoBio/goquery"
"github.com/tdewolff/minify/v2"
"github.com/tdewolff/minify/v2/html"
)
var (
@ -36,31 +38,38 @@ var (
func ProcessFeedEntries(store *storage.Storage, feed *model.Feed, user *model.User, forceRefresh bool) {
var filteredEntries model.Entries
minifier := minify.New()
minifier.AddFunc("text/html", html.Minify)
// Process older entries first
for i := len(feed.Entries) - 1; i >= 0; i-- {
entry := feed.Entries[i]
slog.Debug("Processing entry",
slog.Int64("user_id", user.ID),
slog.Int64("entry_id", entry.ID),
slog.String("entry_url", entry.URL),
slog.String("entry_hash", entry.Hash),
slog.String("entry_title", entry.Title),
slog.Int64("feed_id", feed.ID),
slog.String("feed_url", feed.FeedURL),
)
if isBlockedEntry(feed, entry) || !isAllowedEntry(feed, entry) {
if isBlockedEntry(feed, entry) || !isAllowedEntry(feed, entry) || !isRecentEntry(entry) {
continue
}
websiteURL := getUrlFromEntry(feed, entry)
entryIsNew := !store.EntryURLExists(feed.ID, entry.URL)
entryIsNew := store.IsNewEntry(feed.ID, entry.Hash)
if feed.Crawler && (entryIsNew || forceRefresh) {
slog.Debug("Scraping entry",
slog.Int64("user_id", user.ID),
slog.Int64("entry_id", entry.ID),
slog.String("entry_url", entry.URL),
slog.String("entry_hash", entry.Hash),
slog.String("entry_title", entry.Title),
slog.Int64("feed_id", feed.ID),
slog.String("feed_url", feed.FeedURL),
slog.Bool("entry_is_new", entryIsNew),
slog.Bool("force_refresh", forceRefresh),
slog.String("website_url", websiteURL),
)
startTime := time.Now()
@ -91,7 +100,6 @@ func ProcessFeedEntries(store *storage.Storage, feed *model.Feed, user *model.Us
if scraperErr != nil {
slog.Warn("Unable to scrape entry",
slog.Int64("user_id", user.ID),
slog.Int64("entry_id", entry.ID),
slog.String("entry_url", entry.URL),
slog.Int64("feed_id", feed.ID),
slog.String("feed_url", feed.FeedURL),
@ -99,7 +107,11 @@ func ProcessFeedEntries(store *storage.Storage, feed *model.Feed, user *model.Us
)
} else if content != "" {
// We replace the entry content only if the scraper doesn't return any error.
entry.Content = content
if minifiedHTML, err := minifier.String("text/html", content); err == nil {
entry.Content = minifiedHTML
} else {
entry.Content = content
}
}
}
@ -135,7 +147,6 @@ func isBlockedEntry(feed *model.Feed, entry *model.Entry) bool {
if compiledBlocklist.MatchString(entry.URL) || compiledBlocklist.MatchString(entry.Title) || compiledBlocklist.MatchString(entry.Author) || containsBlockedTag {
slog.Debug("Blocking entry based on rule",
slog.Int64("entry_id", entry.ID),
slog.String("entry_url", entry.URL),
slog.Int64("feed_id", feed.ID),
slog.String("feed_url", feed.FeedURL),
@ -166,7 +177,6 @@ func isAllowedEntry(feed *model.Feed, entry *model.Entry) bool {
if compiledKeeplist.MatchString(entry.URL) || compiledKeeplist.MatchString(entry.Title) || compiledKeeplist.MatchString(entry.Author) || containsAllowedTag {
slog.Debug("Allow entry based on rule",
slog.Int64("entry_id", entry.ID),
slog.String("entry_url", entry.URL),
slog.Int64("feed_id", feed.ID),
slog.String("feed_url", feed.FeedURL),
@ -179,6 +189,9 @@ func isAllowedEntry(feed *model.Feed, entry *model.Entry) bool {
// ProcessEntryWebPage downloads the entry web page and apply rewrite rules.
func ProcessEntryWebPage(feed *model.Feed, entry *model.Entry, user *model.User) error {
minifier := minify.New()
minifier.AddFunc("text/html", html.Minify)
startTime := time.Now()
websiteURL := getUrlFromEntry(feed, entry)
@ -210,7 +223,11 @@ func ProcessEntryWebPage(feed *model.Feed, entry *model.Entry, user *model.User)
}
if content != "" {
entry.Content = content
if minifiedHTML, err := minifier.String("text/html", content); err == nil {
entry.Content = minifiedHTML
} else {
entry.Content = content
}
if user.ShowReadingTime {
entry.ReadingTime = readingtime.EstimateReadingTime(entry.Content, user.DefaultReadingSpeed, user.CJKReadingSpeed)
}
@ -231,7 +248,6 @@ func getUrlFromEntry(feed *model.Feed, entry *model.Entry) string {
re := regexp.MustCompile(parts[1])
url = re.ReplaceAllString(entry.URL, parts[2])
slog.Debug("Rewriting entry URL",
slog.Int64("entry_id", entry.ID),
slog.String("original_entry_url", entry.URL),
slog.String("rewritten_entry_url", url),
slog.Int64("feed_id", feed.ID),
@ -239,7 +255,6 @@ func getUrlFromEntry(feed *model.Feed, entry *model.Entry) string {
)
} else {
slog.Debug("Cannot find search and replace terms for replace rule",
slog.Int64("entry_id", entry.ID),
slog.String("original_entry_url", entry.URL),
slog.String("rewritten_entry_url", url),
slog.Int64("feed_id", feed.ID),
@ -252,6 +267,11 @@ func getUrlFromEntry(feed *model.Feed, entry *model.Entry) string {
}
func updateEntryReadingTime(store *storage.Storage, feed *model.Feed, entry *model.Entry, entryIsNew bool, user *model.User) {
if !user.ShowReadingTime {
slog.Debug("Skip reading time estimation for this user", slog.Int64("user_id", user.ID))
return
}
if shouldFetchYouTubeWatchTime(entry) {
if entryIsNew {
watchTime, err := fetchYouTubeWatchTime(entry.URL)
@ -267,7 +287,7 @@ func updateEntryReadingTime(store *storage.Storage, feed *model.Feed, entry *mod
}
entry.ReadingTime = watchTime
} else {
entry.ReadingTime = store.GetReadTime(entry, feed)
entry.ReadingTime = store.GetReadTime(feed.ID, entry.Hash)
}
}
@ -286,14 +306,13 @@ func updateEntryReadingTime(store *storage.Storage, feed *model.Feed, entry *mod
}
entry.ReadingTime = watchTime
} else {
entry.ReadingTime = store.GetReadTime(entry, feed)
entry.ReadingTime = store.GetReadTime(feed.ID, entry.Hash)
}
}
// Handle YT error case and non-YT entries.
if entry.ReadingTime == 0 {
if user.ShowReadingTime {
entry.ReadingTime = readingtime.EstimateReadingTime(entry.Content, user.DefaultReadingSpeed, user.CJKReadingSpeed)
}
entry.ReadingTime = readingtime.EstimateReadingTime(entry.Content, user.DefaultReadingSpeed, user.CJKReadingSpeed)
}
}
@ -413,3 +432,10 @@ func parseISO8601(from string) (time.Duration, error) {
return d, nil
}
func isRecentEntry(entry *model.Entry) bool {
if config.Opts.FilterEntryMaxAgeDays() == 0 || entry.Date.After(time.Now().AddDate(0, 0, -config.Opts.FilterEntryMaxAgeDays())) {
return true
}
return false
}


@ -7,6 +7,7 @@ import (
"testing"
"time"
"miniflux.app/v2/internal/config"
"miniflux.app/v2/internal/model"
)
@ -92,3 +93,27 @@ func TestParseISO8601(t *testing.T) {
}
}
}
func TestIsRecentEntry(t *testing.T) {
parser := config.NewParser()
var err error
config.Opts, err = parser.ParseEnvironmentVariables()
if err != nil {
t.Fatalf(`Parsing failure: %v`, err)
}
var scenarios = []struct {
entry *model.Entry
expected bool
}{
{&model.Entry{Title: "Example1", Date: time.Date(2005, 5, 1, 05, 05, 05, 05, time.UTC)}, true},
{&model.Entry{Title: "Example2", Date: time.Date(2010, 5, 1, 05, 05, 05, 05, time.UTC)}, true},
{&model.Entry{Title: "Example3", Date: time.Date(2020, 5, 1, 05, 05, 05, 05, time.UTC)}, true},
{&model.Entry{Title: "Example4", Date: time.Date(2024, 3, 15, 05, 05, 05, 05, time.UTC)}, true},
}
for _, tc := range scenarios {
result := isRecentEntry(tc.entry)
if tc.expected != result {
t.Errorf(`Unexpected result, got %v for entry %q`, result, tc.entry.Title)
}
}
}


@ -24,18 +24,18 @@ func NewRDFAdapter(rdf *RDF) *RDFAdapter {
return &RDFAdapter{rdf}
}
func (r *RDFAdapter) BuildFeed(feedURL string) *model.Feed {
func (r *RDFAdapter) BuildFeed(baseURL string) *model.Feed {
feed := &model.Feed{
Title: stripTags(r.rdf.Channel.Title),
FeedURL: feedURL,
SiteURL: r.rdf.Channel.Link,
FeedURL: strings.TrimSpace(baseURL),
SiteURL: strings.TrimSpace(r.rdf.Channel.Link),
}
if feed.Title == "" {
feed.Title = feedURL
feed.Title = baseURL
}
if siteURL, err := urllib.AbsoluteURL(feedURL, r.rdf.Channel.Link); err == nil {
if siteURL, err := urllib.AbsoluteURL(feed.FeedURL, feed.SiteURL); err == nil {
feed.SiteURL = siteURL
}


@ -289,7 +289,37 @@ func TestParseRDFFeedWithRelativeLink(t *testing.T) {
xmlns="http://purl.org/rss/1.0/">
<channel>
<title>Example Feed</title>
<link>/test/index.html</link>
<link>/test/index.html </link>
</channel>
<item>
<title>Example</title>
<link>http://example.org/item</link>
<description>Test</description>
</item>
</rdf:RDF>`
feed, err := Parse("http://example.org/feed", bytes.NewReader([]byte(data)))
if err != nil {
t.Fatal(err)
}
if feed.SiteURL != "http://example.org/test/index.html" {
t.Errorf(`Incorrect SiteURL, got: %q`, feed.SiteURL)
}
if feed.FeedURL != "http://example.org/feed" {
t.Errorf(`Incorrect FeedURL, got: %q`, feed.FeedURL)
}
}
func TestParseRDFFeedSiteURLWithTrailingSpace(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<rdf:RDF
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://purl.org/rss/1.0/">
<channel>
<title>Example Feed</title>
<link>http://example.org/test/index.html </link>
</channel>
<item>
<title>Example</title>


@ -24,7 +24,7 @@ var predefinedRules = map[string]string{
"monkeyuser.com": "add_image_title",
"mrlovenstein.com": "add_image_title",
"nedroid.com": "add_image_title",
"oglaf.com": "add_image_title",
"oglaf.com": `replace("media.oglaf.com/story/tt(.+).gif"|"media.oglaf.com/comic/$1.jpg"),add_image_title`,
"optipess.com": "add_image_title",
"peebleslab.com": "add_image_title",
"quantamagazine.org": `add_youtube_video_from_id, remove("h6:not(.byline,.post__title__kicker), #comments, .next-post__content, .footer__section, figure .outer--content, script")`,


@ -26,14 +26,15 @@ func NewRSSAdapter(rss *RSS) *RSSAdapter {
return &RSSAdapter{rss}
}
func (r *RSSAdapter) BuildFeed(feedURL string) *model.Feed {
func (r *RSSAdapter) BuildFeed(baseURL string) *model.Feed {
feed := &model.Feed{
Title: html.UnescapeString(strings.TrimSpace(r.rss.Channel.Title)),
FeedURL: feedURL,
SiteURL: r.rss.Channel.Link,
FeedURL: strings.TrimSpace(baseURL),
SiteURL: strings.TrimSpace(r.rss.Channel.Link),
}
if siteURL, err := urllib.AbsoluteURL(feedURL, r.rss.Channel.Link); err == nil {
// Ensure the Site URL is absolute.
if siteURL, err := urllib.AbsoluteURL(baseURL, feed.SiteURL); err == nil {
feed.SiteURL = siteURL
}
@ -41,7 +42,7 @@ func (r *RSSAdapter) BuildFeed(feedURL string) *model.Feed {
for _, atomLink := range r.rss.Channel.AtomLinks.Links {
atomLinkHref := strings.TrimSpace(atomLink.Href)
if atomLinkHref != "" && atomLink.Rel == "self" {
if absoluteFeedURL, err := urllib.AbsoluteURL(feedURL, atomLinkHref); err == nil {
if absoluteFeedURL, err := urllib.AbsoluteURL(feed.FeedURL, atomLinkHref); err == nil {
feed.FeedURL = absoluteFeedURL
break
}
@ -71,7 +72,7 @@ func (r *RSSAdapter) BuildFeed(feedURL string) *model.Feed {
entry := model.NewEntry()
entry.Date = findEntryDate(&item)
entry.Content = findEntryContent(&item)
entry.Enclosures = findEntryEnclosures(&item)
entry.Enclosures = findEntryEnclosures(&item, feed.SiteURL)
// Populate the entry URL.
entryURL := findEntryURL(&item)
@ -89,9 +90,9 @@ func (r *RSSAdapter) BuildFeed(feedURL string) *model.Feed {
entry.Title = findEntryTitle(&item)
if entry.Title == "" {
entry.Title = sanitizer.TruncateHTML(entry.Content, 100)
}
if entry.Title == "" {
entry.Title = entry.URL
if entry.Title == "" {
entry.Title = entry.URL
}
}
entry.Author = findEntryAuthor(&item)
@ -100,11 +101,10 @@ func (r *RSSAdapter) BuildFeed(feedURL string) *model.Feed {
}
// Generate the entry hash.
for _, value := range []string{item.GUID.Data, entryURL} {
if value != "" {
entry.Hash = crypto.Hash(value)
break
}
if item.GUID.Data != "" {
entry.Hash = crypto.Hash(item.GUID.Data)
} else if entryURL != "" {
entry.Hash = crypto.Hash(entryURL)
}
// Find CommentsURL if defined.
@ -120,13 +120,30 @@ func (r *RSSAdapter) BuildFeed(feedURL string) *model.Feed {
}
// Populate entry categories.
entry.Tags = append(entry.Tags, item.Categories...)
entry.Tags = append(entry.Tags, item.MediaCategories.Labels()...)
entry.Tags = append(entry.Tags, r.rss.Channel.Categories...)
entry.Tags = append(entry.Tags, r.rss.Channel.GetItunesCategories()...)
if r.rss.Channel.GooglePlayCategory.Text != "" {
entry.Tags = append(entry.Tags, r.rss.Channel.GooglePlayCategory.Text)
for _, tag := range item.Categories {
if tag != "" {
entry.Tags = append(entry.Tags, tag)
}
}
for _, tag := range item.MediaCategories.Labels() {
if tag != "" {
entry.Tags = append(entry.Tags, tag)
}
}
if len(entry.Tags) == 0 {
for _, tag := range r.rss.Channel.Categories {
if tag != "" {
entry.Tags = append(entry.Tags, tag)
}
}
for _, tag := range r.rss.Channel.GetItunesCategories() {
if tag != "" {
entry.Tags = append(entry.Tags, tag)
}
}
if r.rss.Channel.GooglePlayCategory.Text != "" {
entry.Tags = append(entry.Tags, r.rss.Channel.GooglePlayCategory.Text)
}
}
feed.Entries = append(feed.Entries, entry)
@ -244,18 +261,30 @@ func findEntryAuthor(rssItem *RSSItem) string {
return strings.TrimSpace(sanitizer.StripTags(author))
}
func findEntryEnclosures(rssItem *RSSItem) model.EnclosureList {
func findEntryEnclosures(rssItem *RSSItem, siteURL string) model.EnclosureList {
enclosures := make(model.EnclosureList, 0)
duplicates := make(map[string]bool)
for _, mediaThumbnail := range rssItem.AllMediaThumbnails() {
if _, found := duplicates[mediaThumbnail.URL]; !found {
duplicates[mediaThumbnail.URL] = true
enclosures = append(enclosures, &model.Enclosure{
URL: mediaThumbnail.URL,
MimeType: mediaThumbnail.MimeType(),
Size: mediaThumbnail.Size(),
})
mediaURL := strings.TrimSpace(mediaThumbnail.URL)
if mediaURL == "" {
continue
}
if _, found := duplicates[mediaURL]; !found {
if mediaAbsoluteURL, err := urllib.AbsoluteURL(siteURL, mediaURL); err != nil {
slog.Debug("Unable to build absolute URL for media thumbnail",
slog.String("url", mediaThumbnail.URL),
slog.String("site_url", siteURL),
slog.Any("error", err),
)
} else {
duplicates[mediaAbsoluteURL] = true
enclosures = append(enclosures, &model.Enclosure{
URL: mediaAbsoluteURL,
MimeType: mediaThumbnail.MimeType(),
Size: mediaThumbnail.Size(),
})
}
}
}
@ -264,15 +293,20 @@ func findEntryEnclosures(rssItem *RSSItem) model.EnclosureList {
if rssItem.FeedBurnerEnclosureLink != "" {
filename := path.Base(rssItem.FeedBurnerEnclosureLink)
if strings.Contains(enclosureURL, filename) {
if strings.HasSuffix(enclosureURL, filename) {
enclosureURL = rssItem.FeedBurnerEnclosureLink
}
}
enclosureURL = strings.TrimSpace(enclosureURL)
if enclosureURL == "" {
continue
}
if absoluteEnclosureURL, err := urllib.AbsoluteURL(siteURL, enclosureURL); err == nil {
enclosureURL = absoluteEnclosureURL
}
if _, found := duplicates[enclosureURL]; !found {
duplicates[enclosureURL] = true
@ -285,24 +319,50 @@ func findEntryEnclosures(rssItem *RSSItem) model.EnclosureList {
}
for _, mediaContent := range rssItem.AllMediaContents() {
if _, found := duplicates[mediaContent.URL]; !found {
duplicates[mediaContent.URL] = true
enclosures = append(enclosures, &model.Enclosure{
URL: mediaContent.URL,
MimeType: mediaContent.MimeType(),
Size: mediaContent.Size(),
})
mediaURL := strings.TrimSpace(mediaContent.URL)
if mediaURL == "" {
continue
}
if _, found := duplicates[mediaURL]; !found {
mediaURL := strings.TrimSpace(mediaContent.URL)
if mediaAbsoluteURL, err := urllib.AbsoluteURL(siteURL, mediaURL); err != nil {
slog.Debug("Unable to build absolute URL for media content",
slog.String("url", mediaContent.URL),
slog.String("site_url", siteURL),
slog.Any("error", err),
)
} else {
duplicates[mediaAbsoluteURL] = true
enclosures = append(enclosures, &model.Enclosure{
URL: mediaAbsoluteURL,
MimeType: mediaContent.MimeType(),
Size: mediaContent.Size(),
})
}
}
}
for _, mediaPeerLink := range rssItem.AllMediaPeerLinks() {
if _, found := duplicates[mediaPeerLink.URL]; !found {
duplicates[mediaPeerLink.URL] = true
enclosures = append(enclosures, &model.Enclosure{
URL: mediaPeerLink.URL,
MimeType: mediaPeerLink.MimeType(),
Size: mediaPeerLink.Size(),
})
mediaURL := strings.TrimSpace(mediaPeerLink.URL)
if mediaURL == "" {
continue
}
if _, found := duplicates[mediaURL]; !found {
mediaURL := strings.TrimSpace(mediaPeerLink.URL)
if mediaAbsoluteURL, err := urllib.AbsoluteURL(siteURL, mediaURL); err != nil {
slog.Debug("Unable to build absolute URL for media peer link",
slog.String("url", mediaPeerLink.URL),
slog.String("site_url", siteURL),
slog.Any("error", err),
)
} else {
duplicates[mediaAbsoluteURL] = true
enclosures = append(enclosures, &model.Enclosure{
URL: mediaAbsoluteURL,
MimeType: mediaPeerLink.MimeType(),
Size: mediaPeerLink.Size(),
})
}
}
}


@ -109,6 +109,100 @@ func TestParseRss2Sample(t *testing.T) {
}
}
func TestParseFeedWithFeedURLWithTrailingSpace(t *testing.T) {
data := `<?xml version="1.0"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
<title>Example</title>
<link>https://example.org/</link>
<atom:link href="https://example.org/rss " type="application/rss+xml" rel="self"></atom:link>
<item>
<title>Test</title>
<link>https://example.org/item</link>
</item>
</channel>
</rss>`
feed, err := Parse("https://example.org/ ", bytes.NewReader([]byte(data)))
if err != nil {
t.Fatal(err)
}
if feed.FeedURL != "https://example.org/rss" {
t.Errorf("Incorrect feed URL, got: %s", feed.FeedURL)
}
}
func TestParseFeedWithRelativeFeedURL(t *testing.T) {
data := `<?xml version="1.0"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
<title>Example</title>
<link>https://example.org/</link>
<atom:link href="/rss" type="application/rss+xml" rel="self"></atom:link>
<item>
<title>Test</title>
<link>https://example.org/item</link>
</item>
</channel>
</rss>`
feed, err := Parse("https://example.org/", bytes.NewReader([]byte(data)))
if err != nil {
t.Fatal(err)
}
if feed.FeedURL != "https://example.org/rss" {
t.Errorf("Incorrect feed URL, got: %s", feed.FeedURL)
}
}
func TestParseFeedSiteURLWithTrailingSpace(t *testing.T) {
data := `<?xml version="1.0"?>
<rss version="2.0">
<channel>
<title>Example</title>
<link>https://example.org/ </link>
<item>
<title>Test</title>
<link>https://example.org/item</link>
</item>
</channel>
</rss>`
feed, err := Parse("https://example.org/", bytes.NewReader([]byte(data)))
if err != nil {
t.Fatal(err)
}
if feed.SiteURL != "https://example.org/" {
t.Errorf("Incorrect site URL, got: %s", feed.SiteURL)
}
}
func TestParseFeedWithRelativeSiteURL(t *testing.T) {
data := `<?xml version="1.0"?>
<rss version="2.0">
<channel>
<title>Example</title>
<link>/example </link>
<item>
<title>Test</title>
<link>https://example.org/item</link>
</item>
</channel>
</rss>`
feed, err := Parse("https://example.org/", bytes.NewReader([]byte(data)))
if err != nil {
t.Fatal(err)
}
if feed.SiteURL != "https://example.org/example" {
t.Errorf("Incorrect site URL, got: %s", feed.SiteURL)
}
}
func TestParseFeedWithoutTitle(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0">
@ -922,15 +1016,11 @@ func TestParseEntryWithEnclosures(t *testing.T) {
}
if len(feed.Entries) != 1 {
t.Errorf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if feed.Entries[0].URL != "http://www.example.org/entries/1" {
t.Errorf("Incorrect entry URL, got: %s", feed.Entries[0].URL)
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if len(feed.Entries[0].Enclosures) != 1 {
t.Errorf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
if feed.Entries[0].Enclosures[0].URL != "http://www.example.org/myaudiofile.mp3" {
@ -971,15 +1061,11 @@ func TestParseEntryWithIncorrectEnclosureLength(t *testing.T) {
}
if len(feed.Entries) != 1 {
t.Errorf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if feed.Entries[0].URL != "http://www.example.org/entries/1" {
t.Errorf("Incorrect entry URL, got: %s", feed.Entries[0].URL)
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if len(feed.Entries[0].Enclosures) != 2 {
t.Errorf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
if feed.Entries[0].Enclosures[0].URL != "http://www.example.org/myaudiofile.mp3" {
@ -999,6 +1085,39 @@ func TestParseEntryWithIncorrectEnclosureLength(t *testing.T) {
}
}
func TestParseEntryWithDuplicatedEnclosureURL(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0">
<channel>
<title>My Podcast Feed</title>
<link>http://example.org</link>
<item>
<title>Podcasting with RSS</title>
<link>http://www.example.org/entries/1</link>
<enclosure url="http://www.example.org/myaudiofile.mp3" type="audio/mpeg" />
<enclosure url=" http://www.example.org/myaudiofile.mp3 " type="audio/mpeg" />
</item>
</channel>
</rss>`
feed, err := Parse("https://example.org/", bytes.NewReader([]byte(data)))
if err != nil {
t.Fatal(err)
}
if len(feed.Entries) != 1 {
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if len(feed.Entries[0].Enclosures) != 1 {
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
if feed.Entries[0].Enclosures[0].URL != "http://www.example.org/myaudiofile.mp3" {
t.Errorf("Incorrect enclosure URL, got: %s", feed.Entries[0].Enclosures[0].URL)
}
}
func TestParseEntryWithEmptyEnclosureURL(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0">
@ -1012,7 +1131,7 @@ func TestParseEntryWithEmptyEnclosureURL(t *testing.T) {
<description>An overview of RSS podcasting</description>
<pubDate>Fri, 15 Jul 2005 00:00:00 -0500</pubDate>
<guid isPermaLink="true">http://www.example.org/entries/1</guid>
<enclosure url="" length="0"/>
<enclosure url=" " length="0"/>
</item>
</channel>
</rss>`
@ -1023,15 +1142,47 @@ func TestParseEntryWithEmptyEnclosureURL(t *testing.T) {
}
if len(feed.Entries) != 1 {
t.Errorf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if feed.Entries[0].URL != "http://www.example.org/entries/1" {
t.Errorf("Incorrect entry URL, got: %s", feed.Entries[0].URL)
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if len(feed.Entries[0].Enclosures) != 0 {
t.Errorf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
}
func TestParseEntryWithRelativeEnclosureURL(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0">
<channel>
<title>My Podcast Feed</title>
<link>http://example.org</link>
<author>some.email@example.org</author>
<item>
<title>Podcasting with RSS</title>
<link>http://www.example.org/entries/1</link>
<description>An overview of RSS podcasting</description>
<pubDate>Fri, 15 Jul 2005 00:00:00 -0500</pubDate>
<guid isPermaLink="true">http://www.example.org/entries/1</guid>
<enclosure url=" /files/file.mp3 "/>
</item>
</channel>
</rss>`
feed, err := Parse("https://example.org/", bytes.NewReader([]byte(data)))
if err != nil {
t.Fatal(err)
}
if len(feed.Entries) != 1 {
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if len(feed.Entries[0].Enclosures) != 1 {
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
if feed.Entries[0].Enclosures[0].URL != "http://example.org/files/file.mp3" {
t.Errorf("Incorrect enclosure URL, got: %q", feed.Entries[0].Enclosures[0].URL)
}
}
@ -1060,15 +1211,11 @@ func TestParseEntryWithFeedBurnerEnclosures(t *testing.T) {
}
if len(feed.Entries) != 1 {
t.Errorf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if feed.Entries[0].URL != "http://www.example.org/entries/1" {
t.Errorf("Incorrect entry URL, got: %s", feed.Entries[0].URL)
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if len(feed.Entries[0].Enclosures) != 1 {
t.Errorf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
if feed.Entries[0].Enclosures[0].URL != "http://example.org/67ca416c-f22a-4228-a681-68fc9998ec10/File.mp3" {
@ -1084,6 +1231,42 @@ func TestParseEntryWithFeedBurnerEnclosures(t *testing.T) {
}
}
func TestParseEntryWithFeedBurnerEnclosuresAndRelativeURL(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:feedburner="http://rssnamespace.org/feedburner/ext/1.0">
<channel>
<title>My Example Feed</title>
<link>http://example.org</link>
<item>
<title>Example Item</title>
<link>http://www.example.org/entries/1</link>
<enclosure
url="http://feedproxy.google.com/~r/example/~5/lpMyFSCvubs/File.mp3"
length="76192460"
type="audio/mpeg" />
<feedburner:origEnclosureLink>/67ca416c-f22a-4228-a681-68fc9998ec10/File.mp3</feedburner:origEnclosureLink>
</item>
</channel>
</rss>`
feed, err := Parse("https://example.org/", bytes.NewReader([]byte(data)))
if err != nil {
t.Fatal(err)
}
if len(feed.Entries) != 1 {
t.Errorf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if len(feed.Entries[0].Enclosures) != 1 {
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
if feed.Entries[0].Enclosures[0].URL != "http://example.org/67ca416c-f22a-4228-a681-68fc9998ec10/File.mp3" {
t.Errorf("Incorrect enclosure URL, got: %s", feed.Entries[0].Enclosures[0].URL)
}
}
func TestParseEntryWithRelativeURL(t *testing.T) {
data := `<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0">
@ -1295,7 +1478,7 @@ func TestParseEntryWithMediaGroup(t *testing.T) {
<rss version="2.0" xmlns:media="http://search.yahoo.com/mrss/">
<channel>
<title>My Example Feed</title>
<link>http://example.org</link>
<link>https://example.org</link>
<item>
<title>Example Item</title>
<link>http://www.example.org/entries/1</link>
@ -1306,7 +1489,9 @@ func TestParseEntryWithMediaGroup(t *testing.T) {
<media:content type="application/x-bittorrent" url="https://example.org/file2.torrent" isDefault="true"></media:content>
<media:content type="application/x-bittorrent" url="https://example.org/file3.torrent"></media:content>
<media:content type="application/x-bittorrent" url="https://example.org/file4.torrent"></media:content>
<media:content type="application/x-bittorrent" url="https://example.org/file5.torrent" fileSize="42"></media:content>
<media:content type="application/x-bittorrent" url="https://example.org/file4.torrent"></media:content>
<media:content type="application/x-bittorrent" url=" file5.torrent " fileSize="42"></media:content>
<media:content type="application/x-bittorrent" url=" " fileSize="42"></media:content>
<media:rating>nonadult</media:rating>
</media:group>
<media:thumbnail url="https://example.org/image.jpg" height="122" width="223"></media:thumbnail>
@ -1359,15 +1544,19 @@ func TestParseEntryWithMediaContent(t *testing.T) {
<rss version="2.0" xmlns:media="http://search.yahoo.com/mrss/">
<channel>
<title>My Example Feed</title>
<link>http://example.org</link>
<link>https://example.org</link>
<item>
<title>Example Item</title>
<link>http://www.example.org/entries/1</link>
<media:thumbnail url="https://example.org/thumbnail.jpg" />
<media:thumbnail url="https://example.org/thumbnail.jpg" />
<media:thumbnail url=" thumbnail.jpg " />
<media:thumbnail url=" " />
<media:content url="https://example.org/media1.jpg" medium="image">
<media:title type="html">Some Title for Media 1</media:title>
</media:content>
<media:content url="https://example.org/media2.jpg" medium="image" />
<media:content url=" /media2.jpg " medium="image" />
<media:content url=" " medium="image" />
</item>
</channel>
</rss>`
@ -1378,9 +1567,9 @@ func TestParseEntryWithMediaContent(t *testing.T) {
}
if len(feed.Entries) != 1 {
t.Errorf("Incorrect number of entries, got: %d", len(feed.Entries))
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if len(feed.Entries[0].Enclosures) != 3 {
if len(feed.Entries[0].Enclosures) != 4 {
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
@ -1389,6 +1578,7 @@ func TestParseEntryWithMediaContent(t *testing.T) {
mimeType string
size int64
}{
{"https://example.org/thumbnail.jpg", "image/*", 0},
{"https://example.org/thumbnail.jpg", "image/*", 0},
{"https://example.org/media1.jpg", "image/*", 0},
{"https://example.org/media2.jpg", "image/*", 0},
@ -1414,11 +1604,14 @@ func TestParseEntryWithMediaPeerLink(t *testing.T) {
<rss version="2.0" xmlns:media="http://search.yahoo.com/mrss/">
<channel>
<title>My Example Feed</title>
<link>http://example.org</link>
<link>https://website.example.org</link>
<item>
<title>Example Item</title>
<link>http://www.example.org/entries/1</link>
<media:peerLink type="application/x-bittorrent" href="http://www.example.org/file.torrent" />
<media:peerLink type="application/x-bittorrent" href="https://www.example.org/file.torrent" />
<media:peerLink type="application/x-bittorrent" href="https://www.example.org/file.torrent" />
<media:peerLink type="application/x-bittorrent" href=" file2.torrent " />
<media:peerLink type="application/x-bittorrent" href=" " />
</item>
</channel>
</rss>`
@ -1429,10 +1622,10 @@ func TestParseEntryWithMediaPeerLink(t *testing.T) {
}
if len(feed.Entries) != 1 {
t.Errorf("Incorrect number of entries, got: %d", len(feed.Entries))
t.Fatalf("Incorrect number of entries, got: %d", len(feed.Entries))
}
if len(feed.Entries[0].Enclosures) != 1 {
if len(feed.Entries[0].Enclosures) != 2 {
t.Fatalf("Incorrect number of enclosures, got: %d", len(feed.Entries[0].Enclosures))
}
@ -1441,7 +1634,8 @@ func TestParseEntryWithMediaPeerLink(t *testing.T) {
mimeType string
size int64
}{
{"http://www.example.org/file.torrent", "application/x-bittorrent", 0},
{"https://www.example.org/file.torrent", "application/x-bittorrent", 0},
{"https://website.example.org/file2.torrent", "application/x-bittorrent", 0},
}
for index, enclosure := range feed.Entries[0].Enclosures {
@ -1696,11 +1890,11 @@ func TestParseEntryWithCategories(t *testing.T) {
t.Fatal(err)
}
if len(feed.Entries[0].Tags) != 3 {
t.Errorf("Incorrect number of tags, got: %d", len(feed.Entries[0].Tags))
if len(feed.Entries[0].Tags) != 2 {
t.Fatalf("Incorrect number of tags, got: %d", len(feed.Entries[0].Tags))
}
expected := []string{"Category 1", "Category 2", "Category 3"}
expected := []string{"Category 1", "Category 2"}
result := feed.Entries[0].Tags
for i, tag := range result {

View File

@ -10,12 +10,12 @@ import (
"strings"
"miniflux.app/v2/internal/config"
"miniflux.app/v2/internal/reader/encoding"
"miniflux.app/v2/internal/reader/fetcher"
"miniflux.app/v2/internal/reader/readability"
"miniflux.app/v2/internal/urllib"
"github.com/PuerkitoBio/goquery"
"golang.org/x/net/html/charset"
)
func ScrapeWebsite(requestBuilder *fetcher.RequestBuilder, websiteURL, rules string) (string, error) {
@ -42,9 +42,9 @@ func ScrapeWebsite(requestBuilder *fetcher.RequestBuilder, websiteURL, rules str
var content string
var err error
htmlDocumentReader, err := encoding.CharsetReaderFromContentType(
responseHandler.ContentType(),
htmlDocumentReader, err := charset.NewReader(
responseHandler.Body(config.Opts.HTTPClientMaxBodySize()),
responseHandler.ContentType(),
)
if err != nil {
return "", fmt.Errorf("scraper: unable to read HTML document: %v", err)

View File

@ -14,17 +14,18 @@ import (
"miniflux.app/v2/internal/integration/rssbridge"
"miniflux.app/v2/internal/locale"
"miniflux.app/v2/internal/model"
"miniflux.app/v2/internal/reader/encoding"
"miniflux.app/v2/internal/reader/fetcher"
"miniflux.app/v2/internal/reader/parser"
"miniflux.app/v2/internal/urllib"
"github.com/PuerkitoBio/goquery"
"golang.org/x/net/html/charset"
)
var (
youtubeChannelRegex = regexp.MustCompile(`youtube\.com/channel/(.*)$`)
youtubeVideoRegex = regexp.MustCompile(`youtube\.com/watch\?v=(.*)$`)
youtubeChannelRegex = regexp.MustCompile(`youtube\.com/channel/(.*)$`)
youtubeVideoRegex = regexp.MustCompile(`youtube\.com/watch\?v=(.*)$`)
youtubePlaylistRegex = regexp.MustCompile(`youtube\.com/playlist\?list=(.*)$`)
)
type SubscriptionFinder struct {
@ -98,7 +99,19 @@ func (f *SubscriptionFinder) FindSubscriptions(websiteURL, rssBridgeURL string)
return subscriptions, nil
}
// Step 4) Parse web page to find feeds from HTML meta tags.
// Step 4) Check if the website URL is a YouTube playlist.
slog.Debug("Try to detect feeds from YouTube playlist page", slog.String("website_url", websiteURL))
subscriptions, localizedError = f.FindSubscriptionsFromYouTubePlaylistPage(websiteURL)
if localizedError != nil {
return nil, localizedError
}
if len(subscriptions) > 0 {
slog.Debug("Subscriptions found from YouTube playlist page", slog.String("website_url", websiteURL), slog.Any("subscriptions", subscriptions))
return subscriptions, nil
}
// Step 5) Parse web page to find feeds from HTML meta tags.
slog.Debug("Try to detect feeds from HTML meta tags",
slog.String("website_url", websiteURL),
slog.String("content_type", responseHandler.ContentType()),
@ -113,7 +126,7 @@ func (f *SubscriptionFinder) FindSubscriptions(websiteURL, rssBridgeURL string)
return subscriptions, nil
}
// Step 5) Check if the website URL can use RSS-Bridge.
// Step 6) Check if the website URL can use RSS-Bridge.
if rssBridgeURL != "" {
slog.Debug("Try to detect feeds with RSS-Bridge", slog.String("website_url", websiteURL))
subscriptions, localizedError := f.FindSubscriptionsFromRSSBridge(websiteURL, rssBridgeURL)
@ -127,7 +140,7 @@ func (f *SubscriptionFinder) FindSubscriptions(websiteURL, rssBridgeURL string)
}
}
// Step 6) Check if the website has a known feed URL.
// Step 7) Check if the website has a known feed URL.
slog.Debug("Try to detect feeds from well-known URLs", slog.String("website_url", websiteURL))
subscriptions, localizedError = f.FindSubscriptionsFromWellKnownURLs(websiteURL)
if localizedError != nil {
@ -150,7 +163,7 @@ func (f *SubscriptionFinder) FindSubscriptionsFromWebPage(websiteURL, contentTyp
"link[type='application/feed+json']": parser.FormatJSON,
}
htmlDocumentReader, err := encoding.CharsetReaderFromContentType(contentType, body)
htmlDocumentReader, err := charset.NewReader(body, contentType)
if err != nil {
return nil, locale.NewLocalizedErrorWrapper(err, "error.unable_to_parse_html_document", err)
}
@ -322,3 +335,16 @@ func (f *SubscriptionFinder) FindSubscriptionsFromYouTubeVideoPage(websiteURL st
return nil, nil
}
func (f *SubscriptionFinder) FindSubscriptionsFromYouTubePlaylistPage(websiteURL string) (Subscriptions, *locale.LocalizedErrorWrapper) {
matches := youtubePlaylistRegex.FindStringSubmatch(websiteURL)
if len(matches) == 2 {
feedURL := fmt.Sprintf(`https://www.youtube.com/feeds/videos.xml?playlist_id=%s`, matches[1])
return Subscriptions{NewSubscription(websiteURL, feedURL, parser.FormatAtom)}, nil
}
slog.Debug("This website is not a YouTube playlist page, the regex doesn't match", slog.String("website_url", websiteURL))
return nil, nil
}

View File

@ -8,6 +8,28 @@ import (
"testing"
)
func TestFindYoutubePlaylistFeed(t *testing.T) {
scenarios := map[string]string{
"https://www.youtube.com/playlist?list=PLOOwEPgFWm_NHcQd9aCi5JXWASHO_n5uR": "https://www.youtube.com/feeds/videos.xml?playlist_id=PLOOwEPgFWm_NHcQd9aCi5JXWASHO_n5uR",
"https://www.youtube.com/playlist?list=PLOOwEPgFWm_N42HlCLhqyJ0ZBWr5K1QDM": "https://www.youtube.com/feeds/videos.xml?playlist_id=PLOOwEPgFWm_N42HlCLhqyJ0ZBWr5K1QDM",
}
for websiteURL, expectedFeedURL := range scenarios {
subscriptions, localizedError := NewSubscriptionFinder(nil).FindSubscriptionsFromYouTubePlaylistPage(websiteURL)
if localizedError != nil {
t.Fatalf(`Parsing a correctly formatted YouTube playlist page should not return any error: %v`, localizedError)
}
if len(subscriptions) != 1 {
t.Fatal(`Incorrect number of subscriptions returned`)
}
if subscriptions[0].URL != expectedFeedURL {
t.Errorf(`Unexpected Feed, got %s, instead of %s`, subscriptions[0].URL, expectedFeedURL)
}
}
}
func TestFindYoutubeChannelFeed(t *testing.T) {
scenarios := map[string]string{
"https://www.youtube.com/channel/UC-Qj80avWItNRjkZ41rzHyw": "https://www.youtube.com/feeds/videos.xml?channel_id=UC-Qj80avWItNRjkZ41rzHyw",

View File

@ -60,18 +60,16 @@ func (b *BatchBuilder) WithoutDisabledFeeds() *BatchBuilder {
}
func (b *BatchBuilder) FetchJobs() (jobs model.JobList, err error) {
var parts []string
parts = append(parts, `SELECT id, user_id FROM feeds`)
query := `SELECT id, user_id FROM feeds`
if len(b.conditions) > 0 {
parts = append(parts, fmt.Sprintf("WHERE %s", strings.Join(b.conditions, " AND ")))
query += fmt.Sprintf(" WHERE %s", strings.Join(b.conditions, " AND "))
}
if b.limit > 0 {
parts = append(parts, fmt.Sprintf("ORDER BY next_check_at ASC LIMIT %d", b.limit))
query += fmt.Sprintf(" ORDER BY next_check_at ASC LIMIT %d", b.limit)
}
query := strings.Join(parts, " ")
rows, err := b.db.Query(query, b.args...)
if err != nil {
return nil, fmt.Errorf(`store: unable to fetch batch of jobs: %v`, err)

View File

@ -8,6 +8,8 @@ import (
"errors"
"fmt"
"log/slog"
"slices"
"strings"
"time"
"miniflux.app/v2/internal/crypto"
@ -138,7 +140,7 @@ func (s *Storage) createEntry(tx *sql.Tx, entry *model.Entry) error {
entry.UserID,
entry.FeedID,
entry.ReadingTime,
pq.Array(removeDuplicates(entry.Tags)),
pq.Array(removeEmpty(removeDuplicates(entry.Tags))),
).Scan(
&entry.ID,
&entry.Status,
@ -194,7 +196,7 @@ func (s *Storage) updateEntry(tx *sql.Tx, entry *model.Entry) error {
entry.UserID,
entry.FeedID,
entry.Hash,
pq.Array(removeDuplicates(entry.Tags)),
pq.Array(removeEmpty(removeDuplicates(entry.Tags))),
).Scan(&entry.ID)
if err != nil {
@ -223,24 +225,27 @@ func (s *Storage) entryExists(tx *sql.Tx, entry *model.Entry) (bool, error) {
return result, nil
}
// GetReadTime fetches the read time of an entry based on its hash, and the feed id and user id from the feed.
// It's intended to be used on entries objects created by parsing a feed as they don't contain much information.
// The feed param helps to scope the search to a specific user and feed in order to avoid hash clashes.
func (s *Storage) GetReadTime(entry *model.Entry, feed *model.Feed) int {
func (s *Storage) IsNewEntry(feedID int64, entryHash string) bool {
var result bool
s.db.QueryRow(`SELECT true FROM entries WHERE feed_id=$1 AND hash=$2`, feedID, entryHash).Scan(&result)
return !result
}
func (s *Storage) GetReadTime(feedID int64, entryHash string) int {
var result int
// Note: This query uses entries_feed_id_hash_key index
s.db.QueryRow(
`SELECT
reading_time
FROM
entries
WHERE
user_id=$1 AND
feed_id=$2 AND
hash=$3
feed_id=$1 AND
hash=$2
`,
feed.UserID,
feed.ID,
entry.Hash,
feedID,
entryHash,
).Scan(&result)
return result
}
@ -573,14 +578,6 @@ func (s *Storage) MarkCategoryAsRead(userID, categoryID int64, before time.Time)
return nil
}
// EntryURLExists returns true if an entry with this URL already exists.
func (s *Storage) EntryURLExists(feedID int64, entryURL string) bool {
var result bool
query := `SELECT true FROM entries WHERE feed_id=$1 AND url=$2`
s.db.QueryRow(query, feedID, entryURL).Scan(&result)
return result
}
// EntryShareCode returns the share code of the provided entry.
// It generates a new one if not already defined.
func (s *Storage) EntryShareCode(userID int64, entryID int64) (shareCode string, err error) {
@ -615,15 +612,17 @@ func (s *Storage) UnshareEntry(userID int64, entryID int64) (err error) {
return
}
// removeDuplicate removes duplicate entries from a slice
func removeDuplicates[T string | int](sliceList []T) []T {
allKeys := make(map[T]bool)
list := []T{}
for _, item := range sliceList {
if _, value := allKeys[item]; !value {
allKeys[item] = true
list = append(list, item)
func removeDuplicates(l []string) []string {
slices.Sort(l)
return slices.Compact(l)
}
func removeEmpty(l []string) []string {
var finalSlice []string
for _, item := range l {
if strings.TrimSpace(item) != "" {
finalSlice = append(finalSlice, item)
}
}
return list
return finalSlice
}

View File

@ -58,6 +58,15 @@ func (e *EntryPaginationBuilder) WithStatus(status string) {
}
}
func (e *EntryPaginationBuilder) WithTags(tags []string) {
if len(tags) > 0 {
for _, tag := range tags {
e.conditions = append(e.conditions, fmt.Sprintf("LOWER($%d) = ANY(LOWER(e.tags::text)::text[])", len(e.args)+1))
e.args = append(e.args, tag)
}
}
}
// WithGloballyVisible adds global visibility to the condition.
func (e *EntryPaginationBuilder) WithGloballyVisible() {
e.conditions = append(e.conditions, "not c.hide_globally")

View File

@ -160,7 +160,7 @@ func (e *EntryQueryBuilder) WithStatuses(statuses []string) *EntryQueryBuilder {
func (e *EntryQueryBuilder) WithTags(tags []string) *EntryQueryBuilder {
if len(tags) > 0 {
for _, cat := range tags {
e.conditions = append(e.conditions, fmt.Sprintf("$%d = ANY(e.tags)", len(e.args)+1))
e.conditions = append(e.conditions, fmt.Sprintf("LOWER($%d) = ANY(LOWER(e.tags::text)::text[])", len(e.args)+1))
e.args = append(e.args, cat)
}
}
@ -439,21 +439,21 @@ func (e *EntryQueryBuilder) buildCondition() string {
}
func (e *EntryQueryBuilder) buildSorting() string {
var parts []string
var parts string
if len(e.sortExpressions) > 0 {
parts = append(parts, fmt.Sprintf(`ORDER BY %s`, strings.Join(e.sortExpressions, ", ")))
parts += fmt.Sprintf(" ORDER BY %s", strings.Join(e.sortExpressions, ", "))
}
if e.limit > 0 {
parts = append(parts, fmt.Sprintf(`LIMIT %d`, e.limit))
parts += fmt.Sprintf(" LIMIT %d", e.limit)
}
if e.offset > 0 {
parts = append(parts, fmt.Sprintf(`OFFSET %d`, e.offset))
parts += fmt.Sprintf(" OFFSET %d", e.offset)
}
return strings.Join(parts, " ")
return parts
}
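The refactor replaces the `[]string` slice plus `strings.Join` with direct string concatenation, where each clause carries its own leading space so the result can be appended to the base query verbatim. A minimal sketch of the new shape (the function below is a standalone copy, not the builder method itself):

```go
package main

import (
	"fmt"
	"strings"
)

// buildSorting concatenates ORDER BY / LIMIT / OFFSET clauses, each
// prefixed with a space, matching the rewritten builder logic.
func buildSorting(sortExpressions []string, limit, offset int) string {
	var parts string

	if len(sortExpressions) > 0 {
		parts += fmt.Sprintf(" ORDER BY %s", strings.Join(sortExpressions, ", "))
	}
	if limit > 0 {
		parts += fmt.Sprintf(" LIMIT %d", limit)
	}
	if offset > 0 {
		parts += fmt.Sprintf(" OFFSET %d", offset)
	}

	return parts
}

func main() {
	fmt.Printf("%q\n", buildSorting([]string{"published_at DESC", "id DESC"}, 50, 100))
	// " ORDER BY published_at DESC, id DESC LIMIT 50 OFFSET 100"
}
```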
// NewEntryQueryBuilder returns a new EntryQueryBuilder.

View File

@ -91,25 +91,25 @@ func (f *FeedQueryBuilder) buildCounterCondition() string {
}
func (f *FeedQueryBuilder) buildSorting() string {
var parts []string
var parts string
if len(f.sortExpressions) > 0 {
parts = append(parts, fmt.Sprintf(`ORDER BY %s`, strings.Join(f.sortExpressions, ", ")))
parts += fmt.Sprintf(" ORDER BY %s", strings.Join(f.sortExpressions, ", "))
}
if len(parts) > 0 {
parts = append(parts, ", lower(f.title) ASC")
parts += ", lower(f.title) ASC"
}
if f.limit > 0 {
parts = append(parts, fmt.Sprintf(`LIMIT %d`, f.limit))
parts += fmt.Sprintf(" LIMIT %d", f.limit)
}
if f.offset > 0 {
parts = append(parts, fmt.Sprintf(`OFFSET %d`, f.offset))
parts += fmt.Sprintf(" OFFSET %d", f.offset)
}
return strings.Join(parts, " ")
return parts
}
// GetFeed returns a single feed that match the condition.

View File

@ -8,6 +8,7 @@ import (
"html/template"
"math"
"net/mail"
"net/url"
"slices"
"strings"
"time"
@ -16,8 +17,8 @@ import (
"miniflux.app/v2/internal/crypto"
"miniflux.app/v2/internal/http/route"
"miniflux.app/v2/internal/locale"
"miniflux.app/v2/internal/mediaproxy"
"miniflux.app/v2/internal/model"
"miniflux.app/v2/internal/proxy"
"miniflux.app/v2/internal/timezone"
"miniflux.app/v2/internal/urllib"
@ -57,19 +58,19 @@ func (f *funcMap) Map() template.FuncMap {
return template.HTML(str)
},
"proxyFilter": func(data string) string {
return proxy.ProxyRewriter(f.router, data)
return mediaproxy.RewriteDocumentWithRelativeProxyURL(f.router, data)
},
"proxyURL": func(link string) string {
proxyOption := config.Opts.ProxyOption()
mediaProxyMode := config.Opts.MediaProxyMode()
if proxyOption == "all" || (proxyOption != "none" && !urllib.IsHTTPS(link)) {
return proxy.ProxifyURL(f.router, link)
if mediaProxyMode == "all" || (mediaProxyMode != "none" && !urllib.IsHTTPS(link)) {
return mediaproxy.ProxifyRelativeURL(f.router, link)
}
return link
},
"mustBeProxyfied": func(mediaType string) bool {
return slices.Contains(config.Opts.ProxyMediaTypes(), mediaType)
return slices.Contains(config.Opts.MediaProxyResourceTypes(), mediaType)
},
"domain": urllib.Domain,
"hasPrefix": strings.HasPrefix,
@ -91,8 +92,9 @@ func (f *funcMap) Map() template.FuncMap {
"nonce": func() string {
return crypto.GenerateRandomStringHex(16)
},
"deRef": func(i *int) int { return *i },
"duration": duration,
"deRef": func(i *int) int { return *i },
"duration": duration,
"urlEncode": url.PathEscape,
// These functions are overridden at runtime after the parsing.
"elapsed": func(timezone string, t time.Time) string {

View File

@ -36,10 +36,15 @@
{{ if and .user .user.Stylesheet }}
{{ $stylesheetNonce := nonce }}
<meta http-equiv="Content-Security-Policy" content="default-src 'self'; img-src * data:; media-src *; frame-src *; style-src 'self' 'nonce-{{ $stylesheetNonce }}'">
<style nonce="{{ $stylesheetNonce }}">{{ .user.Stylesheet | safeCSS }}</style>
{{ $containsNonce := contains .contentSecurityPolicy "nonce-%s" }}
{{ if $containsNonce }}
{{ noescape ( printf "<meta http-equiv=\"Content-Security-Policy\" content=\"%s\">" (printf .contentSecurityPolicy $stylesheetNonce ) ) }}
{{ else }}
<meta http-equiv="Content-Security-Policy" content="default-src 'self'; img-src * data:; media-src *; frame-src *">
{{ noescape ( printf "<meta http-equiv=\"Content-Security-Policy\" content=\"%s\">" .contentSecurityPolicy ) }}
{{ end }}
<style {{ if $containsNonce }}nonce="{{ $stylesheetNonce }}"{{end}}>{{ .user.Stylesheet | safeCSS }}</style>
{{ else }}
<meta http-equiv="Content-Security-Policy" content="default-src 'self'; img-src * data:; media-src *; frame-src *; require-trusted-types-for 'script'; trusted-types ttpolicy;">
{{ end }}
<script src="{{ route "javascript" "name" "app" "checksum" .app_js_checksum }}" defer></script>
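The template branch above substitutes the per-request stylesheet nonce into the configured `CONTENT_SECURITY_POLICY` only when the policy string contains a `nonce-%s` placeholder. A sketch of that logic in Go (the helper name `renderCSP` and the sample policy are illustrative, not part of the diff):

```go
package main

import (
	"fmt"
	"strings"
)

// renderCSP mirrors the template logic: if the configured policy carries
// a "nonce-%s" placeholder, interpolate the request's nonce; otherwise
// emit the policy unchanged.
func renderCSP(policy, nonce string) string {
	if strings.Contains(policy, "nonce-%s") {
		return fmt.Sprintf(policy, nonce)
	}
	return policy
}

func main() {
	policy := "default-src 'self'; style-src 'self' 'nonce-%s'"
	fmt.Println(renderCSP(policy, "abc123"))
	// default-src 'self'; style-src 'self' 'nonce-abc123'
}
```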
@ -58,7 +63,6 @@
data-webauthn-delete-all-url="{{ route "webauthnDeleteAll" }}"
{{ end }}
{{ if .user }}{{ if not .user.KeyboardShortcuts }}data-disable-keyboard-shortcuts="true"{{ end }}{{ end }}>
{{ if .user }}
<a class="skip-to-content-link" href="#main">{{ t "skip_to_content" }}</a>
<header class="header">
@ -154,6 +158,8 @@

<li>{{ t "page.keyboard_shortcuts.go_to_previous_item" }} = <strong>p</strong>, <strong>k</strong>, <strong>&#x23F4;</strong></li>
<li>{{ t "page.keyboard_shortcuts.go_to_next_item" }} = <strong>n</strong>, <strong>j</strong>, <strong>&#x23F5;</strong></li>
<li>{{ t "page.keyboard_shortcuts.go_to_feed" }} = <strong>F</strong></li>
<li>{{ t "page.keyboard_shortcuts.go_to_top_item" }} = <strong>g + g</strong></li>
<li>{{ t "page.keyboard_shortcuts.go_to_bottom_item" }} = <strong>G</strong></li>
</ul>
<p>{{ t "page.keyboard_shortcuts.subtitle.pages" }}</p>

View File

@ -135,7 +135,7 @@
{{ if .entry.Tags }}
<div class="entry-tags">
{{ t "entry.tags.label" }}
{{range $i, $e := .entry.Tags}}{{if $i}}, {{end}}<strong>{{ $e }}</strong>{{end}}
{{range $i, $e := .entry.Tags}}{{if $i}}, {{end}}<a href="{{ route "tagEntriesAll" "tagName" (urlEncode $e) }}"><strong>{{ $e }}</strong></a>{{end}}
</div>
{{ end }}
<div class="entry-date">

View File

@ -0,0 +1,52 @@
{{ define "title"}}{{ .tagName }} ({{ .total }}){{ end }}
{{ define "page_header"}}
<section class="page-header" aria-labelledby="page-header-title page-header-title-count">
<h1 id="page-header-title" dir="auto">
{{ .tagName }}
<span aria-hidden="true"> ({{ .total }})</span>
</h1>
<span id="page-header-title-count" class="sr-only">{{ plural "page.tag_entry_count" .total .total }}</span>
</section>
{{ end }}
{{ define "content"}}
{{ if not .entries }}
<p role="alert" class="alert alert-info">{{ t "alert.no_tag_entry" }}</p>
{{ else }}
<div class="pagination-top">
{{ template "pagination" .pagination }}
</div>
<div class="items">
{{ range .entries }}
<article
class="item entry-item {{ if $.user.EntrySwipe }}entry-swipe{{ end }} item-status-{{ .Status }}"
data-id="{{ .ID }}"
aria-labelledby="entry-title-{{ .ID }}"
tabindex="-1"
>
<header class="item-header" dir="auto">
<h2 id="entry-title-{{ .ID }}" class="item-title">
<a href="{{ route "tagEntry" "entryID" .ID "tagName" (urlEncode $.tagName) }}">
{{ if ne .Feed.Icon.IconID 0 }}
<img src="{{ route "icon" "iconID" .Feed.Icon.IconID }}" width="16" height="16" loading="lazy" alt="">
{{ end }}
{{ .Title }}
</a>
</h2>
<span class="category">
<a href="{{ route "categoryEntries" "categoryID" .Feed.Category.ID }}">
{{ .Feed.Category.Title }}
</a>
</span>
</header>
{{ template "item_meta" dict "user" $.user "entry" . "hasSaveEntry" $.hasSaveEntry }}
</article>
{{ end }}
</div>
<div class="pagination-bottom">
{{ template "pagination" .pagination }}
</div>
{{ end }}
{{ end }}

View File

@ -6,6 +6,9 @@ package timezone // import "miniflux.app/v2/internal/timezone"
import (
"testing"
"time"
// Make sure these tests pass when the timezone database is not installed on the host system.
_ "time/tzdata"
)
func TestNow(t *testing.T) {

View File

@ -8,6 +8,7 @@ import (
"net/http"
"time"
"miniflux.app/v2/internal/config"
"miniflux.app/v2/internal/http/request"
"miniflux.app/v2/internal/http/response/html"
"miniflux.app/v2/internal/http/route"
@ -32,8 +33,9 @@ func (h *handler) refreshCategory(w http.ResponseWriter, r *http.Request) int64
sess := session.New(h.store, request.SessionID(r))
// Avoid accidental and excessive refreshes.
if time.Now().UTC().Unix()-request.LastForceRefresh(r) < 1800 {
sess.NewFlashErrorMessage(printer.Print("alert.too_many_feeds_refresh"))
if time.Now().UTC().Unix()-request.LastForceRefresh(r) < int64(config.Opts.ForceRefreshInterval())*60 {
time := config.Opts.ForceRefreshInterval()
sess.NewFlashErrorMessage(printer.Plural("alert.too_many_feeds_refresh", time, time))
} else {
// We allow the end-user to force refresh all its feeds in this category
// without taking into consideration the number of errors.

View File

@ -9,8 +9,8 @@ import (
"miniflux.app/v2/internal/http/request"
"miniflux.app/v2/internal/http/response/json"
"miniflux.app/v2/internal/locale"
"miniflux.app/v2/internal/mediaproxy"
"miniflux.app/v2/internal/model"
"miniflux.app/v2/internal/proxy"
"miniflux.app/v2/internal/reader/processor"
"miniflux.app/v2/internal/storage"
)
@ -65,5 +65,5 @@ func (h *handler) fetchContent(w http.ResponseWriter, r *http.Request) {
readingTime := locale.NewPrinter(user.Language).Plural("entry.estimated_reading_time", entry.ReadingTime, entry.ReadingTime)
json.OK(w, r, map[string]string{"content": proxy.ProxyRewriter(h.router, entry.Content), "reading_time": readingTime})
json.OK(w, r, map[string]string{"content": mediaproxy.RewriteDocumentWithRelativeProxyURL(h.router, entry.Content), "reading_time": readingTime})
}

internal/ui/entry_tag.go Normal file
View File

@ -0,0 +1,90 @@
// SPDX-FileCopyrightText: Copyright The Miniflux Authors. All rights reserved.
// SPDX-License-Identifier: Apache-2.0
package ui // import "miniflux.app/v2/internal/ui"
import (
"net/http"
"net/url"
"miniflux.app/v2/internal/http/request"
"miniflux.app/v2/internal/http/response/html"
"miniflux.app/v2/internal/http/route"
"miniflux.app/v2/internal/model"
"miniflux.app/v2/internal/storage"
"miniflux.app/v2/internal/ui/session"
"miniflux.app/v2/internal/ui/view"
)
func (h *handler) showTagEntryPage(w http.ResponseWriter, r *http.Request) {
user, err := h.store.UserByID(request.UserID(r))
if err != nil {
html.ServerError(w, r, err)
return
}
tagName, err := url.PathUnescape(request.RouteStringParam(r, "tagName"))
if err != nil {
html.ServerError(w, r, err)
return
}
entryID := request.RouteInt64Param(r, "entryID")
builder := h.store.NewEntryQueryBuilder(user.ID)
builder.WithTags([]string{tagName})
builder.WithEntryID(entryID)
builder.WithoutStatus(model.EntryStatusRemoved)
entry, err := builder.GetEntry()
if err != nil {
html.ServerError(w, r, err)
return
}
if entry == nil {
html.NotFound(w, r)
return
}
if user.MarkReadOnView && entry.Status == model.EntryStatusUnread {
err = h.store.SetEntriesStatus(user.ID, []int64{entry.ID}, model.EntryStatusRead)
if err != nil {
html.ServerError(w, r, err)
return
}
entry.Status = model.EntryStatusRead
}
entryPaginationBuilder := storage.NewEntryPaginationBuilder(h.store, user.ID, entry.ID, user.EntryOrder, user.EntryDirection)
entryPaginationBuilder.WithTags([]string{tagName})
prevEntry, nextEntry, err := entryPaginationBuilder.Entries()
if err != nil {
html.ServerError(w, r, err)
return
}
nextEntryRoute := ""
if nextEntry != nil {
nextEntryRoute = route.Path(h.router, "tagEntry", "tagName", url.PathEscape(tagName), "entryID", nextEntry.ID)
}
prevEntryRoute := ""
if prevEntry != nil {
prevEntryRoute = route.Path(h.router, "tagEntry", "tagName", url.PathEscape(tagName), "entryID", prevEntry.ID)
}
sess := session.New(h.store, request.SessionID(r))
view := view.New(h.tpl, r, sess)
view.Set("entry", entry)
view.Set("prevEntry", prevEntry)
view.Set("nextEntry", nextEntry)
view.Set("nextEntryRoute", nextEntryRoute)
view.Set("prevEntryRoute", prevEntryRoute)
view.Set("user", user)
view.Set("countUnread", h.store.CountUnreadEntries(user.ID))
view.Set("countErrorFeeds", h.store.CountUserFeedsWithErrors(user.ID))
view.Set("hasSaveEntry", h.store.HasSaveEntry(user.ID))
html.OK(w, r, view.Render("entry"))
}

View File

@ -29,7 +29,9 @@ func (h *handler) showIcon(w http.ResponseWriter, r *http.Request) {
b.WithHeader("Content-Security-Policy", `default-src 'self'`)
b.WithHeader("Content-Type", icon.MimeType)
b.WithBody(icon.Content)
b.WithoutCompression()
if icon.MimeType != "image/svg+xml" {
b.WithoutCompression()
}
b.Write()
})
}

View File

@ -46,7 +46,7 @@ func (h *handler) mediaProxy(w http.ResponseWriter, r *http.Request) {
return
}
mac := hmac.New(sha256.New, config.Opts.ProxyPrivateKey())
mac := hmac.New(sha256.New, config.Opts.MediaProxyPrivateKey())
mac.Write(decodedURL)
expectedMAC := mac.Sum(nil)
@ -99,9 +99,9 @@ func (h *handler) mediaProxy(w http.ResponseWriter, r *http.Request) {
clt := &http.Client{
Transport: &http.Transport{
IdleConnTimeout: time.Duration(config.Opts.ProxyHTTPClientTimeout()) * time.Second,
IdleConnTimeout: time.Duration(config.Opts.MediaProxyHTTPClientTimeout()) * time.Second,
},
Timeout: time.Duration(config.Opts.ProxyHTTPClientTimeout()) * time.Second,
Timeout: time.Duration(config.Opts.MediaProxyHTTPClientTimeout()) * time.Second,
}
resp, err := clt.Do(req)
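The renames above (`ProxyPrivateKey` to `MediaProxyPrivateKey`, and likewise for the timeout option) do not change the signing scheme: a proxified media URL carries an HMAC-SHA256 digest that the handler recomputes and compares before fetching the remote resource. A minimal sketch of that scheme (the key value and helper names here are illustrative, not Miniflux's actual code; the real key is randomly generated at startup):

```go
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/base64"
	"fmt"
)

// sign computes the HMAC-SHA256 digest of url under key, mirroring how
// proxified media URLs are authenticated.
func sign(key, url []byte) []byte {
	mac := hmac.New(sha256.New, key)
	mac.Write(url)
	return mac.Sum(nil)
}

// verify recomputes the digest and compares it in constant time,
// as the handler does before proxying the request.
func verify(key, url, digest []byte) bool {
	return hmac.Equal(sign(key, url), digest)
}

func main() {
	key := []byte("startup-secret") // hypothetical; generated randomly in practice
	url := []byte("https://example.org/image.png")
	digest := sign(key, url)
	fmt.Println(base64.RawURLEncoding.EncodeToString(digest))
	fmt.Println(verify(key, url, digest)) // true
}
```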

View File

@ -86,7 +86,8 @@ function onClickMainMenuListItem(event) {
if (element.tagName === "A") {
window.location.href = element.getAttribute("href");
} else {
window.location.href = element.querySelector("a").getAttribute("href");
const linkElement = element.querySelector("a") || element.closest("a");
window.location.href = linkElement.getAttribute("href");
}
}
@ -352,7 +353,7 @@ function handleFetchOriginalContent() {
response.json().then((data) => {
if (data.hasOwnProperty("content") && data.hasOwnProperty("reading_time")) {
document.querySelector(".entry-content").innerHTML = data.content;
document.querySelector(".entry-content").innerHTML = ttpolicy.createHTML(data.content);
const entryReadingtimeElement = document.querySelector(".entry-reading-time");
if (entryReadingtimeElement) {
entryReadingtimeElement.textContent = data.reading_time;
@ -444,17 +445,31 @@ function goToPage(page, fallbackSelf) {
}
}
function goToPrevious() {
/**
 * @param {(number|KeyboardEvent)} offset - How many items to jump for focus.
 */
function goToPrevious(offset) {
if (offset instanceof KeyboardEvent) {
offset = -1;
}
if (isListView()) {
goToListItem(-1);
goToListItem(offset);
} else {
goToPage("previous");
}
}
function goToNext() {
/**
 * @param {(number|KeyboardEvent)} offset - How many items to jump for focus.
 */
function goToNext(offset) {
if (offset instanceof KeyboardEvent) {
offset = 1;
}
if (isListView()) {
goToListItem(1);
goToListItem(offset);
} else {
goToPage("next");
}
@ -482,6 +497,10 @@ function goToFeed() {
}
}
// Sentinel values for specific list navigation
const TOP = 9999;
const BOTTOM = -9999;
/**
* @param {number} offset How many items to jump for focus.
*/
@ -501,8 +520,15 @@ function goToListItem(offset) {
if (items[i].classList.contains("current-item")) {
items[i].classList.remove("current-item");
const index = (i + offset + items.length) % items.length;
const item = items[index];
// By default adjust selection by offset
let itemOffset = (i + offset + items.length) % items.length;
// Allow jumping to top or bottom
if (offset == TOP) {
itemOffset = 0;
} else if (offset == BOTTOM) {
itemOffset = items.length - 1;
}
const item = items[itemOffset];
item.classList.add("current-item");
DomHelper.scrollPageTo(item);
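The `TOP`/`BOTTOM` sentinels bypass the wrap-around arithmetic so the `g g` and `G` shortcuts can jump straight to the first or last item. The index selection can be sketched in Go (for illustration only; the real implementation is the JavaScript above):

```go
package main

import "fmt"

// Sentinel values mirroring TOP and BOTTOM in the JavaScript above.
const (
	Top    = 9999
	Bottom = -9999
)

// nextIndex mirrors goToListItem's selection math: wrap around by offset,
// unless a sentinel jumps straight to the first or last item.
func nextIndex(current, offset, length int) int {
	switch offset {
	case Top:
		return 0
	case Bottom:
		return length - 1
	default:
		return ((current+offset)%length + length) % length
	}
}

func main() {
	fmt.Println(nextIndex(0, -1, 5))     // wraps to the last item: 4
	fmt.Println(nextIndex(2, Top, 5))    // first item
	fmt.Println(nextIndex(2, Bottom, 5)) // last item
}
```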

View File

@ -2,13 +2,15 @@ document.addEventListener("DOMContentLoaded", () => {
handleSubmitButtons();
if (!document.querySelector("body[data-disable-keyboard-shortcuts=true]")) {
let keyboardHandler = new KeyboardHandler();
const keyboardHandler = new KeyboardHandler();
keyboardHandler.on("g u", () => goToPage("unread"));
keyboardHandler.on("g b", () => goToPage("starred"));
keyboardHandler.on("g h", () => goToPage("history"));
keyboardHandler.on("g f", goToFeedOrFeeds);
keyboardHandler.on("g c", () => goToPage("categories"));
keyboardHandler.on("g s", () => goToPage("settings"));
keyboardHandler.on("g g", () => goToPrevious(TOP));
keyboardHandler.on("G", () => goToNext(BOTTOM));
keyboardHandler.on("ArrowLeft", goToPrevious);
keyboardHandler.on("ArrowRight", goToNext);
keyboardHandler.on("k", goToPrevious);
@ -46,7 +48,7 @@ document.addEventListener("DOMContentLoaded", () => {
keyboardHandler.listen();
}
let touchHandler = new TouchHandler();
const touchHandler = new TouchHandler();
touchHandler.listen();
if (WebAuthnHandler.isWebAuthnSupported()) {
@ -54,7 +56,7 @@ document.addEventListener("DOMContentLoaded", () => {
onClick("#webauthn-delete", () => { webauthnHandler.removeAllCredentials(); });
let registerButton = document.getElementById("webauthn-register");
const registerButton = document.getElementById("webauthn-register");
if (registerButton != null) {
registerButton.disabled = false;
@ -63,13 +65,13 @@ document.addEventListener("DOMContentLoaded", () => {
});
}
let loginButton = document.getElementById("webauthn-login");
const loginButton = document.getElementById("webauthn-login");
if (loginButton != null) {
const abortController = new AbortController();
loginButton.disabled = false;
onClick("#webauthn-login", () => {
let usernameField = document.getElementById("form-username");
const usernameField = document.getElementById("form-username");
if (usernameField != null) {
abortController.abort();
webauthnHandler.login(usernameField.value).catch(err => WebAuthnHandler.showErrorMessage(err));
@ -87,7 +89,7 @@ document.addEventListener("DOMContentLoaded", () => {
onClick(":is(a, button)[data-action=markPageAsRead]", (event) => handleConfirmationMessage(event.target, markPageAsRead));
onClick(":is(a, button)[data-toggle-status]", (event) => handleEntryStatus("next", event.target));
onClick(":is(a, button)[data-confirm]", (event) => handleConfirmationMessage(event.target, (url, redirectURL) => {
let request = new RequestBuilder(url);
const request = new RequestBuilder(url);
request.withCallback((response) => {
if (redirectURL) {
@ -125,9 +127,9 @@ document.addEventListener("DOMContentLoaded", () => {
onClick(".header nav li", (event) => onClickMainMenuListItem(event));
if ("serviceWorker" in navigator) {
let scriptElement = document.getElementById("service-worker-script");
const scriptElement = document.getElementById("service-worker-script");
if (scriptElement) {
navigator.serviceWorker.register(scriptElement.src);
navigator.serviceWorker.register(ttpolicy.createScriptURL(scriptElement.src));
}
}

View File

@ -4,17 +4,17 @@ class DomHelper {
}
static openNewTab(url) {
let win = window.open("");
const win = window.open("");
win.opener = null;
win.location = url;
win.focus();
}
static scrollPageTo(element, evenIfOnScreen) {
let windowScrollPosition = window.pageYOffset;
let windowHeight = document.documentElement.clientHeight;
let viewportPosition = windowScrollPosition + windowHeight;
let itemBottomPosition = element.offsetTop + element.offsetHeight;
const windowScrollPosition = window.pageYOffset;
const windowHeight = document.documentElement.clientHeight;
const viewportPosition = windowScrollPosition + windowHeight;
const itemBottomPosition = element.offsetTop + element.offsetHeight;
if (evenIfOnScreen || viewportPosition - itemBottomPosition < 0 || viewportPosition - element.offsetTop > windowHeight) {
window.scrollTo(0, element.offsetTop - 10);

View File

@ -12,7 +12,7 @@ class KeyboardHandler {
listen() {
document.onkeydown = (event) => {
let key = this.getKey(event);
const key = this.getKey(event);
if (this.isEventIgnored(event, key) || this.isModifierKeyDown(event)) {
return;
}
@ -23,8 +23,8 @@ class KeyboardHandler {
this.queue.push(key);
for (let combination in this.shortcuts) {
let keys = combination.split(" ");
for (const combination in this.shortcuts) {
const keys = combination.split(" ");
if (keys.every((value, index) => value === this.queue[index])) {
this.queue = [];
@ -64,7 +64,7 @@ class KeyboardHandler {
'Right': 'ArrowRight'
};
for (let key in mapping) {
for (const key in mapping) {
if (mapping.hasOwnProperty(key) && key === event.key) {
return mapping[key];
}

View File

@ -8,7 +8,7 @@ class ModalHandler {
}
static getFocusableElements() {
let container = this.getModalContainer();
const container = this.getModalContainer();
if (container === null) {
return null;
@ -18,14 +18,14 @@ class ModalHandler {
}
static setupFocusTrap() {
let focusableElements = this.getFocusableElements();
const focusableElements = this.getFocusableElements();
if (focusableElements === null) {
return;
}
let firstFocusableElement = focusableElements[0];
let lastFocusableElement = focusableElements[focusableElements.length - 1];
const firstFocusableElement = focusableElements[0];
const lastFocusableElement = focusableElements[focusableElements.length - 1];
this.getModalContainer().onkeydown = (e) => {
if (e.key !== 'Tab') {
@ -57,13 +57,13 @@ class ModalHandler {
this.activeElement = document.activeElement;
let container = document.createElement("div");
const container = document.createElement("div");
container.id = "modal-container";
container.setAttribute("role", "dialog");
container.appendChild(document.importNode(fragment, true));
document.body.appendChild(container);
let closeButton = document.querySelector("button.btn-close-modal");
const closeButton = document.querySelector("button.btn-close-modal");
if (closeButton !== null) {
closeButton.onclick = (event) => {
event.preventDefault();
@ -89,7 +89,7 @@ class ModalHandler {
}
static close() {
let container = this.getModalContainer();
const container = this.getModalContainer();
if (container !== null) {
container.parentNode.removeChild(container);
}

View File

@ -15,8 +15,8 @@ class TouchHandler {
calculateDistance() {
if (this.touch.start.x >= -1 && this.touch.move.x >= -1) {
let horizontalDistance = Math.abs(this.touch.move.x - this.touch.start.x);
let verticalDistance = Math.abs(this.touch.move.y - this.touch.start.y);
const horizontalDistance = Math.abs(this.touch.move.x - this.touch.start.x);
const verticalDistance = Math.abs(this.touch.move.y - this.touch.start.y);
if (horizontalDistance > 30 && verticalDistance < 70 || this.touch.moved) {
return this.touch.move.x - this.touch.start.x;
@ -54,8 +54,8 @@ class TouchHandler {
this.touch.move.x = event.touches[0].clientX;
this.touch.move.y = event.touches[0].clientY;
let distance = this.calculateDistance();
let absDistance = Math.abs(distance);
const distance = this.calculateDistance();
const absDistance = Math.abs(distance);
if (absDistance > 0) {
this.touch.moved = true;
@ -78,7 +78,7 @@ class TouchHandler {
}
if (this.touch.element !== null) {
let absDistance = Math.abs(this.calculateDistance());
const absDistance = Math.abs(this.calculateDistance());
if (absDistance > 75) {
toggleEntryStatus(this.touch.element);
@ -118,9 +118,9 @@ class TouchHandler {
return;
}
let distance = this.calculateDistance();
let absDistance = Math.abs(distance);
let now = Date.now();
const distance = this.calculateDistance();
const absDistance = Math.abs(distance);
const now = Date.now();
if (now - this.touch.time <= 1000 && absDistance > 75) {
if (distance > 0) {
@ -138,10 +138,10 @@ class TouchHandler {
return;
}
let now = Date.now();
const now = Date.now();
if (this.touch.start.x !== -1 && now - this.touch.time <= 200) {
let innerWidthHalf = window.innerWidth / 2;
const innerWidthHalf = window.innerWidth / 2;
if (this.touch.start.x >= innerWidthHalf && event.changedTouches[0].clientX >= innerWidthHalf) {
goToPage("next");
@ -158,19 +158,16 @@ class TouchHandler {
}
listen() {
let hasPassiveOption = DomHelper.hasPassiveEventListenerOption();
const hasPassiveOption = DomHelper.hasPassiveEventListenerOption();
let elements = document.querySelectorAll(".entry-swipe");
elements.forEach((element) => {
document.querySelectorAll(".entry-swipe").forEach((element) => {
element.addEventListener("touchstart", (e) => this.onItemTouchStart(e), hasPassiveOption ? { passive: true } : false);
element.addEventListener("touchmove", (e) => this.onItemTouchMove(e), hasPassiveOption ? { passive: false } : false);
element.addEventListener("touchend", (e) => this.onItemTouchEnd(e), hasPassiveOption ? { passive: true } : false);
element.addEventListener("touchcancel", () => this.reset(), hasPassiveOption ? { passive: true } : false);
});
let element = document.querySelector(".entry-content");
const element = document.querySelector(".entry-content");
if (element) {
if (element.classList.contains("gesture-nav-tap")) {
element.addEventListener("touchend", (e) => this.onTapEnd(e), hasPassiveOption ? { passive: true } : false);

View File

@ -0,0 +1,15 @@
let ttpolicy;
if (window.trustedTypes && trustedTypes.createPolicy) {
//TODO: use an allow-list for `createScriptURL`
if (!ttpolicy) {
ttpolicy = trustedTypes.createPolicy('ttpolicy', {
createScriptURL: src => src,
createHTML: html => html,
});
}
} else {
ttpolicy = {
createScriptURL: src => src,
createHTML: html => html,
};
}

View File

@ -5,7 +5,7 @@ class WebAuthnHandler {
static showErrorMessage(errorMessage) {
console.log("webauthn error: " + errorMessage);
let alertElement = document.getElementById("webauthn-error");
const alertElement = document.getElementById("webauthn-error");
if (alertElement) {
alertElement.textContent += " (" + errorMessage + ")";
alertElement.classList.remove("hidden");
@ -79,14 +79,14 @@ class WebAuthnHandler {
return;
}
let credentialCreationOptions = await registerBeginResponse.json();
const credentialCreationOptions = await registerBeginResponse.json();
credentialCreationOptions.publicKey.challenge = this.decodeBuffer(credentialCreationOptions.publicKey.challenge);
credentialCreationOptions.publicKey.user.id = this.decodeBuffer(credentialCreationOptions.publicKey.user.id);
if (Object.hasOwn(credentialCreationOptions.publicKey, 'excludeCredentials')) {
credentialCreationOptions.publicKey.excludeCredentials.forEach((credential) => credential.id = this.decodeBuffer(credential.id));
}
let attestation = await navigator.credentials.create(credentialCreationOptions);
const attestation = await navigator.credentials.create(credentialCreationOptions);
let registrationFinishResponse;
try {
@ -108,7 +108,7 @@ class WebAuthnHandler {
throw new Error("Login failed with HTTP status code " + response.status);
}
let jsonData = await registrationFinishResponse.json();
const jsonData = await registrationFinishResponse.json();
window.location.href = jsonData.redirect;
}
@ -121,7 +121,7 @@ class WebAuthnHandler {
return;
}
let credentialRequestOptions = await loginBeginResponse.json();
const credentialRequestOptions = await loginBeginResponse.json();
credentialRequestOptions.publicKey.challenge = this.decodeBuffer(credentialRequestOptions.publicKey.challenge);
if (Object.hasOwn(credentialRequestOptions.publicKey, 'allowCredentials')) {

View File

@ -113,6 +113,7 @@ func GenerateStylesheetsBundles() error {
func GenerateJavascriptBundles() error {
var bundles = map[string][]string{
"app": {
"js/tt.js", // has to be first
"js/dom_helper.js",
"js/touch_handler.js",
"js/keyboard_handler.js",

View File

@ -31,12 +31,12 @@ func (h *handler) showAppIcon(w http.ResponseWriter, r *http.Request) {
switch filepath.Ext(filename) {
case ".png":
b.WithoutCompression()
b.WithHeader("Content-Type", "image/png")
case ".svg":
b.WithHeader("Content-Type", "image/svg+xml")
}
b.WithoutCompression()
b.WithBody(blob)
b.Write()
})

View File

@ -86,24 +86,26 @@ func (h *handler) submitSubscription(w http.ResponseWriter, r *http.Request) {
html.OK(w, r, v.Render("add_subscription"))
case n == 1 && subscriptionFinder.IsFeedAlreadyDownloaded():
feed, localizedError := feedHandler.CreateFeedFromSubscriptionDiscovery(h.store, user.ID, &model.FeedCreationRequestFromSubscriptionDiscovery{
Content: subscriptionFinder.FeedResponseInfo().Content,
ETag: subscriptionFinder.FeedResponseInfo().ETag,
LastModified: subscriptionFinder.FeedResponseInfo().LastModified,
CategoryID: subscriptionForm.CategoryID,
FeedURL: subscriptions[0].URL,
Crawler: subscriptionForm.Crawler,
AllowSelfSignedCertificates: subscriptionForm.AllowSelfSignedCertificates,
UserAgent: subscriptionForm.UserAgent,
Cookie: subscriptionForm.Cookie,
Username: subscriptionForm.Username,
Password: subscriptionForm.Password,
ScraperRules: subscriptionForm.ScraperRules,
RewriteRules: subscriptionForm.RewriteRules,
BlocklistRules: subscriptionForm.BlocklistRules,
KeeplistRules: subscriptionForm.KeeplistRules,
UrlRewriteRules: subscriptionForm.UrlRewriteRules,
FetchViaProxy: subscriptionForm.FetchViaProxy,
DisableHTTP2: subscriptionForm.DisableHTTP2,
Content: subscriptionFinder.FeedResponseInfo().Content,
ETag: subscriptionFinder.FeedResponseInfo().ETag,
LastModified: subscriptionFinder.FeedResponseInfo().LastModified,
FeedCreationRequest: model.FeedCreationRequest{
CategoryID: subscriptionForm.CategoryID,
FeedURL: subscriptions[0].URL,
AllowSelfSignedCertificates: subscriptionForm.AllowSelfSignedCertificates,
Crawler: subscriptionForm.Crawler,
UserAgent: subscriptionForm.UserAgent,
Cookie: subscriptionForm.Cookie,
Username: subscriptionForm.Username,
Password: subscriptionForm.Password,
ScraperRules: subscriptionForm.ScraperRules,
RewriteRules: subscriptionForm.RewriteRules,
BlocklistRules: subscriptionForm.BlocklistRules,
KeeplistRules: subscriptionForm.KeeplistRules,
UrlRewriteRules: subscriptionForm.UrlRewriteRules,
FetchViaProxy: subscriptionForm.FetchViaProxy,
DisableHTTP2: subscriptionForm.DisableHTTP2,
},
})
if localizedError != nil {
v.Set("form", subscriptionForm)

View File

@ -0,0 +1,65 @@
// SPDX-FileCopyrightText: Copyright The Miniflux Authors. All rights reserved.
// SPDX-License-Identifier: Apache-2.0
package ui // import "miniflux.app/v2/internal/ui"
import (
"net/http"
"net/url"
"miniflux.app/v2/internal/http/request"
"miniflux.app/v2/internal/http/response/html"
"miniflux.app/v2/internal/http/route"
"miniflux.app/v2/internal/model"
"miniflux.app/v2/internal/ui/session"
"miniflux.app/v2/internal/ui/view"
)
func (h *handler) showTagEntriesAllPage(w http.ResponseWriter, r *http.Request) {
user, err := h.store.UserByID(request.UserID(r))
if err != nil {
html.ServerError(w, r, err)
return
}
tagName, err := url.PathUnescape(request.RouteStringParam(r, "tagName"))
if err != nil {
html.ServerError(w, r, err)
return
}
offset := request.QueryIntParam(r, "offset", 0)
builder := h.store.NewEntryQueryBuilder(user.ID)
builder.WithoutStatus(model.EntryStatusRemoved)
builder.WithTags([]string{tagName})
builder.WithSorting("status", "asc")
builder.WithSorting(user.EntryOrder, user.EntryDirection)
builder.WithOffset(offset)
builder.WithLimit(user.EntriesPerPage)
entries, err := builder.GetEntries()
if err != nil {
html.ServerError(w, r, err)
return
}
count, err := builder.CountEntries()
if err != nil {
html.ServerError(w, r, err)
return
}
sess := session.New(h.store, request.SessionID(r))
view := view.New(h.tpl, r, sess)
view.Set("tagName", tagName)
view.Set("total", count)
view.Set("entries", entries)
view.Set("pagination", getPagination(route.Path(h.router, "tagEntriesAll", "tagName", url.PathEscape(tagName)), count, offset, user.EntriesPerPage))
view.Set("user", user)
view.Set("countUnread", h.store.CountUnreadEntries(user.ID))
view.Set("countErrorFeeds", h.store.CountUserFeedsWithErrors(user.ID))
view.Set("hasSaveEntry", h.store.HasSaveEntry(user.ID))
view.Set("showOnlyUnreadEntries", false)
html.OK(w, r, view.Render("tag_entries"))
}
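`getPagination` derives the previous/next page links from the total count, the current offset, and the per-user page size. A sketch of that offset arithmetic (`paginate` here is a hypothetical helper for illustration, not Miniflux's actual `getPagination`):

```go
package main

import "fmt"

// pagination captures whether prev/next links exist and their offsets.
type pagination struct {
	Total, Offset, Limit   int
	ShowNext, ShowPrev     bool
	NextOffset, PrevOffset int
}

// paginate computes prev/next offsets from total count, current offset,
// and page size, clamping the previous offset at zero.
func paginate(total, offset, limit int) pagination {
	p := pagination{Total: total, Offset: offset, Limit: limit}
	if offset+limit < total {
		p.ShowNext = true
		p.NextOffset = offset + limit
	}
	if offset > 0 {
		p.ShowPrev = true
		p.PrevOffset = offset - limit
		if p.PrevOffset < 0 {
			p.PrevOffset = 0
		}
	}
	return p
}

func main() {
	// On the last page of 120 entries at 50 per page, only "previous" shows.
	p := paginate(120, 100, 50)
	fmt.Println(p.ShowNext, p.ShowPrev, p.PrevOffset) // false true 50
}
```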

View File

@ -93,6 +93,10 @@ func Serve(router *mux.Router, store *storage.Storage, pool *worker.Pool) {
uiRouter.HandleFunc("/category/{categoryID}/remove", handler.removeCategory).Name("removeCategory").Methods(http.MethodPost)
uiRouter.HandleFunc("/category/{categoryID}/mark-all-as-read", handler.markCategoryAsRead).Name("markCategoryAsRead").Methods(http.MethodPost)
// Tag pages.
uiRouter.HandleFunc("/tags/{tagName}/entries/all", handler.showTagEntriesAllPage).Name("tagEntriesAll").Methods(http.MethodGet)
uiRouter.HandleFunc("/tags/{tagName}/entry/{entryID}", handler.showTagEntryPage).Name("tagEntry").Methods(http.MethodGet)
// Entry pages.
uiRouter.HandleFunc("/entry/status", handler.updateEntriesStatus).Name("updateEntriesStatus").Methods(http.MethodPost)
uiRouter.HandleFunc("/entry/save/{entryID}", handler.saveEntry).Name("saveEntry").Methods(http.MethodPost)

View File

@ -46,5 +46,6 @@ func New(tpl *template.Engine, r *http.Request, sess *session.Session) *View {
"sw_js_checksum": static.JavascriptBundleChecksums["service-worker"],
"webauthn_js_checksum": static.JavascriptBundleChecksums["webauthn"],
"webAuthnEnabled": config.Opts.WebAuthn(),
"contentSecurityPolicy": config.Opts.ContentSecurityPolicy(),
}}
}

View File

@ -1,5 +1,5 @@
.\" Manpage for miniflux.
.TH "MINIFLUX" "1" "November 5, 2023" "\ \&" "\ \&"
.TH "MINIFLUX" "1" "March 23, 2024" "\ \&" "\ \&"
.SH NAME
miniflux \- Minimalist and opinionated feed reader
@ -31,7 +31,7 @@ Load configuration file\&.
.PP
.B \-create-admin
.RS 4
Create admin user\&.
Create an admin user from an interactive terminal\&.
.RE
.PP
.B \-debug
@ -120,6 +120,130 @@ Environment variables override the values defined in the config file.
.SH ENVIRONMENT
.TP
.B ADMIN_PASSWORD
Admin user password, used only if $CREATE_ADMIN is enabled\&.
.br
Default is empty\&.
.TP
.B ADMIN_PASSWORD_FILE
Path to a secret key exposed as a file; it should contain the $ADMIN_PASSWORD value\&.
.br
Default is empty\&.
.TP
.B ADMIN_USERNAME
Admin user login, used only if $CREATE_ADMIN is enabled\&.
.br
Default is empty\&.
.TP
.B ADMIN_USERNAME_FILE
Path to a secret key exposed as a file; it should contain the $ADMIN_USERNAME value\&.
.br
Default is empty\&.
.TP
.B AUTH_PROXY_HEADER
Proxy authentication HTTP header\&.
.br
Default is empty\&.
.TP
.B AUTH_PROXY_USER_CREATION
Set to 1 to create users based on proxy authentication information\&.
.br
Disabled by default\&.
.TP
.B BASE_URL
Base URL to generate HTML links and base path for cookies\&.
.br
Default is http://localhost/\&.
.TP
.B BATCH_SIZE
Number of feeds to send to the queue for each interval\&.
.br
Default is 100 feeds\&.
.TP
.B CERT_DOMAIN
Use Let's Encrypt to automatically get a certificate for this domain\&.
.br
Default is empty\&.
.TP
.B CERT_FILE
Path to SSL certificate\&.
.br
Default is empty\&.
.TP
.B CLEANUP_ARCHIVE_BATCH_SIZE
Number of entries to archive for each job interval\&.
.br
Default is 10000 entries\&.
.TP
.B CLEANUP_ARCHIVE_READ_DAYS
Number of days after which read entries are marked as removed\&.
.br
Set to -1 to keep all read entries\&.
.br
Default is 60 days\&.
.TP
.B CLEANUP_ARCHIVE_UNREAD_DAYS
Number of days after which unread entries are marked as removed\&.
.br
Set to -1 to keep all unread entries\&.
.br
Default is 180 days\&.
.TP
.B CLEANUP_FREQUENCY_HOURS
Frequency of the cleanup job that removes old sessions and archives entries\&.
.br
Default is 24 hours\&.
.TP
.B CLEANUP_REMOVE_SESSIONS_DAYS
Number of days after which old sessions are removed from the database\&.
.br
Default is 30 days\&.
.TP
.B CREATE_ADMIN
Set to 1 to create an admin user from environment variables\&.
.br
Disabled by default\&.
.TP
.B DATABASE_CONNECTION_LIFETIME
Set the maximum amount of time a connection may be reused\&.
.br
Default is 5 minutes\&.
.TP
.B DATABASE_MAX_CONNS
Maximum number of database connections\&.
.br
Default is 20\&.
.TP
.B DATABASE_MIN_CONNS
Minimum number of database connections\&.
.br
Default is 20\&.
.TP
.B DATABASE_URL
Postgresql connection parameters\&.
.br
Default is "user=postgres password=postgres dbname=miniflux2 sslmode=disable"\&.
.TP
.B DATABASE_URL_FILE
Path to a secret key exposed as a file; it should contain the $DATABASE_URL value\&.
.br
Default is empty\&.
.TP
.B DISABLE_HSTS
Disable HTTP Strict Transport Security header if \fBHTTPS\fR is set\&.
.br
Default is false (HSTS is enabled)\&.
.TP
.B DISABLE_HTTP_SERVICE
Set the value to 1 to disable the HTTP service\&.
.br
Default is false (The HTTP service is enabled)\&.
.TP
.B DISABLE_SCHEDULER_SERVICE
Set the value to 1 to disable the internal scheduler service\&.
.br
Default is false (The internal scheduler service is enabled)\&.
.TP
.B FETCH_ODYSEE_WATCH_TIME
Set the value to 1 to scrape video duration from Odysee website and
use it as a reading time\&.
@ -132,15 +256,64 @@ use it as a reading time\&.
.br
Disabled by default\&.
.TP
.B YOUTUBE_EMBED_URL_OVERRIDE
YouTube URL which will be used for embeds\&.
.B FILTER_ENTRY_MAX_AGE_DAYS
Maximum age, in days, of new entries to retain\&.
.br
Default is https://www.youtube-nocookie.com/embed/\&.
For example, set to 7 to fetch only entries that are at most 7 days old\&.
.br
Default is 0 (disabled)\&.
.TP
.B SERVER_TIMING_HEADER
Set the value to 1 to enable server-timing headers\&.
.B FORCE_REFRESH_INTERVAL
The minimum interval for manual refresh\&.
.br
Disabled by default\&.
Default is 30 minutes\&.
.TP
.B HTTP_CLIENT_MAX_BODY_SIZE
Maximum body size for HTTP requests in Mebibyte (MiB)\&.
.br
Default is 15 MiB\&.
.TP
.B HTTP_CLIENT_PROXY
Proxy URL for HTTP client\&.
.br
Default is empty\&.
.TP
.B HTTP_CLIENT_TIMEOUT
Time limit in seconds before the HTTP client cancels the request\&.
.br
Default is 20 seconds\&.
.TP
.B HTTP_CLIENT_USER_AGENT
The default User-Agent header to use for the HTTP client. Can be overridden in per-feed settings\&.
.br
When empty, Miniflux uses a default User-Agent that includes the Miniflux version\&.
.br
Default is empty\&.
.TP
.B HTTP_SERVER_TIMEOUT
Time limit in seconds before the HTTP server cancels the request\&.
.br
Default is 300 seconds\&.
.TP
.B HTTPS
Forces cookies to use secure flag and send HSTS header\&.
.br
Default is empty\&.
.TP
.B INVIDIOUS_INSTANCE
Set a custom Invidious instance to use\&.
.br
Default is yewtu.be\&.
.TP
.B KEY_FILE
Path to SSL private key\&.
.br
Default is empty\&.
.TP
.B LISTEN_ADDR
Address to listen on. Use absolute path to listen on Unix socket (/var/run/miniflux.sock)\&.
.br
Default is 127.0.0.1:8080\&.
.TP
.B LOG_DATE_TIME
Display the date and time in log messages\&.
@ -162,190 +335,50 @@ Supported values are "debug", "info", "warning", or "error"\&.
.br
Default is "info"\&.
.TP
.B WORKER_POOL_SIZE
Number of background workers\&.
.B MAINTENANCE_MESSAGE
Define a custom maintenance message\&.
.br
Default is 16 workers\&.
Default is "Miniflux is currently under maintenance"\&.
.TP
.B POLLING_FREQUENCY
Refresh interval in minutes for feeds\&.
.br
Default is 60 minutes\&.
.TP
.B FORCE_REFRESH_INTERVAL
The minimum interval for manual refresh\&.
.br
Default is 30 minutes\&.
.TP
.B BATCH_SIZE
Number of feeds to send to the queue for each interval\&.
.br
Default is 100 feeds\&.
.TP
.B POLLING_SCHEDULER
Scheduler used for polling feeds. Possible values are "round_robin" or "entry_frequency"\&.
.br
The maximum number of feeds polled for a given period is subject to POLLING_FREQUENCY and BATCH_SIZE\&.
.br
When "entry_frequency" is selected, the refresh interval for a given feed is equal to the average updating interval of the last week of the feed\&.
.br
The actual number of feeds polled will not exceed the maximum number of feeds that could be polled for a given period\&.
.br
Default is "round_robin"\&.
.TP
.B SCHEDULER_ENTRY_FREQUENCY_MAX_INTERVAL
Maximum interval in minutes for the entry frequency scheduler\&.
.br
Default is 24 hours\&.
.TP
.B SCHEDULER_ENTRY_FREQUENCY_MIN_INTERVAL
Minimum interval in minutes for the entry frequency scheduler\&.
.br
Default is 5 minutes\&.
.TP
.B SCHEDULER_ENTRY_FREQUENCY_FACTOR
Factor to increase refresh frequency for the entry frequency scheduler\&.
.br
Default is 1\&.
.TP
.B SCHEDULER_ROUND_ROBIN_MIN_INTERVAL
Minimum interval in minutes for the round robin scheduler\&.
.br
Default is 60 minutes\&.
.TP
.B POLLING_PARSING_ERROR_LIMIT
The maximum number of parsing errors allowed before the program stops polling a feed. Once the limit is reached, the user must refresh the feed manually. Set to 0 for unlimited\&.
.br
Default is 3\&.
.TP
.B DATABASE_URL
Postgresql connection parameters\&.
.br
Default is "user=postgres password=postgres dbname=miniflux2 sslmode=disable"\&.
.TP
.B DATABASE_URL_FILE
Path to a secret key exposed as a file; it should contain the $DATABASE_URL value\&.
.br
Default is empty\&.
.TP
.B DATABASE_CONNECTION_LIFETIME
Set the maximum amount of time a connection may be reused\&.
.br
Default is 5 minutes\&.
.TP
.B DATABASE_MAX_CONNS
Maximum number of database connections\&.
.br
Default is 20\&.
.TP
.B DATABASE_MIN_CONNS
Minimum number of database connections\&.
.br
Default is 20\&.
.TP
.B LISTEN_ADDR
Address to listen on. Use absolute path to listen on Unix socket (/var/run/miniflux.sock)\&.
.br
Default is 127.0.0.1:8080\&.
.TP
.B PORT
Override LISTEN_ADDR to 0.0.0.0:$PORT\&.
.br
Default is empty\&.
.TP
.B BASE_URL
Base URL to generate HTML links and base path for cookies\&.
.br
Default is http://localhost/\&.
.TP
.B CLEANUP_FREQUENCY_HOURS
Frequency of the cleanup job that removes old sessions and archives entries\&.
.br
Default is 24 hours\&.
.TP
.B CLEANUP_ARCHIVE_READ_DAYS
Number of days after which read entries are marked as removed\&.
.br
Set to -1 to keep all read entries\&.
.br
Default is 60 days\&.
.TP
.B CLEANUP_ARCHIVE_UNREAD_DAYS
Number of days after which unread entries are marked as removed\&.
.br
Set to -1 to keep all unread entries\&.
.br
Default is 180 days\&.
.TP
.B CLEANUP_ARCHIVE_BATCH_SIZE
Number of entries to archive for each job interval\&.
.br
Default is 10000 entries\&.
.TP
.B CLEANUP_REMOVE_SESSIONS_DAYS
Number of days after which old sessions are removed from the database\&.
.br
Default is 30 days\&.
.TP
.B HTTPS
Forces cookies to use secure flag and send HSTS header\&.
.br
Default is empty\&.
.TP
.B DISABLE_HSTS
Disable HTTP Strict Transport Security header if \fBHTTPS\fR is set\&.
.br
Default is false (HSTS is enabled)\&.
.TP
.B DISABLE_HTTP_SERVICE
Set the value to 1 to disable the HTTP service\&.
.br
Default is false (The HTTP service is enabled)\&.
.TP
.B DISABLE_SCHEDULER_SERVICE
Set the value to 1 to disable the internal scheduler service\&.
.br
Default is false (The internal scheduler service is enabled)\&.
.TP
.B CERT_FILE
Path to SSL certificate\&.
.br
Default is empty\&.
.TP
.B KEY_FILE
Path to SSL private key\&.
.br
Default is empty\&.
.TP
.B CERT_DOMAIN
Use Let's Encrypt to automatically get a certificate for this domain\&.
.br
Default is empty\&.
.TP
.B METRICS_COLLECTOR
Set to 1 to enable the metrics collector. Exposes a /metrics endpoint for Prometheus\&.
.B MAINTENANCE_MODE
Set to 1 to enable maintenance mode\&.
.br
Disabled by default\&.
.TP
.B METRICS_REFRESH_INTERVAL
Refresh interval to collect database metrics\&.
.B MEDIA_PROXY_CUSTOM_URL
Sets an external server to proxy media through\&.
.br
Default is 60 seconds\&.
Default is empty (Miniflux does the proxying itself)\&.
.TP
.B MEDIA_PROXY_HTTP_CLIENT_TIMEOUT
Time limit in seconds before the media proxy HTTP client cancels the request\&.
.br
Default is 120 seconds\&.
.TP
.B MEDIA_PROXY_RESOURCE_TYPES
A comma-separated list of media types to proxify. Supported values are: image, audio, video\&.
.br
Default is image\&.
.TP
.B MEDIA_PROXY_MODE
Possible values: http-only, all, or none\&.
.br
Default is http-only\&.
.TP
.B MEDIA_PROXY_PRIVATE_KEY
Set a custom private key used to sign proxified media URLs\&.
.br
By default, a secret key is randomly generated during startup\&.
.TP
.B METRICS_ALLOWED_NETWORKS
List of networks allowed to access the metrics endpoint (comma-separated values)\&.
.br
Default is 127.0.0.1/8\&.
.TP
.B METRICS_USERNAME
Metrics endpoint username for basic HTTP authentication\&.
.B METRICS_COLLECTOR
Set to 1 to enable the metrics collector. Exposes a /metrics endpoint for Prometheus\&.
.br
Default is empty\&.
.TP
.B METRICS_USERNAME_FILE
Path to a file that contains the username for the metrics endpoint HTTP authentication\&.
.br
Default is empty\&.
Disabled by default\&.
.TP
.B METRICS_PASSWORD
Metrics endpoint password for basic HTTP authentication\&.
@ -357,10 +390,20 @@ Path to a file that contains the password for the metrics endpoint HTTP authenti
.br
Default is empty\&.
.TP
.B OAUTH2_PROVIDER
Possible values are "google" or "oidc"\&.
.B METRICS_REFRESH_INTERVAL
Refresh interval to collect database metrics\&.
.br
Default is empty\&.
Default is 60 seconds\&.
.TP
.B METRICS_USERNAME
Metrics endpoint username for basic HTTP authentication\&.
.br
Default is empty\&.
.TP
.B METRICS_USERNAME_FILE
Path to a file that contains the username for the metrics endpoint HTTP authentication\&.
.br
Default is empty\&.
.TP
.B OAUTH2_CLIENT_ID
OAuth2 client ID\&.
@ -382,6 +425,16 @@ Path to a secret key exposed as a file, it should contain $OAUTH2_CLIENT_SECRET
.br
Default is empty\&.
.TP
.B OAUTH2_OIDC_DISCOVERY_ENDPOINT
OpenID Connect discovery endpoint\&.
.br
Default is empty\&.
.TP
.B OAUTH2_PROVIDER
Possible values are "google" or "oidc"\&.
.br
Default is empty\&.
.TP
.B OAUTH2_REDIRECT_URL
OAuth2 redirect URL\&.
.br
@ -389,46 +442,11 @@ This URL must be registered with the provider and is something like https://mini
.br
Default is empty\&.
.TP
.B OAUTH2_OIDC_DISCOVERY_ENDPOINT
OpenID Connect discovery endpoint\&.
.br
Default is empty\&.
.TP
.B OAUTH2_USER_CREATION
Set to 1 to authorize OAuth2 user creation\&.
.br
Disabled by default\&.
.TP
.B RUN_MIGRATIONS
Set to 1 to run database migrations\&.
.br
Disabled by default\&.
.TP
.B CREATE_ADMIN
Set to 1 to create an admin user from environment variables\&.
.br
Disabled by default\&.
.TP
.B ADMIN_USERNAME
Admin user login, used only if $CREATE_ADMIN is enabled\&.
.br
Default is empty\&.
.TP
.B ADMIN_USERNAME_FILE
Path to a secret key exposed as a file, it should contain $ADMIN_USERNAME value\&.
.br
Default is empty\&.
.TP
.B ADMIN_PASSWORD
Admin user password, used only if $CREATE_ADMIN is enabled\&.
.br
Default is empty\&.
.TP
.B ADMIN_PASSWORD_FILE
Path to a secret key exposed as a file, it should contain $ADMIN_PASSWORD value\&.
.br
Default is empty\&.
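The admin-bootstrap variables above can be combined; as a sketch, assuming a Docker-secrets-style deployment (the secret path is hypothetical):

```shell
# Create an admin account from environment variables on startup
export CREATE_ADMIN=1
export ADMIN_USERNAME=admin
# Hypothetical secret file (e.g. a Docker secret); its content
# is used as the value of ADMIN_PASSWORD
export ADMIN_PASSWORD_FILE=/run/secrets/miniflux_admin_password
```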
.TP
.B POCKET_CONSUMER_KEY
Pocket consumer API key for all users\&.
.br
@@ -439,92 +457,91 @@ Path to a secret key exposed as a file, it should contain $POCKET_CONSUMER_KEY v
.br
Default is empty\&.
.TP
.B PROXY_OPTION
Avoids mixed content warnings for external media: http-only, all, or none\&.
.B POLLING_FREQUENCY
Refresh interval in minutes for feeds\&.
.br
Default is http-only\&.
Default is 60 minutes\&.
.TP
.B PROXY_MEDIA_TYPES
A list of media types to proxify (comma-separated values): image, audio, video\&.
.B POLLING_PARSING_ERROR_LIMIT
The maximum number of parsing errors before the program stops polling a feed. Once the limit is reached, the user must refresh the feed manually. Set to 0 for unlimited.
.br
Default is image only\&.
Default is 3\&.
.TP
.B PROXY_HTTP_CLIENT_TIMEOUT
Time limit in seconds before the proxy HTTP client cancels the request\&.
.B POLLING_SCHEDULER
Scheduler used for polling feeds. Possible values are "round_robin" or "entry_frequency"\&.
.br
Default is 120 seconds\&.
.TP
.B PROXY_URL
Sets a server to proxy media through\&.
The maximum number of feeds polled for a given period is subject to POLLING_FREQUENCY and BATCH_SIZE\&.
.br
Default is empty, miniflux does the proxying\&.
.TP
.B HTTP_CLIENT_TIMEOUT
Time limit in seconds before the HTTP client cancels the request\&.
When "entry_frequency" is selected, the refresh interval for a given feed is equal to the average updating interval of the last week of the feed\&.
.br
Default is 20 seconds\&.
.TP
.B HTTP_CLIENT_MAX_BODY_SIZE
Maximum body size for HTTP requests in Mebibyte (MiB)\&.
The actual number of feeds polled will not exceed the maximum number of feeds that could be polled for a given period\&.
.br
Default is 15 MiB\&.
Default is "round_robin"\&.
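The scheduler behavior described above can be tuned with the SCHEDULER_* variables; a minimal sketch, with illustrative interval values:

```shell
# Switch from the default round_robin scheduler to entry_frequency,
# which derives each feed's refresh interval from its update history
export POLLING_SCHEDULER=entry_frequency
# Bound the computed interval (values in minutes, chosen for illustration)
export SCHEDULER_ENTRY_FREQUENCY_MIN_INTERVAL=15
export SCHEDULER_ENTRY_FREQUENCY_MAX_INTERVAL=720
export SCHEDULER_ENTRY_FREQUENCY_FACTOR=2
```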
.TP
.B HTTP_CLIENT_PROXY
Proxy URL for HTTP client\&.
.B PORT
Override LISTEN_ADDR to 0.0.0.0:$PORT\&.
.br
Default is empty\&.
.TP
.B HTTP_CLIENT_USER_AGENT
The default User-Agent header to use for the HTTP client. Can be overridden in per-feed settings\&.
.br
When empty, Miniflux uses a default User-Agent that includes the Miniflux version\&.
.br
Default is empty.
.TP
.B HTTP_SERVER_TIMEOUT
Time limit in seconds before the HTTP server cancels the request\&.
.br
Default is 300 seconds\&.
.TP
.B AUTH_PROXY_HEADER
Proxy authentication HTTP header\&.
.br
Default is empty.
.TP
.B AUTH_PROXY_USER_CREATION
Set to 1 to create users based on proxy authentication information\&.
.B RUN_MIGRATIONS
Set to 1 to run database migrations\&.
.br
Disabled by default\&.
.TP
.B MAINTENANCE_MODE
Set to 1 to enable maintenance mode\&.
.B SCHEDULER_ENTRY_FREQUENCY_FACTOR
Factor to increase refresh frequency for the entry frequency scheduler\&.
.br
Default is 1\&.
.TP
.B SCHEDULER_ENTRY_FREQUENCY_MAX_INTERVAL
Maximum interval in minutes for the entry frequency scheduler\&.
.br
Default is 1440 minutes (24 hours)\&.
.TP
.B SCHEDULER_ENTRY_FREQUENCY_MIN_INTERVAL
Minimum interval in minutes for the entry frequency scheduler\&.
.br
Default is 5 minutes\&.
.TP
.B SCHEDULER_ROUND_ROBIN_MIN_INTERVAL
Minimum interval in minutes for the round robin scheduler\&.
.br
Default is 60 minutes\&.
.TP
.B SERVER_TIMING_HEADER
Set the value to 1 to enable server-timing headers\&.
.br
Disabled by default\&.
.TP
.B MAINTENANCE_MESSAGE
Define a custom maintenance message\&.
.br
Default is "Miniflux is currently under maintenance"\&.
.TP
.B WATCHDOG
Enable or disable Systemd watchdog\&.
.br
Enabled by default\&.
.TP
.B INVIDIOUS_INSTANCE
Set a custom Invidious instance to use\&.
.br
Default is yewtu.be\&.
.TP
.B PROXY_PRIVATE_KEY
Set a custom private key used to sign proxified media URLs\&.
.br
Default is randomly generated at startup\&.
.TP
.B WEBAUTHN
Enable or disable WebAuthn/Passkey authentication\&.
.br
Note: After activating and setting up your Passkey, just enter your username and click the Passkey login button\&.
.br
Default is disabled\&.
.TP
.B WORKER_POOL_SIZE
Number of background workers\&.
.br
Default is 16 workers\&.
.TP
.B YOUTUBE_EMBED_URL_OVERRIDE
YouTube URL which will be used for embeds\&.
.br
Default is https://www.youtube-nocookie.com/embed/\&.
.TP
.B CONTENT_SECURITY_POLICY
Set a custom value for the Content-Security-Policy meta tag, used when custom CSS is applied\&.
.br
It may contain "nonce-%s", which will be replaced with the generated nonce\&.
.br
Default is "default-src 'self'; img-src * data:; media-src *; frame-src *; style-src 'self' 'nonce-%s'; require-trusted-types-for 'script'; trusted-types ttpolicy;"\&.
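As a sketch, a tightened policy could be supplied like this; the policy shown is illustrative, and the "nonce-%s" placeholder is filled in by Miniflux at render time:

```shell
# Custom Content-Security-Policy; "nonce-%s" is replaced with the
# per-response nonce (policy value is an example, not the default)
export CONTENT_SECURITY_POLICY="default-src 'self'; img-src * data:; style-src 'self' 'nonce-%s'"
```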
.SH AUTHORS
.P


@@ -1,14 +1,10 @@
-FROM golang:alpine AS build
-ENV CGO_ENABLED=0
-RUN apk add --no-cache --update git
+FROM docker.io/library/golang:alpine3.19 AS build
+RUN apk add --no-cache build-base git make
 ADD . /go/src/app
 WORKDIR /go/src/app
-RUN go build \
-    -o miniflux \
-    -ldflags="-s -w -X 'miniflux.app/v2/internal/version.Version=`git describe --tags --abbrev=0`' -X 'miniflux.app/v2/internal/version.Commit=`git rev-parse --short HEAD`' -X 'miniflux.app/v2/internal/version.BuildDate=`date +%FT%T%z`'" \
-    main.go
+RUN make miniflux
-FROM alpine:latest
+FROM docker.io/library/alpine:3.19
 LABEL org.opencontainers.image.title=Miniflux
 LABEL org.opencontainers.image.description="Miniflux is a minimalist and opinionated feed reader"
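With the Dockerfile changes above, the binary is built by the repository's Makefile inside the builder stage; as a sketch, the image could be built and run like this (tag name, port, and DATABASE_URL are illustrative):

```shell
# Build the image from the repository root and run it
# (image tag and connection string are examples, not project defaults)
docker build -t miniflux-local .
docker run --rm -p 8080:8080 \
  -e DATABASE_URL="postgres://miniflux:secret@db/miniflux?sslmode=disable" \
  miniflux-local
```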
