<pre style='margin:0'>
Chris Jones (cjones051073) pushed a commit to branch master
in repository macports-ports.
</pre>
<p><a href="https://github.com/macports/macports-ports/commit/9a7d2e02544cfeccd3e741d4ae29710700ce6d5a">https://github.com/macports/macports-ports/commit/9a7d2e02544cfeccd3e741d4ae29710700ce6d5a</a></p>
<pre style="white-space: pre; background: #F8F8F8">The following commit(s) were added to refs/heads/master by this push:
<span style='display:block; white-space:pre;color:#404040;'> new 9a7d2e02544 py-sentence-transformers: Update to version 1.0.4, Add Python 39, Add tests
</span>9a7d2e02544 is described below
<span style='display:block; white-space:pre;color:#808000;'>commit 9a7d2e02544cfeccd3e741d4ae29710700ce6d5a
</span>Author: Steven Thomas Smith <s.t.smith@ieee.org>
AuthorDate: Thu Apr 1 09:16:24 2021 -0400
<span style='display:block; white-space:pre;color:#404040;'> py-sentence-transformers: Update to version 1.0.4, Add Python 39, Add tests
</span>---
python/py-sentence-transformers/Portfile | 62 +++++++++++++++++---------------
1 file changed, 33 insertions(+), 29 deletions(-)
<span style='display:block; white-space:pre;color:#808080;'>diff --git a/python/py-sentence-transformers/Portfile b/python/py-sentence-transformers/Portfile
</span><span style='display:block; white-space:pre;color:#808080;'>index fa247f5fffc..4b6789417cb 100644
</span><span style='display:block; white-space:pre;background:#e0e0ff;'>--- a/python/py-sentence-transformers/Portfile
</span><span style='display:block; white-space:pre;background:#e0e0ff;'>+++ b/python/py-sentence-transformers/Portfile
</span><span style='display:block; white-space:pre;background:#e0e0e0;'>@@ -1,47 +1,42 @@
</span> # -*- coding: utf-8; mode: tcl; tab-width: 4; indent-tabs-mode: nil; c-basic-offset: 4 -*- vim:fenc=utf-8:ft=tcl:et:sw=4:ts=4:sts=4
PortSystem 1.0
<span style='display:block; white-space:pre;background:#e0ffe0;'>+PortGroup github 1.0
</span> PortGroup python 1.0
<span style='display:block; white-space:pre;background:#ffe0e0;'>-name py-sentence-transformers
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-version 0.3.7
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+github.setup UKPLab sentence-transformers 1.0.4 v
</span> revision 0
<span style='display:block; white-space:pre;background:#ffe0e0;'>-categories python textproc
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+name py-${github.project}
</span>
<span style='display:block; white-space:pre;background:#e0ffe0;'>+categories-append textproc
</span> license Apache-2
maintainers nomaintainer
platforms darwin
<span style='display:block; white-space:pre;background:#ffe0e0;'>-description Sentence Embeddings with BERT & XLNet
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+description Sentence Embeddings using BERT / RoBERTa / XLM-R
</span>
<span style='display:block; white-space:pre;background:#ffe0e0;'>-long_description BERT/XLNet produces out-of-the-box rather bad \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- sentence embeddings. This repository fine-tunes \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- BERT/XLNet with a siamese or triplet network \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- structure to produce semantically meaningful \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- sentence embeddings that can be used in \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- unsupervised scenarios: Semantic textual \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- similarity via cosine-similarity, clustering, \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- semantic search. We provide an increasing number \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- of state-of-the-art pretrained models that can be \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- used to derive sentence embeddings. See Pretrained \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- Models. Details of the implemented approaches can \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- be found in our publication: Sentence-BERT: \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- Sentence Embeddings using Siamese BERT-Networks \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- (published at EMNLP 2019). You can use this code \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- to easily train your own sentence embeddings, that \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- are tuned for your specific task. We provide \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- various dataset readers and you can tune sentence \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- embeddings with different loss function, depending \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- on the structure of your dataset. For further \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- details, see Train your own Sentence Embeddings.
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+long_description This framework provides an easy method to compute \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ dense vector representations for sentences, \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ paragraphs, and images. The models are based on \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ transformer networks like BERT / RoBERTa / \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ XLM-RoBERTa etc. and achieve state-of-the-art \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                        performance in various tasks. Text is embedded in \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ vector space such that similar text is close and \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ can efficiently be found using cosine similarity. \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ We provide an increasing number of \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ state-of-the-art pretrained models for more than \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ 100 languages, fine-tuned for various use-cases. \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                        Further, this framework allows easy fine-tuning \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                        of custom embedding models to achieve maximal \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ performance on your specific task.
</span>
homepage https://github.com/UKPLab/sentence-transformers
<span style='display:block; white-space:pre;background:#ffe0e0;'>-checksums rmd160 9b370dfe44747f9a7f637ecd158ee12b13464320 \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- sha256 02ac0526af66d40c0d73c5c72169a800227654c61d27a4ba938fdfec46d7197b \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>- size 59599
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+checksums rmd160 c14a7d97e2fb67815eba36b0fe39b8a0c5c1faeb \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ sha256 dbfed0576ac62b882ad2be3a7e4df8f06a0278520f6c98c1ff0e06b9f1e593d1 \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ size 14407175
</span>
<span style='display:block; white-space:pre;background:#ffe0e0;'>-python.versions 37 38
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+python.versions 38 39
</span>
if {${name} ne ${subport}} {
depends_build-append \
<span style='display:block; white-space:pre;background:#e0e0e0;'>@@ -53,17 +48,26 @@ if {${name} ne ${subport}} {
</span> port:py${python.version}-pytorch \
port:py${python.version}-scikit-learn \
port:py${python.version}-scipy \
<span style='display:block; white-space:pre;background:#e0ffe0;'>+ port:py${python.version}-sentencepiece \
</span> port:py${python.version}-tqdm \
port:py${python.version}-transformers
<span style='display:block; white-space:pre;background:#e0ffe0;'>+ depends_test-append \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ port:py${python.version}-pytest
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+
</span> post-destroot {
set docdir ${prefix}/share/doc/${subport}
set sharedir ${prefix}/share/${subport}
xinstall -d \
${destroot}${docdir}
<span style='display:block; white-space:pre;background:#ffe0e0;'>- xinstall -m 0644 -W ${worksrcpath} README.md \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ xinstall -m 0644 -W ${worksrcpath} LICENSE README.md \
</span> ${destroot}${docdir}
}
<span style='display:block; white-space:pre;background:#e0ffe0;'>+ test.run yes
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ test.cmd py.test-${python.branch}
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ test.target tests
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+ test.env-append PYTHONPATH=${worksrcpath}/build/lib
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+
</span> livecheck.type none
}
</pre>
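<p>Aside on the updated <code>checksums</code> stanza above: before building, MacPorts fetches the GitHub tag tarball and refuses to proceed unless the recorded rmd160/sha256 digests and file size match the download. MacPorts itself is implemented in Tcl, but the sha256 part of that check amounts to the following sketch (<code>verify_distfile</code> is a hypothetical illustration, not a MacPorts API):</p>

```python
import hashlib


def verify_distfile(path: str, expected_sha256: str) -> bool:
    """Hash a downloaded distfile in chunks and compare the result to the
    sha256 digest recorded in the Portfile's checksums stanza.  (MacPorts
    additionally checks an rmd160 digest and the file size.)"""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256
```

<p>Note that rmd160 is not guaranteed to be present in Python's <code>hashlib</code>; <code>hashlib.new("ripemd160")</code> works only when the underlying OpenSSL build provides it.</p>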