Column schema (name, dtype, and observed value range or number of distinct classes):

| column | dtype | observed range / classes |
| --- | --- | --- |
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 4 to 112 |
| repo_url | string | length 33 to 141 |
| action | string | 3 classes |
| title | string | length 1 to 1.02k |
| labels | string | length 4 to 1.54k |
| body | string | length 1 to 262k |
| index | string | 17 classes |
| text_combine | string | length 95 to 262k |
| label | string | 2 classes |
| text | string | length 96 to 252k |
| binary_label | int64 | 0 to 1 |
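In the records below, the string `label` column (`test` / `non_test`) lines up with the integer `binary_label` column (`test` → 1, `non_test` → 0). A minimal sketch of how that encoding could be reproduced with pandas — the tiny inline DataFrame is a stand-in for illustration, not the real file, and the derivation rule is inferred from the rows shown here, not confirmed by the dataset's authors:

```python
import pandas as pd

# Stand-in rows using a few of the columns from the schema above.
records = pd.DataFrame({
    "repo": ["cilium/cilium", "owncast/owncast"],
    "action": ["closed", "closed"],
    "label": ["non_test", "test"],
})

# binary_label appears to encode label as an int: "test" -> 1, "non_test" -> 0.
records["binary_label"] = (records["label"] == "test").astype(int)
print(records["binary_label"].tolist())  # -> [0, 1]
```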
Record 204,841 (id 7,091,419,575, IssuesEvent)
created_at: 2018-01-12 13:00:37
repo: cilium/cilium
repo_url: https://api.github.com/repos/cilium/cilium
action: closed
labels: area/monitor kind/bug need-more-info priority/high
index: 1.0
label: non_test
binary_label: 0
title: Cilium Monitor Not Working
body: current cilium stable, minikube version: v0.22.3 ``` root@minikube:/# cilium version Cilium 1.0.0-rc2 2f33f7a Wed, 6 Dec 2017 15:22:37 -0800 go version go1.9 linux/amd64 ``` Cilium status is fine: ``` root@minikube:/# cilium status KVStore: Ok Etcd: http://127.0.0.1:2379 - (Leader) 3.1.5 ContainerRuntime: Ok Kubernetes: Ok OK Kubernetes APIs: ["core/v1::Node", "CustomResourceDefinition", "cilium/v2::CiliumNetworkPolicy", "extensions/v1beta1::NetworkPolicy", "networking.k8s.io/v1::NetworkPolicy", "core/v1::Service", "core/v1::Endpoint", "extensions/v1beta1::Ingress"] Cilium: Ok OK NodeMonitor: Listening for events on 2 CPUs with 64x4096 of shared memory Cilium health daemon: Ok Known cluster nodes: minikube (localhost): Primary Address: 192.168.99.100 Type: InternalIP AllocRange: 10.15.0.0/16 ``` ``` root@minikube:/# cilium monitor Listening for events on 2 CPUs with 64x4096 of shared memory Press Ctrl-C to quit Error: unable to connect to monitor dial unix /var/run/cilium/monitor.sock: connect: connection refused ``` There is not gops output: ``` root@minikube:/# gops 1006 701 gops go1.9 /usr/local/bin/gops 255 1 cilium-node-monitor go1.9 /usr/bin/cilium-node-monitor 262 1 cilium-health * go1.9 /usr/bin/cilium-health 1 0 cilium-agent * go1.9 /usr/bin/cilium-agent root@minikube:/# gops stack 255 couldn't get port by PID: dial tcp 127.0.0.1:0: getsockopt: connection refused ``` I was told there are no logs for cilium-node-monitor, but I did confirm via "ps" that it was running: ``` root@minikube:/# ps -Af | grep cilium root 1 0 0 01:26 ? 00:00:02 cilium-agent --debug=false -t vxlan --kvstore etcd --kvstore-opt etcd.config=/var/lib/etcd-config/etcd.config --disable-ipv4=false root 255 1 0 01:27 ? 00:00:00 cilium-node-monitor root 262 1 0 01:27 ? 00:00:01 cilium-health -d root 908 701 0 01:33 ? 00:00:00 grep cilium ```
Record 8,394 (id 6,535,815,873, IssuesEvent)
created_at: 2017-08-31 15:47:22
repo: Patternslib/Patterns
repo_url: https://api.github.com/repos/Patternslib/Patterns
action: closed
labels: Pattern bumper Performance
index: True
label: non_test
binary_label: 0
title: Bumper update interval
body: Bumpers move rather choppy when the user scrolls. This can be solved with some CSS tricks with position sticky. Except position sticky is only applicable for certain cases and it's taken out recent versions of Chrome again, setting us back to square one. Is the smoothness of the bumper related to the interval in which it measures/updates the position? If so could we increase this interval so that bumpers become smooth in all browsers?
Record 84,708 (id 16,539,217,454, IssuesEvent)
created_at: 2021-05-27 14:55:07
repo: HansenBerlin/altenheim-kalender
repo_url: https://api.github.com/repos/HansenBerlin/altenheim-kalender
action: opened
labels: CODE EPIC
index: 1.0
label: non_test
binary_label: 0
title: POC Kalenderimport (POC calendar import)
body (translated from German): Initially just as a POC (one class is enough for now), the possibility of importing calendars (Google etc.) should be checked. The framework may already provide this; check the docs or ask @dannyneup, who has looked into it.
Record 323,116 (id 27,695,754,218, IssuesEvent)
created_at: 2023-03-14 01:58:12
repo: owncast/owncast
repo_url: https://api.github.com/repos/owncast/owncast
action: closed
labels: backlog tests
index: 1.0
label: test
binary_label: 1
title: Flaky test: Browser notification modal content
body: ### Share your bug report, feature request, or comment. The "Notifications not supported" message gets marked as changed in UI tests even when it has not changed. Example: ![image](https://user-images.githubusercontent.com/414923/214492066-abf25e51-715c-451a-af10-b3008c2a5f07.png) Story: https://owncast.online/components/?path=%2Fstory%2Fowncast-modals-browser-notifications--basic
Record 50,045 (id 6,050,845,033, IssuesEvent)
created_at: 2017-06-12 22:02:49
repo: rancher/rancher
repo_url: https://api.github.com/repos/rancher/rancher
action: closed
labels: area/host area/ui kind/bug status/resolved status/to-test
index: 1.0
label: test
binary_label: 1
title: Failed host doesn't have delete in detail's menu
body: **Rancher versions:** 6/8 **Steps to Reproduce:** 1. Create a host that will fail 2. Click on Host name after it's error 3. Click on menu **Results:** No actions available. Should be same as main host page menu. Which has machine config, delete and view in api ![image](https://user-images.githubusercontent.com/11514927/26946397-f47f57f2-4c43-11e7-9975-d0a141c685ca.png)
Record 816,838 (id 30,614,082,340, IssuesEvent)
created_at: 2023-07-24 00:10:42
repo: umatt-ece/new-members-package
repo_url: https://api.github.com/repos/umatt-ece/new-members-package
action: opened
labels: high priority
index: 1.0
label: non_test
binary_label: 0
title: GitHub Navigation
body: # GitHub Navigation ## Description Create a GitHub Navigation tutorial to introduce members who may not be familiar. Additionally, specify team policies for managing repositories, teams, projects, and issues. ## Tasks - [ ] Create GitHub.md tutorial. ## Linked Issues ## Additional Notes
Record 165,354 (id 13,999,597,077, IssuesEvent)
created_at: 2020-10-28 11:03:48
repo: crazycapivara/mapboxer
repo_url: https://api.github.com/repos/crazycapivara/mapboxer
action: closed
labels: documentation
index: 1.0
label: non_test
binary_label: 0
title: Move api example to examples folder
body: Move all api examples from `inst/api-reference` to `examples/api-reference`
Record 68,207 (id 14,914,580,742, IssuesEvent)
created_at: 2021-01-22 15:36:12
repo: AlexRogalskiy/object-mappers-playground
repo_url: https://api.github.com/repos/AlexRogalskiy/object-mappers-playground
action: opened
labels: security vulnerability
index: True
label: non_test
binary_label: 0
title: CVE-2013-4002 (High) detected in xercesImpl-2.9.1.jar
body: ## CVE-2013-4002 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xercesImpl-2.9.1.jar</b></p></summary> <p>Xerces2 is the next generation of high performance, fully compliant XML parsers in the Apache Xerces family. This new version of Xerces introduces the Xerces Native Interface (XNI), a complete framework for building parser components and configurations that is extremely modular and easy to program.</p> <p>Path to dependency file: object-mappers-playground/modules/objectmappers-all/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar,/home/wss-scanner/.m2/repository/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar</p> <p> Dependency Hierarchy: - milyn-smooks-all-1.7.1.jar (Root Library) - milyn-commons-1.7.1.jar - :x: **xercesImpl-2.9.1.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/object-mappers-playground/commit/ccc3fe199db8575ce11d67be41855d33c5e718ab">ccc3fe199db8575ce11d67be41855d33c5e718ab</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> XMLscanner.java in Apache Xerces2 Java Parser before 2.12.0, as used in the Java Runtime Environment (JRE) in IBM Java 5.0 before 5.0 SR16-FP3, 6 before 6 SR14, 6.0.1 before 6.0.1 SR6, and 7 before 7 SR5 as well as Oracle Java SE 7u40 and earlier, Java SE 6u60 and earlier, Java SE 5.0u51 and earlier, JRockit R28.2.8 and earlier, JRockit R27.7.6 and earlier, Java SE Embedded 7u40 and earlier, and possibly other products allows remote attackers to cause a denial of service via vectors related to XML attribute names. <p>Publish Date: 2013-07-23 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2013-4002>CVE-2013-4002</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>7.1</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2013-4002">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2013-4002</a></p> <p>Release Date: 2013-07-23</p> <p>Fix Resolution: xerces:xercesImpl:Xerces-J_2_12_0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
Record 252,363 (id 8,035,026,341, IssuesEvent)
created_at: 2018-07-30 01:34:16
repo: webcompat/web-bugs
repo_url: https://api.github.com/repos/webcompat/web-bugs
action: closed
labels: browser-firefox-mobile-tablet priority-normal
index: 1.0
label: non_test
binary_label: 0
title: www.binance.com - site is not usable
body: <!-- @browser: Firefox Mobile (Tablet) 57.0 --> <!-- @ua_header: Mozilla/5.0 (Android 7.1.1; Tablet; rv:57.0) Gecko/57.0 Firefox/57.0 --> <!-- @reported_with: web --> **URL**: https://www.binance.com/indexSpa.html#/trade/index?symbol=TRX_BTC **Browser / Version**: Firefox Mobile (Tablet) 57.0 **Operating System**: Android 7.1.1 **Tested Another Browser**: Yes **Problem type**: Site is not usable **Description**: doesn't render properly **Steps to Reproduce**: 1. Go to binance.com 2. Click on any coin 3. Graphs fail to render properly Works fine on latest Chrome for Android [![Screenshot Description](https://webcompat.com/uploads/2018/1/3f89a31b-17c4-4460-a229-08bdbdf87bcd-thumb.jpeg)](https://webcompat.com/uploads/2018/1/3f89a31b-17c4-4460-a229-08bdbdf87bcd.jpeg) _From [webcompat.com](https://webcompat.com/) with ❤️_
Record 88,274 (id 17,532,384,600, IssuesEvent)
created_at: 2021-08-12 00:16:22
repo: backdrop/backdrop-issues
repo_url: https://api.github.com/repos/backdrop/backdrop-issues
action: closed
labels: type - bug report status - has pull request pr - needs testing pr - needs code review
index: 1.0
label: non_test
binary_label: 0
title: Get run-tests.sh script with profile cache running on newer environments
body: **Description of the bug** Currently running tests via script run-tests.sh _with profile cache on_ doesn't work on newer environments, namely mysql 5.7+ or mariadb 10.2.2+. These db versions with settings like listed below choke on the `alter table` to MyISAM in backdrop_web_test_case_cache.php. Message: `ERROR: SQLSTATE[42000]: Syntax error or access violation: 1071 Specified key was too long; max key length is 1000 bytes ` **Steps To Reproduce** Make sure your db is new enough (on Ubuntu Bionic, Debian Buster...) and has following variables set: - innodb_large_prefix ON - innodb_default_row_format dynamic - innodb_file_per_table ON - innodb_file_format Barracuda Then download and install Backdrop (via install.sh or however). Then go to the Backdrop root directory and try to run the tests with profile caching (param --cache): `./core/scripts/run-tests.sh --url http://localhost/ --force --cache BlockTestCase` **Actual behavior** PDO Exception and no tests run at all. **Expected behavior** Tests running. **Additional information** I ran into this issue when trying to get Simpletests working with travisCI, but now found a way to reproduce. Here's the [Zulipchat thread](https://backdrop.zulipchat.com/#narrow/stream/218635-Backdrop/topic/pdo.20when.20running.20tests) with my question. Joseph Flatt provided a trick: setting `$test->prepareCache(TRUE)` in run-tests.sh bypasses the table conversions to MyISAM - and then the tests run without problem. But patching around in core is no real solution, of course. **Please note:** Variable innodb_large_prefix is deprecated in both, mysql and mariadb, and large prefixes will be the default setting after this variable has gone. See details to the large prefix variable for [MySQL](https://dev.mysql.com/doc/refman/5.7/en/innodb-parameters.html#sysvar_innodb_large_prefix) and [MariaDB](https://mariadb.com/kb/en/innodb-system-variables/#innodb_large_prefix).
Record 130,860 (id 27,777,797,053, IssuesEvent)
created_at: 2023-03-16 18:30:27
repo: patternfly/pf-codemods
repo_url: https://api.github.com/repos/patternfly/pf-codemods
action: closed
labels: codemod
index: 1.0
label: non_test
binary_label: 0
title: Tooltip - reference prop renamed to triggerRef
body: Follow up to breaking change PR https://github.com/patternfly/patternfly-react/pull/8733 Tooltip's reference prop was renamed to triggerRef _Required actions:_ 1. Build codemod 2. Build test 3. Update readme with description & example
Record 285,371 (id 24,661,865,337, IssuesEvent)
created_at: 2022-10-18 07:22:42
repo: kartoza/ckanext-dalrrd-emc-dcpr
repo_url: https://api.github.com/repos/kartoza/ckanext-dalrrd-emc-dcpr
action: opened
labels: test
index: 1.0
label: test
binary_label: 1
title: do linked reserouces work via CSW?
body: using QGIS MetaSearch, test that EMC records that have valid OGC resource links, return those links in a valid way in the response such that you can load them directly into QGIS. Use existing records in the EMC that you've identified to have these links else create your own testing records with valid links
Record 562,367 (id 16,658,230,805, IssuesEvent)
created_at: 2021-06-05 22:58:30
repo: eclipse/lyo
repo_url: https://api.github.com/repos/eclipse/lyo
action: closed
labels: Component: N/A (project-wide) Priority: Critical Type: Maintenance
index: 1.0
label: non_test
binary_label: 0
title: Bintray shutting down; need to check our POMs
body: https://news.ycombinator.com/item?id=26016505 I think Shaclex was the only dependency we fetched via Bintray. Need to migrate before May 1, better yet Feb 28.
Record 198,485 (id 22,659,645,823, IssuesEvent)
created_at: 2022-07-02 01:11:04
repo: nvenkatesh1/SCA_test_JS
repo_url: https://api.github.com/repos/nvenkatesh1/SCA_test_JS
action: closed
labels: security vulnerability
index: True
title: extract-text-webpack-plugin-1.0.1.tgz: 1 vulnerabilities (highest severity is: 7.8) - autoclosed
body: <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>extract-text-webpack-plugin-1.0.1.tgz</b></p></summary> <p></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/webpack/node_modules/async/package.json,/app/compilers/react-compiler/node_modules/async/package.json,/node_modules/istanbul/node_modules/async/package.json,/node_modules/extract-text-webpack-plugin/node_modules/async/package.json,/node_modules/handlebars/node_modules/async/package.json</p> <p> <p>Found in HEAD commit: <a href="https://github.com/nvenkatesh1/SCA_test_JS/commit/eb47eeefc02a252a76628fec10a3c26aacb34024">eb47eeefc02a252a76628fec10a3c26aacb34024</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | --- | --- | | [CVE-2021-43138](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43138) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.8 | async-1.5.2.tgz | Transitive | N/A | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2021-43138</summary> ### Vulnerable Library - <b>async-1.5.2.tgz</b></p> <p>Higher-order functions and common patterns for asynchronous code</p> <p>Library home page: <a href="https://registry.npmjs.org/async/-/async-1.5.2.tgz">https://registry.npmjs.org/async/-/async-1.5.2.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/webpack/node_modules/async/package.json,/app/compilers/react-compiler/node_modules/async/package.json,/node_modules/istanbul/node_modules/async/package.json,/node_modules/extract-text-webpack-plugin/node_modules/async/package.json,/node_modules/handlebars/node_modules/async/package.json</p> <p> Dependency Hierarchy: - extract-text-webpack-plugin-1.0.1.tgz (Root Library) - :x: **async-1.5.2.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/nvenkatesh1/SCA_test_JS/commit/eb47eeefc02a252a76628fec10a3c26aacb34024">eb47eeefc02a252a76628fec10a3c26aacb34024</a></p> <p>Found in base branch: <b>master</b></p> </p> <p></p> ### Vulnerability Details <p> In Async before 2.6.4 and 3.x before 3.2.2, a malicious user can obtain privileges via the mapValues() method, aka lib/internal/iterator.js createObjectIterator prototype pollution. <p>Publish Date: 2022-04-06 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43138>CVE-2021-43138</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.8</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-43138">https://nvd.nist.gov/vuln/detail/CVE-2021-43138</a></p> <p>Release Date: 2022-04-06</p> <p>Fix Resolution: async - v3.2.2</p> </p> <p></p> </details>
/node_modules/webpack/node_modules/async/package.json,/app/compilers/react-compiler/node_modules/async/package.json,/node_modules/istanbul/node_modules/async/package.json,/node_modules/extract-text-webpack-plugin/node_modules/async/package.json,/node_modules/handlebars/node_modules/async/package.json</p> <p> Dependency Hierarchy: - extract-text-webpack-plugin-1.0.1.tgz (Root Library) - :x: **async-1.5.2.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/nvenkatesh1/SCA_test_JS/commit/eb47eeefc02a252a76628fec10a3c26aacb34024">eb47eeefc02a252a76628fec10a3c26aacb34024</a></p> <p>Found in base branch: <b>master</b></p> </p> <p></p> ### Vulnerability Details <p> In Async before 2.6.4 and 3.x before 3.2.2, a malicious user can obtain privileges via the mapValues() method, aka lib/internal/iterator.js createObjectIterator prototype pollution. <p>Publish Date: 2022-04-06 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43138>CVE-2021-43138</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.8</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-43138">https://nvd.nist.gov/vuln/detail/CVE-2021-43138</a></p> <p>Release Date: 2022-04-06</p> <p>Fix Resolution: async - v3.2.2</p> </p> <p></p> </details>
non_test
extract text webpack plugin tgz vulnerabilities highest severity is autoclosed vulnerable library extract text webpack plugin tgz path to dependency file package json path to vulnerable library node modules webpack node modules async package json app compilers react compiler node modules async package json node modules istanbul node modules async package json node modules extract text webpack plugin node modules async package json node modules handlebars node modules async package json found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available high async tgz transitive n a details cve vulnerable library async tgz higher order functions and common patterns for asynchronous code library home page a href path to dependency file package json path to vulnerable library node modules webpack node modules async package json app compilers react compiler node modules async package json node modules istanbul node modules async package json node modules extract text webpack plugin node modules async package json node modules handlebars node modules async package json dependency hierarchy extract text webpack plugin tgz root library x async tgz vulnerable library found in head commit a href found in base branch master vulnerability details in async before and x before a malicious user can obtain privileges via the mapvalues method aka lib internal iterator js createobjectiterator prototype pollution publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution async
0
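The CVE-2021-43138 record above describes privilege escalation via prototype pollution reached through async's `mapValues()` (`createObjectIterator`). As an illustration of the general vulnerability class only — this is a hypothetical recursive merge, not async's actual internals — copying attacker-controlled keys without guarding `__proto__` lets a JSON payload reach `Object.prototype`:

```typescript
// Hypothetical deep-merge that is vulnerable to prototype pollution.
// When key === "__proto__", target[key] resolves (via the Object.prototype
// accessor) to Object.prototype itself, so the recursive call writes
// attacker-controlled properties onto every object in the realm.
function merge(target: any, source: any): any {
  for (const key in source) {
    if (typeof source[key] === "object" && source[key] !== null) {
      if (!target[key]) target[key] = {};
      merge(target[key], source[key]); // no check for "__proto__"/"constructor"
    } else {
      target[key] = source[key];
    }
  }
  return target;
}

// JSON.parse creates an *own* enumerable property named "__proto__",
// so for...in visits it.
const payload = JSON.parse('{"__proto__": {"isAdmin": true}}');
merge({}, payload);

console.log(({} as any).isAdmin); // true — Object.prototype was polluted
```

The fix in such code (and in async 2.6.4 / 3.2.2) is to skip or reject `__proto__`, `constructor`, and `prototype` keys during iteration.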
1,358
5,865,047,597
IssuesEvent
2017-05-13 00:19:53
ansible/ansible-modules-extras
https://api.github.com/repos/ansible/ansible-modules-extras
closed
Install Chocolatey from Internal Source
affects_2.3 feature_idea waiting_on_maintainer windows
Was asked if this was available and it doesn't appear it is from looking at the source. @nitzmahone ##### ISSUE TYPE - Feature Idea ##### COMPONENT NAME win_chocolatey ##### ANSIBLE VERSION ##### CONFIGURATION ##### OS / ENVIRONMENT Windows ##### SUMMARY Folks would like to be able to have Chocolatey installed from internal sources when using the Ansible module, particularly when they are completely offline. This would be a good capability added to the module. It is already supported by other config mgmt tools so those could provide good references on how to add it to the chocolatey ansible module.
True
Install Chocolatey from Internal Source - Was asked if this was available and it doesn't appear it is from looking at the source. @nitzmahone ##### ISSUE TYPE - Feature Idea ##### COMPONENT NAME win_chocolatey ##### ANSIBLE VERSION ##### CONFIGURATION ##### OS / ENVIRONMENT Windows ##### SUMMARY Folks would like to be able to have Chocolatey installed from internal sources when using the Ansible module, particularly when they are completely offline. This would be a good capability added to the module. It is already supported by other config mgmt tools so those could provide good references on how to add it to the chocolatey ansible module.
non_test
install chocolatey from internal source was asked if this was available and it doesn t appear it is from looking at the source nitzmahone issue type feature idea component name win chocolatey ansible version configuration os environment windows summary folks would like to be able to have chocolatey installed from internal sources when using the ansible module particularly when they are completely offline this would be a good capability added to the module it is already supported by other config mgmt tools so those could provide good references on how to add it to the chocolatey ansible module
0
469,804
13,526,198,183
IssuesEvent
2020-09-15 13:58:04
carbon-design-system/ibm-dotcom-library
https://api.github.com/repos/carbon-design-system/ibm-dotcom-library
closed
Web component: Button Group Prod QA testing
Airtable Done QA package: web components priority: high
<!-- Avoid any type of solutions in this user story --> <!-- replace _{{...}}_ with your own words or remove --> #### User Story <!-- {{Provide a detailed description of the user's need here, but avoid any type of solutions}} --> > As a `[user role below]`: developer using the ibm.com Library `Button Group` > I need to: have a version of the component that has been tested for accessibility compliance as well as on multiple browsers and platforms > so that I can: be confident that my ibm.com web site users will have a good experience #### Additional information <!-- {{Please provide any additional information or resources for reference}} --> - [Browser Stack link](https://ibm.ent.box.com/notes/578734426612) - [Browser Standard](https://w3.ibm.com/standards/web/browser/) - Browser versions to be tested: Tier 1 browsers will be tested with defects created as Sev 1 or Sev 2. Tier 2 browser defects will be created as Sev 3 defects. - Platforms to be tested, by priority: 1) Desktop 2) Mobile 3) Tablet - Mobile & Tablet iOS versions: 13.1 and 13.3 - Mobile & Tablet Android versions: 9.0 Pie and 8.1 Oreo - Browsers to be tested: Desktop: Chrome, Firefox, Safari, Edge, Mobile: Chrome, Safari, Samsung Internet, UC Browser, Tablet: Safari, Chrome, Android - [Accessibility Checklist](https://www.ibm.com/able/guidelines/ci162/accessibility_checklist.html) - [Creating a QA bug](https://ibm.ent.box.com/notes/603242247385) - **See the Epic for the Design and Functional specs information** - Dev issue (#3492) - Once development is finished the updated code is available in the [**Web Components Canary Environment**](https://ibmdotcom-web-components-canary.mybluemix.net/?path=/story/overview-getting-started--page) for testing. - The [**React canary environment**](https://ibmdotcom-react-canary.mybluemix.net/?path=/story/overview-getting-started--page) component should be used for comparison #### Acceptance criteria - [ ] Accessibility testing is complete. Component is compliant. 
- [ ] All browser versions are tested - [ ] All operating systems are tested - [ ] All devices are tested - [ ] Defects are recorded and retested when fixed
1.0
Web component: Button Group Prod QA testing - <!-- Avoid any type of solutions in this user story --> <!-- replace _{{...}}_ with your own words or remove --> #### User Story <!-- {{Provide a detailed description of the user's need here, but avoid any type of solutions}} --> > As a `[user role below]`: developer using the ibm.com Library `Button Group` > I need to: have a version of the component that has been tested for accessibility compliance as well as on multiple browsers and platforms > so that I can: be confident that my ibm.com web site users will have a good experience #### Additional information <!-- {{Please provide any additional information or resources for reference}} --> - [Browser Stack link](https://ibm.ent.box.com/notes/578734426612) - [Browser Standard](https://w3.ibm.com/standards/web/browser/) - Browser versions to be tested: Tier 1 browsers will be tested with defects created as Sev 1 or Sev 2. Tier 2 browser defects will be created as Sev 3 defects. - Platforms to be tested, by priority: 1) Desktop 2) Mobile 3) Tablet - Mobile & Tablet iOS versions: 13.1 and 13.3 - Mobile & Tablet Android versions: 9.0 Pie and 8.1 Oreo - Browsers to be tested: Desktop: Chrome, Firefox, Safari, Edge, Mobile: Chrome, Safari, Samsung Internet, UC Browser, Tablet: Safari, Chrome, Android - [Accessibility Checklist](https://www.ibm.com/able/guidelines/ci162/accessibility_checklist.html) - [Creating a QA bug](https://ibm.ent.box.com/notes/603242247385) - **See the Epic for the Design and Functional specs information** - Dev issue (#3492) - Once development is finished the updated code is available in the [**Web Components Canary Environment**](https://ibmdotcom-web-components-canary.mybluemix.net/?path=/story/overview-getting-started--page) for testing. 
- The [**React canary environment**](https://ibmdotcom-react-canary.mybluemix.net/?path=/story/overview-getting-started--page) component should be used for comparison #### Acceptance criteria - [ ] Accessibility testing is complete. Component is compliant. - [ ] All browser versions are tested - [ ] All operating systems are tested - [ ] All devices are tested - [ ] Defects are recorded and retested when fixed
non_test
web component button group prod qa testing user story as a developer using the ibm com library button group i need to have a version of the component that has been tested for accessibility compliance as well as on multiple browsers and platforms so that i can be confident that my ibm com web site users will have a good experience additional information browser versions to be tested tier browsers will be tested with defects created as sev or sev tier browser defects will be created as sev defects platforms to be tested by priority desktop mobile tablet mobile tablet ios versions and mobile tablet android versions pie and oreo browsers to be tested desktop chrome firefox safari edge mobile chrome safari samsung internet uc browser tablet safari chrome android see the epic for the design and functional specs information dev issue once development is finished the updated code is available in the for testing the component should be used for comparison acceptance criteria accessibility testing is complete component is compliant all browser versions are tested all operating systems are tested all devices are tested defects are recorded and retested when fixed
0
74,083
15,303,962,459
IssuesEvent
2021-02-24 16:22:32
elastic/kibana
https://api.github.com/repos/elastic/kibana
opened
Detect and prevent the use of mismatched encryption keys
Team:Security discuss enhancement
Kibana relies on a number of encryption keys. Arguably the most important key is `xpack.encryptedSavedObjects.encryptionKey`, as this controls the encryption/decryption of actions, alerts, and other sensitive user data. Kibana requires that this key is set to the same value across all instances. If two Kibana instances have different encryption keys, then they will be encrypting saved objects that cannot be decrypted by the other instance. We should attempt to detect if there is a potential encryption key mismatch, and alert consumers of the ESO plugin so that they can take appropriate action. One potential solution is to save a "canary" saved object, whose sole purpose is to test that it can be successfully decrypted by the current instance. If we cannot decrypt this object, then it stands to reason that this instance is not properly configured. I expect there are scenarios where we'd need to allow the canary to be forcefully replaced, however.
True
Detect and prevent the use of mismatched encryption keys - Kibana relies on a number of encryption keys. Arguably the most important key is `xpack.encryptedSavedObjects.encryptionKey`, as this controls the encryption/decryption of actions, alerts, and other sensitive user data. Kibana requires that this key is set to the same value across all instances. If two Kibana instances have different encryption keys, then they will be encrypting saved objects that cannot be decrypted by the other instance. We should attempt to detect if there is a potential encryption key mismatch, and alert consumers of the ESO plugin so that they can take appropriate action. One potential solution is to save a "canary" saved object, whose sole purpose is to test that it can be successfully decrypted by the current instance. If we cannot decrypt this object, then it stands to reason that this instance is not properly configured. I expect there are scenarios where we'd need to allow the canary to be forcefully replaced, however.
non_test
detect and prevent the use of mismatched encryption keys kibana relies on a number of encryption keys arguably the most important key is xpack encryptedsavedobjects encryptionkey as this controls the encryption decryption of actions alerts and other sensitive user data kibana requires that this key is set to the same value across all instances if two kibana instances have different encryption keys then they will be encrypting saved objects that cannot be decrypted by the other instance we should attempt to detect if there is a potential encryption key mismatch and alert consumers of the eso plugin so that they can take appropriate action one potential solution is to save a canary saved object whose sole purpose is to test that it can be successfully decrypted by the current instance if we cannot decrypt this object then it stands to reason that this instance is not properly configured i expect there are scenarios where we d need to allow the canary to be forcefully replaced however
0
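The "canary" saved object proposed in the Kibana record above can be sketched as: one instance encrypts a known marker with its key, and every instance verifies it can still decrypt it, failing fast on an authentication error. This is a minimal sketch using Node's `crypto` with AES-256-GCM — the function names and key derivation are assumptions for illustration, not Kibana's actual EncryptedSavedObjects implementation:

```typescript
import * as crypto from "crypto";

const ALGO = "aes-256-gcm";

// Encrypt a marker string; the output bundles IV + GCM auth tag + ciphertext.
function encrypt(key: Buffer, plaintext: string): string {
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv(ALGO, key, iv);
  const ct = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return Buffer.concat([iv, cipher.getAuthTag(), ct]).toString("base64");
}

// Try to decrypt the canary. GCM's final() throws on auth failure,
// which is exactly what happens under a mismatched encryption key.
function canDecrypt(key: Buffer, blob: string): boolean {
  try {
    const raw = Buffer.from(blob, "base64");
    const iv = raw.subarray(0, 12);
    const tag = raw.subarray(12, 28);
    const ct = raw.subarray(28);
    const decipher = crypto.createDecipheriv(ALGO, key, iv);
    decipher.setAuthTag(tag);
    decipher.update(ct);
    decipher.final(); // throws if the key does not match
    return true;
  } catch {
    return false;
  }
}

const keyA = crypto.scryptSync("key-A", "salt", 32); // instance 1's key
const keyB = crypto.scryptSync("key-B", "salt", 32); // misconfigured instance
const canary = encrypt(keyA, "canary");

console.log(canDecrypt(keyA, canary)); // true
console.log(canDecrypt(keyB, canary)); // false — potential key mismatch
```

An instance that gets `false` here knows it cannot read objects written by its peers and can surface that to the operator instead of silently producing undecryptable data.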
112,166
17,080,009,330
IssuesEvent
2021-07-08 02:47:53
faizulho/gatsby-starter-docz-netlifycms-1
https://api.github.com/repos/faizulho/gatsby-starter-docz-netlifycms-1
opened
CVE-2020-28469 (High) detected in glob-parent-3.1.0.tgz, glob-parent-5.1.1.tgz
security vulnerability
## CVE-2020-28469 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>glob-parent-3.1.0.tgz</b>, <b>glob-parent-5.1.1.tgz</b></p></summary> <p> <details><summary><b>glob-parent-3.1.0.tgz</b></p></summary> <p>Strips glob magic from a string to provide the parent directory path</p> <p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p> <p>Path to dependency file: gatsby-starter-docz-netlifycms-1/package.json</p> <p>Path to vulnerable library: gatsby-starter-docz-netlifycms-1/node_modules/watchpack-chokidar2/node_modules/glob-parent/package.json,gatsby-starter-docz-netlifycms-1/node_modules/@nicolo-ribaudo/chokidar-2/node_modules/glob-parent/package.json,gatsby-starter-docz-netlifycms-1/node_modules/webpack-dev-server/node_modules/glob-parent/package.json</p> <p> Dependency Hierarchy: - docz-2.3.1.tgz (Root Library) - gatsby-theme-docz-2.3.1.tgz - babel-plugin-export-metadata-2.3.0.tgz - cli-7.12.10.tgz - chokidar-2-2.1.8-no-fsevents.tgz - :x: **glob-parent-3.1.0.tgz** (Vulnerable Library) </details> <details><summary><b>glob-parent-5.1.1.tgz</b></p></summary> <p>Extract the non-magic parent path from a glob string.</p> <p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz</a></p> <p>Path to dependency file: gatsby-starter-docz-netlifycms-1/package.json</p> <p>Path to vulnerable library: gatsby-starter-docz-netlifycms-1/node_modules/glob-parent/package.json</p> <p> Dependency Hierarchy: - gatsby-3.9.0.tgz (Root Library) - chokidar-3.4.2.tgz - :x: **glob-parent-5.1.1.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a 
href="https://github.com/faizulho/gatsby-starter-docz-netlifycms-1/commit/70a9e87b1e68c0bef6964284e0899376209b0f3d">70a9e87b1e68c0bef6964284e0899376209b0f3d</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package glob-parent before 5.1.2. The enclosure regex used to check for strings ending in enclosure containing path separator. <p>Publish Date: 2021-06-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469>CVE-2020-28469</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469</a></p> <p>Release Date: 2021-06-03</p> <p>Fix Resolution: glob-parent - 5.1.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-28469 (High) detected in glob-parent-3.1.0.tgz, glob-parent-5.1.1.tgz - ## CVE-2020-28469 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>glob-parent-3.1.0.tgz</b>, <b>glob-parent-5.1.1.tgz</b></p></summary> <p> <details><summary><b>glob-parent-3.1.0.tgz</b></p></summary> <p>Strips glob magic from a string to provide the parent directory path</p> <p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p> <p>Path to dependency file: gatsby-starter-docz-netlifycms-1/package.json</p> <p>Path to vulnerable library: gatsby-starter-docz-netlifycms-1/node_modules/watchpack-chokidar2/node_modules/glob-parent/package.json,gatsby-starter-docz-netlifycms-1/node_modules/@nicolo-ribaudo/chokidar-2/node_modules/glob-parent/package.json,gatsby-starter-docz-netlifycms-1/node_modules/webpack-dev-server/node_modules/glob-parent/package.json</p> <p> Dependency Hierarchy: - docz-2.3.1.tgz (Root Library) - gatsby-theme-docz-2.3.1.tgz - babel-plugin-export-metadata-2.3.0.tgz - cli-7.12.10.tgz - chokidar-2-2.1.8-no-fsevents.tgz - :x: **glob-parent-3.1.0.tgz** (Vulnerable Library) </details> <details><summary><b>glob-parent-5.1.1.tgz</b></p></summary> <p>Extract the non-magic parent path from a glob string.</p> <p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz</a></p> <p>Path to dependency file: gatsby-starter-docz-netlifycms-1/package.json</p> <p>Path to vulnerable library: gatsby-starter-docz-netlifycms-1/node_modules/glob-parent/package.json</p> <p> Dependency Hierarchy: - gatsby-3.9.0.tgz (Root Library) - chokidar-3.4.2.tgz - :x: **glob-parent-5.1.1.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a 
href="https://github.com/faizulho/gatsby-starter-docz-netlifycms-1/commit/70a9e87b1e68c0bef6964284e0899376209b0f3d">70a9e87b1e68c0bef6964284e0899376209b0f3d</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package glob-parent before 5.1.2. The enclosure regex used to check for strings ending in enclosure containing path separator. <p>Publish Date: 2021-06-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28469>CVE-2020-28469</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28469</a></p> <p>Release Date: 2021-06-03</p> <p>Fix Resolution: glob-parent - 5.1.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve high detected in glob parent tgz glob parent tgz cve high severity vulnerability vulnerable libraries glob parent tgz glob parent tgz glob parent tgz strips glob magic from a string to provide the parent directory path library home page a href path to dependency file gatsby starter docz netlifycms package json path to vulnerable library gatsby starter docz netlifycms node modules watchpack node modules glob parent package json gatsby starter docz netlifycms node modules nicolo ribaudo chokidar node modules glob parent package json gatsby starter docz netlifycms node modules webpack dev server node modules glob parent package json dependency hierarchy docz tgz root library gatsby theme docz tgz babel plugin export metadata tgz cli tgz chokidar no fsevents tgz x glob parent tgz vulnerable library glob parent tgz extract the non magic parent path from a glob string library home page a href path to dependency file gatsby starter docz netlifycms package json path to vulnerable library gatsby starter docz netlifycms node modules glob parent package json dependency hierarchy gatsby tgz root library chokidar tgz x glob parent tgz vulnerable library found in head commit a href found in base branch master vulnerability details this affects the package glob parent before the enclosure regex used to check for strings ending in enclosure containing path separator publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution glob parent step up your open source security game with whitesource
0
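For context on the glob-parent record above: the library's job is to strip glob "magic" from a pattern and return the literal parent directory. The sketch below shows that behavior conceptually only — it is not the library's actual implementation, and the CVE concerns a ReDoS-prone enclosure regex in versions before 5.1.2, not this core logic:

```typescript
// Conceptual sketch: keep leading path segments until the first segment
// containing a glob metacharacter, then join what is left.
function globParent(pattern: string): string {
  const magic = /[*?[\]{}()!]/; // simplified set of glob metacharacters
  const stable: string[] = [];
  for (const part of pattern.split("/")) {
    if (magic.test(part)) break;
    stable.push(part);
  }
  return stable.join("/") || ".";
}

console.log(globParent("src/**/*.js")); // "src"
console.log(globParent("*.js"));        // "."
```

The upgrade path in the record (glob-parent 5.1.2) replaces the vulnerable regex; consumers keep the same API shown here.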
12,005
14,738,162,537
IssuesEvent
2021-01-07 03:56:43
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
Billing Cycles Process - Improvements and Fixes
anc-process anp-not prioritized ant-enhancement
In GitLab by @kdjstudios on May 11, 2018, 12:40 Hello Team, This is going to be a list of all the concerns and improvements to ensure an easy-to-use and functional Billing cycle Process - Add staged fees to accounts that will get applied to the same billing cycle. (This is due to ops now having to create the next billing cycle after finalizing in order to create manual invoices) #890 - Add the ability to process a billing cycle with no usage file. This is due to the increased amount of billing cycles that do not need a usage file. #893 - Add the step between create next cycle and upload to allow for manual invoice creation without confusion. - Reverting/Deleting a cycle concerns - When performing a revert, Will this revert manually created invoices or just one created by the upload? - Is the delete functionality shown to what permission level? - Does this delete manually created invoices or just ones created by the upload?
1.0
Billing Cycles Process - Improvements and Fixes - In GitLab by @kdjstudios on May 11, 2018, 12:40 Hello Team, This is going to be a list of all the concerns and improvements to ensure an easy-to-use and functional Billing cycle Process - Add staged fees to accounts that will get applied to the same billing cycle. (This is due to ops now having to create the next billing cycle after finalizing in order to create manual invoices) #890 - Add the ability to process a billing cycle with no usage file. This is due to the increased amount of billing cycles that do not need a usage file. #893 - Add the step between create next cycle and upload to allow for manual invoice creation without confusion. - Reverting/Deleting a cycle concerns - When performing a revert, Will this revert manually created invoices or just one created by the upload? - Is the delete functionality shown to what permission level? - Does this delete manually created invoices or just ones created by the upload?
non_test
billing cycles process improvements and fixes in gitlab by kdjstudios on may hello team this is going to be a list of all the concerns and improvements to ensure an easy to use and functional billing cycle process add staged fees to accounts that will get applied to the same billing cycle this is due to ops now having to create the next billing cycle after finalizing in order to create manual invoices add the ability to process a billing cycle with no usage file this is due to the increased amount of billing cycles that do not need a usage file add the step between create next cycle and upload to allow for manual invoice creation without confusion reverting deleting a cycle concerns when performing a revert will this revert manually created invoices or just one created by the upload is the delete functionality shown to what permission level does this delete manually created invoices or just ones created by the upload
0
129,375
10,572,682,896
IssuesEvent
2019-10-07 10:07:24
kubernetes/kubernetes
https://api.github.com/repos/kubernetes/kubernetes
opened
node-kubelet-master tests fail during startup
kind/failing-test
**Which jobs are failing**: node-kubelet-master in master blocking https://testgrid.k8s.io/sig-release-master-blocking#node-kubelet-master **Which test(s) are failing**: test startup, the tests don't even get run **Since when has it been failing**: 10/02 **Testgrid link**: https://testgrid.k8s.io/sig-release-master-blocking#node-kubelet-master **Reason for failure**: ``` I1002 02:30:10.594] >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I1002 02:30:10.594] > START TEST > I1002 02:30:10.594] >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I1002 02:30:10.595] Start Test Suite on Host I1002 02:30:10.595] I1002 02:30:10.595] Failure Finished Test Suite on Host I1002 02:30:10.595] unable to create gce instance with running docker daemon for image cos-stable-60-9592-84-0. googleapi: Error 404: The resource 'projects/k8s-jkns-ci-node-e2e/zones/us-west1-b/instances/tmp-node-e2e-3039b3ed-cos-stable-60-9592-84-0' was not found, notFound ``` **Anything else we need to know**: /kind flake /priority important-soon /milestone v1.17 /sig node cc. @droslean @hasheddan @Verolop @epk
1.0
node-kubelet-master tests fail during startup - **Which jobs are failing**: node-kubelet-master in master blocking https://testgrid.k8s.io/sig-release-master-blocking#node-kubelet-master **Which test(s) are failing**: test startup, the tests don't even get run **Since when has it been failing**: 10/02 **Testgrid link**: https://testgrid.k8s.io/sig-release-master-blocking#node-kubelet-master **Reason for failure**: ``` I1002 02:30:10.594] >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I1002 02:30:10.594] > START TEST > I1002 02:30:10.594] >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I1002 02:30:10.595] Start Test Suite on Host I1002 02:30:10.595] I1002 02:30:10.595] Failure Finished Test Suite on Host I1002 02:30:10.595] unable to create gce instance with running docker daemon for image cos-stable-60-9592-84-0. googleapi: Error 404: The resource 'projects/k8s-jkns-ci-node-e2e/zones/us-west1-b/instances/tmp-node-e2e-3039b3ed-cos-stable-60-9592-84-0' was not found, notFound ``` **Anything else we need to know**: /kind flake /priority important-soon /milestone v1.17 /sig node cc. @droslean @hasheddan @Verolop @epk
test
node kubelet master tests fail during startup which jobs are failing node kubelet master in master blocking which test s are failing test startup the tests don t even get run since when has it been failing testgrid link reason for failure start test start test suite on host failure finished test suite on host unable to create gce instance with running docker daemon for image cos stable googleapi error the resource projects jkns ci node zones us b instances tmp node cos stable was not found notfound anything else we need to know kind flake priority important soon milestone sig node cc droslean hasheddan verolop epk
1
138,772
11,216,668,338
IssuesEvent
2020-01-07 07:08:18
chainer/chainer
https://api.github.com/repos/chainer/chainer
closed
ChainerX `test_Max` is flaky
cat:test prio:high
ERROR: type should be string, got "https://jenkins.preferred.jp/job/chainer/job/chainer_pr/2665/TEST=CHAINERX_chainerx-py3,label=mn1-p100/console\r\n\r\n```\r\n test_Max_param_98_{axis=(-2, -1), in_dtypes=('float16',), is_module=True, out_dtype='float16', shape=(4, 3)}[native:0] _\r\n14:51:04 \r\n14:51:04 device = native:0, args = (), kwargs = {}\r\n14:51:04 backend_config = <BackendConfig use_chainerx=True chainerx_device='native:0' use_cuda=False cuda_device=None use_cudnn='never' cudnn_deterministic=False autotune=False cudnn_fast_batch_normalization=False use_ideep='never'>\r\n14:51:04 obj = <chainer.testing._bundle.TestMax_param_98_{axis=(-2, -1), in_dtypes=('float16',), is_module=True, out_dtype='float16', shape=(4, 3)} object at 0x7f60b2c70400>\r\n14:51:04 \r\n14:51:04 @pytest.mark.parametrize_device(devices)\r\n14:51:04 def entry_func(device, *args, **kwargs):\r\n14:51:04 backend_config = _make_backend_config(device.name)\r\n14:51:04 \r\n14:51:04 # Forward test\r\n14:51:04 obj = cls()\r\n14:51:04 try:\r\n14:51:04 obj.setup(*args, **kwargs)\r\n14:51:04 obj.run_test_forward(backend_config)\r\n14:51:04 finally:\r\n14:51:04 obj.teardown()\r\n14:51:04 \r\n14:51:04 # If this is a NumpyOpTest instance, skip backward/double-backward\r\n14:51:04 # tests if the forward test succeeds with acceptable errors.\r\n14:51:04 if isinstance(obj, NumpyOpTest):\r\n14:51:04 if obj.is_forward_successful_with_accept_errors:\r\n14:51:04 return # success with expected errors\r\n14:51:04 \r\n14:51:04 # Backward test\r\n14:51:04 obj = cls()\r\n14:51:04 try:\r\n14:51:04 obj.setup(*args, **kwargs)\r\n14:51:04 > obj.run_test_backward(backend_config)\r\n14:51:04 \r\n14:51:04 args = ()\r\n14:51:04 backend_config = <BackendConfig use_chainerx=True chainerx_device='native:0' use_cuda=False cuda_device=None use_cudnn='never' cudnn_deterministic=False autotune=False cudnn_fast_batch_normalization=False use_ideep='never'>\r\n14:51:04 cls = <class 
'chainer.testing._bundle.TestMax_param_98_{axis=(-2, -1), in_dtypes=('float16',), is_module=True, out_dtype='float16', shape=(4, 3)}'>\r\n14:51:04 device = native:0\r\n14:51:04 kwargs = {}\r\n14:51:04 obj = <chainer.testing._bundle.TestMax_param_98_{axis=(-2, -1), in_dtypes=('float16',), is_module=True, out_dtype='float16', shape=(4, 3)} object at 0x7f60b2c70400>\r\n14:51:04 \r\n14:51:04 /repo/tests/chainerx_tests/op_utils.py:366: \r\n14:51:04 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ \r\n14:51:04 /repo/tests/chainerx_tests/op_utils.py:95: in run_test_backward\r\n14:51:04 super(OpTest, self).run_test_backward(backend_config)\r\n14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:260: in run_test_backward\r\n14:51:04 do_check()\r\n14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:255: in do_check\r\n14:51:04 **self.check_backward_options)\r\n14:51:04 /workspace/conda/envs/testenv/lib/python3.6/contextlib.py:99: in __exit__\r\n14:51:04 self.gen.throw(type, value, traceback)\r\n14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:36: in raise_if_fail\r\n14:51:04 cls.fail(message, e)\r\n14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:27: in fail\r\n14:51:04 utils._raise_from(cls, message, exc)\r\n14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/utils/__init__.py:106: in _raise_from\r\n14:51:04 six.raise_from(new_exc.with_traceback(orig_exc.__traceback__), None)\r\n14:51:04 <string>:3: in raise_from\r\n14:51:04 ???\r\n14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:34: in raise_if_fail\r\n14:51:04 yield\r\n14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:255: in do_check\r\n14:51:04 
**self.check_backward_options)\r\n14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:944: in check_backward\r\n14:51:04 detect_nondifferentiable, is_immutable_params=False\r\n14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:463: in run\r\n14:51:04 self._run()\r\n14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:506: in _run\r\n14:51:04 self._compare_gradients(gx_numeric, gx_backward, directions)\r\n14:51:04 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ \r\n14:51:04 \r\n14:51:04 self = <chainer.gradient_check._CheckBackward object at 0x7f60b2c70c88>\r\n14:51:04 gx_numeric = array(0.02866709, shape=(), dtype=float64, device='native:0')\r\n14:51:04 gx_backward = array(0.05733418, shape=(), dtype=float64, device='native:0')\r\n14:51:04 directions = (array([[-0.13341468, -0.37321123, -0.25195918],\r\n14:51:04 [0.05371198, -0.4639925 , 0.1588227 ],\r\n14:51:04 [0.59166147, 0.03545586, 0.03313082],\r\n14:51:04 [0.04581011, -0.2949229 , -0.3074665 ]], shape=(4, 3), dtype=float64, device='native:0'),)\r\n14:51:04 \r\n14:51:04 def _compare_gradients(self, gx_numeric, gx_backward, directions):\r\n14:51:04 atol = self.atol\r\n14:51:04 rtol = self.rtol\r\n14:51:04 # Compare the gradients\r\n14:51:04 try:\r\n14:51:04 testing.assert_allclose(\r\n14:51:04 gx_numeric, gx_backward, atol=atol, rtol=rtol)\r\n14:51:04 except AssertionError as e:\r\n14:51:04 eps = self.eps\r\n14:51:04 xs = self.xs\r\n14:51:04 gys = self.gys\r\n14:51:04 f = six.StringIO()\r\n14:51:04 f.write('check_backward failed (eps={} atol={} rtol={})\\n'.format(\r\n14:51:04 eps, atol, rtol))\r\n14:51:04 for i, x in enumerate(xs):\r\n14:51:04 f.write('inputs[{}]:\\n'.format(i))\r\n14:51:04 f.write('{}\\n'.format(x))\r\n14:51:04 for i, gy in enumerate(gys):\r\n14:51:04 f.write('grad_outputs[{}]:\\n'.format(i))\r\n14:51:04 
f.write('{}\\n'.format(gy))\r\n14:51:04 for i, d in enumerate(directions):\r\n14:51:04 f.write('directions[{}]:\\n'.format(i))\r\n14:51:04 f.write('{}\\n'.format(d))\r\n14:51:04 f.write('gradients (numeric): {}\\n'.format(gx_numeric))\r\n14:51:04 f.write('gradients (backward): {}\\n'.format(gx_backward))\r\n14:51:04 f.write('\\n')\r\n14:51:04 f.write('x: numeric gradient, y: backward gradient')\r\n14:51:04 f.write(str(e))\r\n14:51:04 > raise AssertionError(f.getvalue())\r\n14:51:04 E chainer.testing.function_link.FunctionTestError: backward is not implemented correctly\r\n14:51:04 E \r\n14:51:04 E (caused by)\r\n14:51:04 E AssertionError: check_backward failed (eps=0.001 atol=0.003 rtol=0.003)\r\n14:51:04 E inputs[0]:\r\n14:51:04 E array([[0.24450684, -0.29174805, 0.37353516],\r\n14:51:04 E [-0.91162109, -0.18688965, -0.21130371],\r\n14:51:04 E [-0.42041016, 0.97705078, 0.97705078],\r\n14:51:04 E [0.3083496 , 0.79785156, 0.07928467]], shape=(4, 3), dtype=float16, device='native:0')\r\n14:51:04 E grad_outputs[0]:\r\n14:51:04 E array(0.8359375, shape=(), dtype=float16, device='native:0')\r\n14:51:04 E directions[0]:\r\n14:51:04 E array([[-0.13341468, -0.37321123, -0.25195918],\r\n14:51:04 E [0.05371198, -0.4639925 , 0.1588227 ],\r\n14:51:04 E [0.59166147, 0.03545586, 0.03313082],\r\n14:51:04 E [0.04581011, -0.2949229 , -0.3074665 ]], shape=(4, 3), dtype=float64, device='native:0')\r\n14:51:04 E gradients (numeric): array(0.02866709, shape=(), dtype=float64, device='native:0')\r\n14:51:04 E gradients (backward): array(0.05733418, shape=(), dtype=float64, device='native:0')\r\n14:51:04 E \r\n14:51:04 E x: numeric gradient, y: backward gradient\r\n14:51:04 E Not equal to tolerance rtol=0.003, atol=0.003\r\n14:51:04 E \r\n14:51:04 E Mismatch: 100%\r\n14:51:04 E Max absolute difference: 0.02866709\r\n14:51:04 E Max relative difference: 0.5\r\n14:51:04 E x: array(0.028667)\r\n14:51:04 E y: array(0.057334)\r\n14:51:04 E \r\n14:51:04 E assert_allclose failed: \r\n14:51:04 E 
shape: () ()\r\n14:51:04 E dtype: float64 float64\r\n14:51:04 E i: (0,)\r\n14:51:04 E x[i]: 0.028667087781057466\r\n14:51:04 E y[i]: 0.05733417556222338\r\n14:51:04 E relative error[i]: 0.5000000000009458\r\n14:51:04 E absolute error[i]: 0.028667087781165914\r\n14:51:04 E relative tolerance * |y[i]|: 0.00017200252668667015\r\n14:51:04 E absolute tolerance: 0.003\r\n14:51:04 E total tolerance: 0.00317200252668667\r\n14:51:04 E x: 0.02866709\r\n14:51:04 E y: 0.05733418\r\n14:51:04 \r\n14:51:04 atol = 0.003\r\n14:51:04 d = array([[-0.13341468, -0.37321123, -0.25195918],\r\n14:51:04 [0.05371198, -0.4639925 , 0.1588227 ],\r\n14:51:04 [0.59166147, 0.03545586, 0.03313082],\r\n14:51:04 [0.04581011, -0.2949229 , -0.3074665 ]], shape=(4, 3), dtype=float64, device='native:0')\r\n14:51:04 directions = (array([[-0.13341468, -0.37321123, -0.25195918],\r\n14:51:04 [0.05371198, -0.4639925 , 0.1588227 ],\r\n14:51:04 [0.59166147, 0.03545586, 0.03313082],\r\n14:51:04 [0.04581011, -0.2949229 , -0.3074665 ]], shape=(4, 3), dtype=float64, device='native:0'),)\r\n14:51:04 eps = 0.001\r\n14:51:04 f = <_io.StringIO object at 0x7f6116c75af8>\r\n14:51:04 gx_backward = array(0.05733418, shape=(), dtype=float64, device='native:0')\r\n14:51:04 gx_numeric = array(0.02866709, shape=(), dtype=float64, device='native:0')\r\n14:51:04 gy = array(0.8359375, shape=(), dtype=float16, device='native:0')\r\n14:51:04 gys = (array(0.8359375, shape=(), dtype=float16, device='native:0'),)\r\n14:51:04 i = 0\r\n14:51:04 rtol = 0.003\r\n14:51:04 self = <chainer.gradient_check._CheckBackward object at 0x7f60b2c70c88>\r\n14:51:04 x = array([[0.24450684, -0.29174805, 0.37353516],\r\n14:51:04 [-0.91162109, -0.18688965, -0.21130371],\r\n14:51:04 [-0.42041016, 0.97705078, 0.97705078],\r\n14:51:04 [0.3083496 , 0.79785156, 0.07928467]], shape=(4, 3), dtype=float16, device='native:0')\r\n14:51:04 xs = (array([[0.24450684, -0.29174805, 0.37353516],\r\n14:51:04 [-0.91162109, -0.18688965, -0.21130371],\r\n14:51:04 
[-0.42041016, 0.97705078, 0.97705078],\r\n14:51:04 [0.3083496 , 0.79785156, 0.07928467]], shape=(4, 3), dtype=float16, device='native:0'),)\r\n14:51:04 \r\n14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:536: FunctionTestError\r\n```"
1.0
ChainerX `test_Max` is flaky - https://jenkins.preferred.jp/job/chainer/job/chainer_pr/2665/TEST=CHAINERX_chainerx-py3,label=mn1-p100/console ``` test_Max_param_98_{axis=(-2, -1), in_dtypes=('float16',), is_module=True, out_dtype='float16', shape=(4, 3)}[native:0] _ 14:51:04 14:51:04 device = native:0, args = (), kwargs = {} 14:51:04 backend_config = <BackendConfig use_chainerx=True chainerx_device='native:0' use_cuda=False cuda_device=None use_cudnn='never' cudnn_deterministic=False autotune=False cudnn_fast_batch_normalization=False use_ideep='never'> 14:51:04 obj = <chainer.testing._bundle.TestMax_param_98_{axis=(-2, -1), in_dtypes=('float16',), is_module=True, out_dtype='float16', shape=(4, 3)} object at 0x7f60b2c70400> 14:51:04 14:51:04 @pytest.mark.parametrize_device(devices) 14:51:04 def entry_func(device, *args, **kwargs): 14:51:04 backend_config = _make_backend_config(device.name) 14:51:04 14:51:04 # Forward test 14:51:04 obj = cls() 14:51:04 try: 14:51:04 obj.setup(*args, **kwargs) 14:51:04 obj.run_test_forward(backend_config) 14:51:04 finally: 14:51:04 obj.teardown() 14:51:04 14:51:04 # If this is a NumpyOpTest instance, skip backward/double-backward 14:51:04 # tests if the forward test succeeds with acceptable errors. 
14:51:04 if isinstance(obj, NumpyOpTest): 14:51:04 if obj.is_forward_successful_with_accept_errors: 14:51:04 return # success with expected errors 14:51:04 14:51:04 # Backward test 14:51:04 obj = cls() 14:51:04 try: 14:51:04 obj.setup(*args, **kwargs) 14:51:04 > obj.run_test_backward(backend_config) 14:51:04 14:51:04 args = () 14:51:04 backend_config = <BackendConfig use_chainerx=True chainerx_device='native:0' use_cuda=False cuda_device=None use_cudnn='never' cudnn_deterministic=False autotune=False cudnn_fast_batch_normalization=False use_ideep='never'> 14:51:04 cls = <class 'chainer.testing._bundle.TestMax_param_98_{axis=(-2, -1), in_dtypes=('float16',), is_module=True, out_dtype='float16', shape=(4, 3)}'> 14:51:04 device = native:0 14:51:04 kwargs = {} 14:51:04 obj = <chainer.testing._bundle.TestMax_param_98_{axis=(-2, -1), in_dtypes=('float16',), is_module=True, out_dtype='float16', shape=(4, 3)} object at 0x7f60b2c70400> 14:51:04 14:51:04 /repo/tests/chainerx_tests/op_utils.py:366: 14:51:04 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 14:51:04 /repo/tests/chainerx_tests/op_utils.py:95: in run_test_backward 14:51:04 super(OpTest, self).run_test_backward(backend_config) 14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:260: in run_test_backward 14:51:04 do_check() 14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:255: in do_check 14:51:04 **self.check_backward_options) 14:51:04 /workspace/conda/envs/testenv/lib/python3.6/contextlib.py:99: in __exit__ 14:51:04 self.gen.throw(type, value, traceback) 14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:36: in raise_if_fail 14:51:04 cls.fail(message, e) 14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:27: in fail 14:51:04 utils._raise_from(cls, message, exc) 14:51:04 
/workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/utils/__init__.py:106: in _raise_from 14:51:04 six.raise_from(new_exc.with_traceback(orig_exc.__traceback__), None) 14:51:04 <string>:3: in raise_from 14:51:04 ??? 14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:34: in raise_if_fail 14:51:04 yield 14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/testing/function_link.py:255: in do_check 14:51:04 **self.check_backward_options) 14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:944: in check_backward 14:51:04 detect_nondifferentiable, is_immutable_params=False 14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:463: in run 14:51:04 self._run() 14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:506: in _run 14:51:04 self._compare_gradients(gx_numeric, gx_backward, directions) 14:51:04 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 14:51:04 14:51:04 self = <chainer.gradient_check._CheckBackward object at 0x7f60b2c70c88> 14:51:04 gx_numeric = array(0.02866709, shape=(), dtype=float64, device='native:0') 14:51:04 gx_backward = array(0.05733418, shape=(), dtype=float64, device='native:0') 14:51:04 directions = (array([[-0.13341468, -0.37321123, -0.25195918], 14:51:04 [0.05371198, -0.4639925 , 0.1588227 ], 14:51:04 [0.59166147, 0.03545586, 0.03313082], 14:51:04 [0.04581011, -0.2949229 , -0.3074665 ]], shape=(4, 3), dtype=float64, device='native:0'),) 14:51:04 14:51:04 def _compare_gradients(self, gx_numeric, gx_backward, directions): 14:51:04 atol = self.atol 14:51:04 rtol = self.rtol 14:51:04 # Compare the gradients 14:51:04 try: 14:51:04 testing.assert_allclose( 14:51:04 gx_numeric, gx_backward, atol=atol, rtol=rtol) 14:51:04 except AssertionError as e: 14:51:04 eps = self.eps 14:51:04 xs = self.xs 14:51:04 gys = self.gys 
14:51:04 f = six.StringIO() 14:51:04 f.write('check_backward failed (eps={} atol={} rtol={})\n'.format( 14:51:04 eps, atol, rtol)) 14:51:04 for i, x in enumerate(xs): 14:51:04 f.write('inputs[{}]:\n'.format(i)) 14:51:04 f.write('{}\n'.format(x)) 14:51:04 for i, gy in enumerate(gys): 14:51:04 f.write('grad_outputs[{}]:\n'.format(i)) 14:51:04 f.write('{}\n'.format(gy)) 14:51:04 for i, d in enumerate(directions): 14:51:04 f.write('directions[{}]:\n'.format(i)) 14:51:04 f.write('{}\n'.format(d)) 14:51:04 f.write('gradients (numeric): {}\n'.format(gx_numeric)) 14:51:04 f.write('gradients (backward): {}\n'.format(gx_backward)) 14:51:04 f.write('\n') 14:51:04 f.write('x: numeric gradient, y: backward gradient') 14:51:04 f.write(str(e)) 14:51:04 > raise AssertionError(f.getvalue()) 14:51:04 E chainer.testing.function_link.FunctionTestError: backward is not implemented correctly 14:51:04 E 14:51:04 E (caused by) 14:51:04 E AssertionError: check_backward failed (eps=0.001 atol=0.003 rtol=0.003) 14:51:04 E inputs[0]: 14:51:04 E array([[0.24450684, -0.29174805, 0.37353516], 14:51:04 E [-0.91162109, -0.18688965, -0.21130371], 14:51:04 E [-0.42041016, 0.97705078, 0.97705078], 14:51:04 E [0.3083496 , 0.79785156, 0.07928467]], shape=(4, 3), dtype=float16, device='native:0') 14:51:04 E grad_outputs[0]: 14:51:04 E array(0.8359375, shape=(), dtype=float16, device='native:0') 14:51:04 E directions[0]: 14:51:04 E array([[-0.13341468, -0.37321123, -0.25195918], 14:51:04 E [0.05371198, -0.4639925 , 0.1588227 ], 14:51:04 E [0.59166147, 0.03545586, 0.03313082], 14:51:04 E [0.04581011, -0.2949229 , -0.3074665 ]], shape=(4, 3), dtype=float64, device='native:0') 14:51:04 E gradients (numeric): array(0.02866709, shape=(), dtype=float64, device='native:0') 14:51:04 E gradients (backward): array(0.05733418, shape=(), dtype=float64, device='native:0') 14:51:04 E 14:51:04 E x: numeric gradient, y: backward gradient 14:51:04 E Not equal to tolerance rtol=0.003, atol=0.003 14:51:04 E 14:51:04 E 
Mismatch: 100% 14:51:04 E Max absolute difference: 0.02866709 14:51:04 E Max relative difference: 0.5 14:51:04 E x: array(0.028667) 14:51:04 E y: array(0.057334) 14:51:04 E 14:51:04 E assert_allclose failed: 14:51:04 E shape: () () 14:51:04 E dtype: float64 float64 14:51:04 E i: (0,) 14:51:04 E x[i]: 0.028667087781057466 14:51:04 E y[i]: 0.05733417556222338 14:51:04 E relative error[i]: 0.5000000000009458 14:51:04 E absolute error[i]: 0.028667087781165914 14:51:04 E relative tolerance * |y[i]|: 0.00017200252668667015 14:51:04 E absolute tolerance: 0.003 14:51:04 E total tolerance: 0.00317200252668667 14:51:04 E x: 0.02866709 14:51:04 E y: 0.05733418 14:51:04 14:51:04 atol = 0.003 14:51:04 d = array([[-0.13341468, -0.37321123, -0.25195918], 14:51:04 [0.05371198, -0.4639925 , 0.1588227 ], 14:51:04 [0.59166147, 0.03545586, 0.03313082], 14:51:04 [0.04581011, -0.2949229 , -0.3074665 ]], shape=(4, 3), dtype=float64, device='native:0') 14:51:04 directions = (array([[-0.13341468, -0.37321123, -0.25195918], 14:51:04 [0.05371198, -0.4639925 , 0.1588227 ], 14:51:04 [0.59166147, 0.03545586, 0.03313082], 14:51:04 [0.04581011, -0.2949229 , -0.3074665 ]], shape=(4, 3), dtype=float64, device='native:0'),) 14:51:04 eps = 0.001 14:51:04 f = <_io.StringIO object at 0x7f6116c75af8> 14:51:04 gx_backward = array(0.05733418, shape=(), dtype=float64, device='native:0') 14:51:04 gx_numeric = array(0.02866709, shape=(), dtype=float64, device='native:0') 14:51:04 gy = array(0.8359375, shape=(), dtype=float16, device='native:0') 14:51:04 gys = (array(0.8359375, shape=(), dtype=float16, device='native:0'),) 14:51:04 i = 0 14:51:04 rtol = 0.003 14:51:04 self = <chainer.gradient_check._CheckBackward object at 0x7f60b2c70c88> 14:51:04 x = array([[0.24450684, -0.29174805, 0.37353516], 14:51:04 [-0.91162109, -0.18688965, -0.21130371], 14:51:04 [-0.42041016, 0.97705078, 0.97705078], 14:51:04 [0.3083496 , 0.79785156, 0.07928467]], shape=(4, 3), dtype=float16, device='native:0') 14:51:04 xs = 
(array([[0.24450684, -0.29174805, 0.37353516], 14:51:04 [-0.91162109, -0.18688965, -0.21130371], 14:51:04 [-0.42041016, 0.97705078, 0.97705078], 14:51:04 [0.3083496 , 0.79785156, 0.07928467]], shape=(4, 3), dtype=float16, device='native:0'),) 14:51:04 14:51:04 /workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:536: FunctionTestError ```
test
chainerx test max is flaky test max param axis in dtypes is module true out dtype shape device native args kwargs backend config obj pytest mark parametrize device devices def entry func device args kwargs backend config make backend config device name forward test obj cls try obj setup args kwargs obj run test forward backend config finally obj teardown if this is a numpyoptest instance skip backward double backward tests if the forward test succeeds with acceptable errors if isinstance obj numpyoptest if obj is forward successful with accept errors return success with expected errors backward test obj cls try obj setup args kwargs obj run test backward backend config args backend config cls device native kwargs obj repo tests chainerx tests op utils py repo tests chainerx tests op utils py in run test backward super optest self run test backward backend config workspace conda envs testenv lib site packages chainer testing function link py in run test backward do check workspace conda envs testenv lib site packages chainer testing function link py in do check self check backward options workspace conda envs testenv lib contextlib py in exit self gen throw type value traceback workspace conda envs testenv lib site packages chainer testing function link py in raise if fail cls fail message e workspace conda envs testenv lib site packages chainer testing function link py in fail utils raise from cls message exc workspace conda envs testenv lib site packages chainer utils init py in raise from six raise from new exc with traceback orig exc traceback none in raise from workspace conda envs testenv lib site packages chainer testing function link py in raise if fail yield workspace conda envs testenv lib site packages chainer testing function link py in do check self check backward options workspace conda envs testenv lib site packages chainer gradient check py in check backward detect nondifferentiable is immutable params false workspace conda envs testenv lib site 
packages chainer gradient check py in run self run workspace conda envs testenv lib site packages chainer gradient check py in run self compare gradients gx numeric gx backward directions self gx numeric array shape dtype device native gx backward array shape dtype device native directions array shape dtype device native def compare gradients self gx numeric gx backward directions atol self atol rtol self rtol compare the gradients try testing assert allclose gx numeric gx backward atol atol rtol rtol except assertionerror as e eps self eps xs self xs gys self gys f six stringio f write check backward failed eps atol rtol n format eps atol rtol for i x in enumerate xs f write inputs n format i f write n format x for i gy in enumerate gys f write grad outputs n format i f write n format gy for i d in enumerate directions f write directions n format i f write n format d f write gradients numeric n format gx numeric f write gradients backward n format gx backward f write n f write x numeric gradient y backward gradient f write str e raise assertionerror f getvalue e chainer testing function link functiontesterror backward is not implemented correctly e e caused by e assertionerror check backward failed eps atol rtol e inputs e array e e e shape dtype device native e grad outputs e array shape dtype device native e directions e array e e e shape dtype device native e gradients numeric array shape dtype device native e gradients backward array shape dtype device native e e x numeric gradient y backward gradient e not equal to tolerance rtol atol e e mismatch e max absolute difference e max relative difference e x array e y array e e assert allclose failed e shape e dtype e i e x e y e relative error e absolute error e relative tolerance y e absolute tolerance e total tolerance e x e y atol d array shape dtype device native directions array shape dtype device native eps f gx backward array shape dtype device native gx numeric array shape dtype device native gy array 
shape dtype device native gys array shape dtype device native i rtol self x array shape dtype device native xs array shape dtype device native workspace conda envs testenv lib site packages chainer gradient check py functiontesterror
1
511,330
14,858,340,978
IssuesEvent
2021-01-18 16:39:42
bounswe/bounswe2020group2
https://api.github.com/repos/bounswe/bounswe2020group2
opened
[ANDROID] Mail Verification & Password Change
effort: high priority: high type: android type: research
- [x] Research Firebase - [x] Send email to the mail address in these two cases: i. After he or she just registered to the system ii. The user tries to login to the system although the verification mail had sent but user does not still validate his or her email address - [x] Implement a mail verification fragment with its view model - [x] Design the layout of the mail verification fragment _**Deadline : 01.19.2021 @23.59**_
1.0
[ANDROID] Mail Verification & Password Change - - [x] Research Firebase - [x] Send email to the mail address in these two cases: i. After he or she just registered to the system ii. The user tries to login to the system although the verification mail had sent but user does not still validate his or her email address - [x] Implement a mail verification fragment with its view model - [x] Design the layout of the mail verification fragment _**Deadline : 01.19.2021 @23.59**_
non_test
mail verification password change research firebase send email to the mail address in these two cases i after he or she just registered to the system ii the user tries to login to the system although the verification mail had sent but user does not still validate his or her email address implement a mail verification fragment with its view model design the layout of the mail verification fragment deadline
0
89,556
15,830,791,618
IssuesEvent
2021-04-06 12:56:13
rsoreq/zenbot
https://api.github.com/repos/rsoreq/zenbot
reopened
WS-2019-0424 (Medium) detected in multiple libraries
security vulnerability
## WS-2019-0424 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>elliptic-6.4.0.tgz</b>, <b>elliptic-6.5.0.tgz</b>, <b>elliptic-6.5.2.tgz</b></p></summary> <p> <details><summary><b>elliptic-6.4.0.tgz</b></p></summary> <p>EC cryptography</p> <p>Library home page: <a href="https://registry.npmjs.org/elliptic/-/elliptic-6.4.0.tgz">https://registry.npmjs.org/elliptic/-/elliptic-6.4.0.tgz</a></p> <p>Path to dependency file: zenbot/package.json</p> <p>Path to vulnerable library: zenbot/node_modules/bitcore-lib/node_modules/elliptic/package.json</p> <p> Dependency Hierarchy: - adamant-api-0.5.3.tgz (Root Library) - bitcore-mnemonic-1.7.0.tgz - bitcore-lib-0.16.0.tgz - :x: **elliptic-6.4.0.tgz** (Vulnerable Library) </details> <details><summary><b>elliptic-6.5.0.tgz</b></p></summary> <p>EC cryptography</p> <p>Library home page: <a href="https://registry.npmjs.org/elliptic/-/elliptic-6.5.0.tgz">https://registry.npmjs.org/elliptic/-/elliptic-6.5.0.tgz</a></p> <p>Path to dependency file: zenbot/package.json</p> <p>Path to vulnerable library: zenbot/node_modules/elliptic/package.json</p> <p> Dependency Hierarchy: - adamant-api-0.5.3.tgz (Root Library) - web3-1.2.8.tgz - web3-utils-1.2.8.tgz - eth-lib-0.2.7.tgz - :x: **elliptic-6.5.0.tgz** (Vulnerable Library) </details> <details><summary><b>elliptic-6.5.2.tgz</b></p></summary> <p>EC cryptography</p> <p>Library home page: <a href="https://registry.npmjs.org/elliptic/-/elliptic-6.5.2.tgz">https://registry.npmjs.org/elliptic/-/elliptic-6.5.2.tgz</a></p> <p>Path to dependency file: zenbot/package.json</p> <p>Path to vulnerable library: zenbot/node_modules/secp256k1/node_modules/elliptic/package.json</p> <p> Dependency Hierarchy: - adamant-api-0.5.3.tgz (Root Library) - ethereumjs-util-6.2.0.tgz - secp256k1-3.8.0.tgz - :x: **elliptic-6.5.2.tgz** (Vulnerable Library) </details> <p>Found in HEAD 
commit: <a href="https://github.com/rsoreq/zenbot/commit/7a24c0d7b98ee76e6bac827974cff490a7694378">7a24c0d7b98ee76e6bac827974cff490a7694378</a></p> <p>Found in base branch: <b>unstable</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> all versions of elliptic are vulnerable to Timing Attack through side-channels. <p>Publish Date: 2019-11-13 <p>URL: <a href=https://github.com/indutny/elliptic/commit/ec735edde187a43693197f6fa3667ceade751a3a>WS-2019-0424</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Adjacent - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"elliptic","packageVersion":"6.4.0","isTransitiveDependency":true,"dependencyTree":"adamant-api:0.5.3;bitcore-mnemonic:1.7.0;bitcore-lib:0.16.0;elliptic:6.4.0","isMinimumFixVersionAvailable":false},{"packageType":"javascript/Node.js","packageName":"elliptic","packageVersion":"6.5.0","isTransitiveDependency":true,"dependencyTree":"adamant-api:0.5.3;web3:1.2.8;web3-utils:1.2.8;eth-lib:0.2.7;elliptic:6.5.0","isMinimumFixVersionAvailable":false},{"packageType":"javascript/Node.js","packageName":"elliptic","packageVersion":"6.5.2","isTransitiveDependency":true,"dependencyTree":"adamant-api:0.5.3;ethereumjs-util:6.2.0;secp256k1:3.8.0;elliptic:6.5.2","isMinimumFixVersionAvailable":false}],"vulnerabilityIdentifier":"WS-2019-0424","vulnerabilityDetails":"all versions of elliptic are vulnerable to Timing Attack through side-channels.","vulnerabilityUrl":"https://github.com/indutny/elliptic/commit/ec735edde187a43693197f6fa3667ceade751a3a","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Adjacent","I":"High"},"extraData":{}}</REMEDIATE> -->
True
WS-2019-0424 (Medium) detected in multiple libraries - ## WS-2019-0424 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>elliptic-6.4.0.tgz</b>, <b>elliptic-6.5.0.tgz</b>, <b>elliptic-6.5.2.tgz</b></p></summary> <p> <details><summary><b>elliptic-6.4.0.tgz</b></p></summary> <p>EC cryptography</p> <p>Library home page: <a href="https://registry.npmjs.org/elliptic/-/elliptic-6.4.0.tgz">https://registry.npmjs.org/elliptic/-/elliptic-6.4.0.tgz</a></p> <p>Path to dependency file: zenbot/package.json</p> <p>Path to vulnerable library: zenbot/node_modules/bitcore-lib/node_modules/elliptic/package.json</p> <p> Dependency Hierarchy: - adamant-api-0.5.3.tgz (Root Library) - bitcore-mnemonic-1.7.0.tgz - bitcore-lib-0.16.0.tgz - :x: **elliptic-6.4.0.tgz** (Vulnerable Library) </details> <details><summary><b>elliptic-6.5.0.tgz</b></p></summary> <p>EC cryptography</p> <p>Library home page: <a href="https://registry.npmjs.org/elliptic/-/elliptic-6.5.0.tgz">https://registry.npmjs.org/elliptic/-/elliptic-6.5.0.tgz</a></p> <p>Path to dependency file: zenbot/package.json</p> <p>Path to vulnerable library: zenbot/node_modules/elliptic/package.json</p> <p> Dependency Hierarchy: - adamant-api-0.5.3.tgz (Root Library) - web3-1.2.8.tgz - web3-utils-1.2.8.tgz - eth-lib-0.2.7.tgz - :x: **elliptic-6.5.0.tgz** (Vulnerable Library) </details> <details><summary><b>elliptic-6.5.2.tgz</b></p></summary> <p>EC cryptography</p> <p>Library home page: <a href="https://registry.npmjs.org/elliptic/-/elliptic-6.5.2.tgz">https://registry.npmjs.org/elliptic/-/elliptic-6.5.2.tgz</a></p> <p>Path to dependency file: zenbot/package.json</p> <p>Path to vulnerable library: zenbot/node_modules/secp256k1/node_modules/elliptic/package.json</p> <p> Dependency Hierarchy: - adamant-api-0.5.3.tgz (Root Library) - ethereumjs-util-6.2.0.tgz - secp256k1-3.8.0.tgz - :x: 
**elliptic-6.5.2.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/rsoreq/zenbot/commit/7a24c0d7b98ee76e6bac827974cff490a7694378">7a24c0d7b98ee76e6bac827974cff490a7694378</a></p> <p>Found in base branch: <b>unstable</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> all versions of elliptic are vulnerable to Timing Attack through side-channels. <p>Publish Date: 2019-11-13 <p>URL: <a href=https://github.com/indutny/elliptic/commit/ec735edde187a43693197f6fa3667ceade751a3a>WS-2019-0424</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Adjacent - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"elliptic","packageVersion":"6.4.0","isTransitiveDependency":true,"dependencyTree":"adamant-api:0.5.3;bitcore-mnemonic:1.7.0;bitcore-lib:0.16.0;elliptic:6.4.0","isMinimumFixVersionAvailable":false},{"packageType":"javascript/Node.js","packageName":"elliptic","packageVersion":"6.5.0","isTransitiveDependency":true,"dependencyTree":"adamant-api:0.5.3;web3:1.2.8;web3-utils:1.2.8;eth-lib:0.2.7;elliptic:6.5.0","isMinimumFixVersionAvailable":false},{"packageType":"javascript/Node.js","packageName":"elliptic","packageVersion":"6.5.2","isTransitiveDependency":true,"dependencyTree":"adamant-api:0.5.3;ethereumjs-util:6.2.0;secp256k1:3.8.0;elliptic:6.5.2","isMinimumFixVersionAvailable":false}],"vulnerabilityIdentifier":"WS-2019-0424","vulnerabilityDetails":"all versions of elliptic are vulnerable to Timing Attack through side-channels.","vulnerabilityUrl":"https://github.com/indutny/elliptic/commit/ec735edde187a43693197f6fa3667ceade751a3a","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Adjacent","I":"High"},"extraData":{}}</REMEDIATE> -->
non_test
ws medium detected in multiple libraries ws medium severity vulnerability vulnerable libraries elliptic tgz elliptic tgz elliptic tgz elliptic tgz ec cryptography library home page a href path to dependency file zenbot package json path to vulnerable library zenbot node modules bitcore lib node modules elliptic package json dependency hierarchy adamant api tgz root library bitcore mnemonic tgz bitcore lib tgz x elliptic tgz vulnerable library elliptic tgz ec cryptography library home page a href path to dependency file zenbot package json path to vulnerable library zenbot node modules elliptic package json dependency hierarchy adamant api tgz root library tgz utils tgz eth lib tgz x elliptic tgz vulnerable library elliptic tgz ec cryptography library home page a href path to dependency file zenbot package json path to vulnerable library zenbot node modules node modules elliptic package json dependency hierarchy adamant api tgz root library ethereumjs util tgz tgz x elliptic tgz vulnerable library found in head commit a href found in base branch unstable vulnerability details all versions of elliptic are vulnerable to timing attack through side channels publish date url a href cvss score details base score metrics exploitability metrics attack vector adjacent attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact high availability impact none for more information on scores click a href isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier ws vulnerabilitydetails all versions of elliptic are vulnerable to timing attack through side channels vulnerabilityurl
0
204,387
15,440,311,781
IssuesEvent
2021-03-08 02:58:24
lussierc/StockSwingPredictor
https://api.github.com/repos/lussierc/StockSwingPredictor
closed
Configure Testing Suite
feature testing
I need to take the time to configure my test suite for the project. By creating a test suite, I can ensure that my code functions as intended and that results generated by it are accurate & correct. As my tool is being implemented in Python I will use Pytest to test it. With this, there are a few things I need to do first to test my project. I need to: - [x] Implement the test suite configuration file, `conftest.py` - [x] Update the README with info about running the test suite - [x] Begin implementing test cases where possible (currently the scraping feature is the main one done, so I should begin by testing that).
1.0
Configure Testing Suite - I need to take the time to configure my test suite for the project. By creating a test suite, I can ensure that my code functions as intended and that results generated by it are accurate & correct. As my tool is being implemented in Python I will use Pytest to test it. With this, there are a few things I need to do first to test my project. I need to: - [x] Implement the test suite configuration file, `conftest.py` - [x] Update the README with info about running the test suite - [x] Begin implementing test cases where possible (currently the scraping feature is the main one done, so I should begin by testing that).
test
configure testing suite i need to take the time to configure my test suite for the project by creating a test suite i can ensure that my code functions as intended and that results generated by it are accurate correct as my tool is being implemented in python i will use pytest to test it with this there are a few things i need to do first to test my project i need to implement the test suite configuration file conftest py update the readme with info about running the test suite begin implementing test cases where possible currently the scraping feature is the main one done so i should begin by testing that
1
15,621
10,191,374,023
IssuesEvent
2019-08-12 08:14:03
spyder-ide/spyder
https://api.github.com/repos/spyder-ide/spyder
closed
Make changing the working directory via the toolbar work with local external Spyder kernels
component:IPython Console tag:Ux-usability type:Enhancement
<!--- **PLEASE READ:** When submitting here, please ensure you've completed the following checklist and checked the boxes to confirm. Issue reports without it may be closed. Thanks! ---> ## Problem Description Opened by the request of @ccordoba12 It would be very nice to make changing the working directory in the Spyder toolbar work with external local consoles. Currently, the UX is rather confusing since the toolbar updates with the actual working directory when it is changed in the console, and it allows me to appear to "change" this directory via typing it in or the file browser, but it doesn't change the the console's actual working directory and goes back to showing the actual working dir when running a file, changing tabs. ### What steps reproduce the problem? 1. Open an external kernel with ``spyder-kernels`` and connect to it with Spyder 2. Change the working directory via the file browser or direct entry on the toolbar 3. Check the actual working directory in the console ### What is the expected output? What do you see instead? Ideally, it would be nice for changing it via the GUI to work (I'm not sure why it couldn't), but if not then we should disable changing it there so as to not confuse/frustrate the user. Considering I can simply type ``%cd path/to/dir`` into the console in Spyder and it works, and Spyder can successfully read it from the console, I would think there should be some straightforward way to set it from the Spyder GUI, even just sending silent %cd commands to the console at worst.
True
Make changing the working directory via the toolbar work with local external Spyder kernels - <!--- **PLEASE READ:** When submitting here, please ensure you've completed the following checklist and checked the boxes to confirm. Issue reports without it may be closed. Thanks! ---> ## Problem Description Opened by the request of @ccordoba12 It would be very nice to make changing the working directory in the Spyder toolbar work with external local consoles. Currently, the UX is rather confusing since the toolbar updates with the actual working directory when it is changed in the console, and it allows me to appear to "change" this directory via typing it in or the file browser, but it doesn't change the the console's actual working directory and goes back to showing the actual working dir when running a file, changing tabs. ### What steps reproduce the problem? 1. Open an external kernel with ``spyder-kernels`` and connect to it with Spyder 2. Change the working directory via the file browser or direct entry on the toolbar 3. Check the actual working directory in the console ### What is the expected output? What do you see instead? Ideally, it would be nice for changing it via the GUI to work (I'm not sure why it couldn't), but if not then we should disable changing it there so as to not confuse/frustrate the user. Considering I can simply type ``%cd path/to/dir`` into the console in Spyder and it works, and Spyder can successfully read it from the console, I would think there should be some straightforward way to set it from the Spyder GUI, even just sending silent %cd commands to the console at worst.
non_test
make changing the working directory via the toolbar work with local external spyder kernels problem description opened by the request of it would be very nice to make changing the working directory in the spyder toolbar work with external local consoles currently the ux is rather confusing since the toolbar updates with the actual working directory when it is changed in the console and it allows me to appear to change this directory via typing it in or the file browser but it doesn t change the the console s actual working directory and goes back to showing the actual working dir when running a file changing tabs what steps reproduce the problem open an external kernel with spyder kernels and connect to it with spyder change the working directory via the file browser or direct entry on the toolbar check the actual working directory in the console what is the expected output what do you see instead ideally it would be nice for changing it via the gui to work i m not sure why it couldn t but if not then we should disable changing it there so as to not confuse frustrate the user considering i can simply type cd path to dir into the console in spyder and it works and spyder can successfully read it from the console i would think there should be some straightforward way to set it from the spyder gui even just sending silent cd commands to the console at worst
0
504,826
14,622,535,120
IssuesEvent
2020-12-23 00:33:34
BenJeau/react-native-draw
https://api.github.com/repos/BenJeau/react-native-draw
closed
Allow way to simplify SVG paths
enhancement priority: med
This feature would be inside the `BrushProperties` component, to toggle if its on or off. Paper.js has that functionnality (http://paperjs.org/reference/path/#simplify), but seems unecessary to import the whole package for one simple function.
1.0
Allow way to simplify SVG paths - This feature would be inside the `BrushProperties` component, to toggle if its on or off. Paper.js has that functionnality (http://paperjs.org/reference/path/#simplify), but seems unecessary to import the whole package for one simple function.
non_test
allow way to simplify svg paths this feature would be inside the brushproperties component to toggle if its on or off paper js has that functionnality but seems unecessary to import the whole package for one simple function
0
16,659
3,548,938,060
IssuesEvent
2016-01-20 16:14:04
dfernandezm/download-utils
https://api.github.com/repos/dfernandezm/download-utils
closed
Change layout of search table to be the same as status pages
front-end in-test story
The new layout is much more responsive and gives the same information. Reuse directives.
1.0
Change layout of search table to be the same as status pages - The new layout is much more responsive and gives the same information. Reuse directives.
test
change layout of search table to be the same as status pages the new layout is much more responsive and gives the same information reuse directives
1
303,246
22,960,654,540
IssuesEvent
2022-07-19 15:06:55
cal-itp/benefits
https://api.github.com/repos/cal-itp/benefits
opened
diagram text doesn't show up on docs in dark mode
bug documentation front-end
## To Reproduce Steps to reproduce the behavior: 1. Go to https://docs.calitp.org/benefits/#enrollment-process 2. Toggle to dark mode 3. See the diagram text ## Expected behavior A clear and concise description of what you expected to happen. ## Screenshots Light mode: <img width="690" alt="Light mode, where diagram text is visible" src="https://user-images.githubusercontent.com/86842/179783175-e3f0bedf-2fa7-4c6f-b617-24bcd18f0650.png"> Dark mode: <img width="695" alt="Dark mode, where diagram text isn't visible" src="https://user-images.githubusercontent.com/86842/179783181-a2500b98-ee51-4ab8-957a-beec530679ec.png"> Seems the issue is the text is the same color as the background: <img width="966" alt="Screenshot of diagram with the browser inspector open, showing the text color as grey" src="https://user-images.githubusercontent.com/86842/179783643-377153b6-688a-4af4-a115-39f367896d7d.png"> ## Desktop - OS: macOS 12.4 - Browser: Firefox - Version: 102.0.1 ## Additional context Interestingly, the [theme documentation](https://squidfunk.github.io/mkdocs-material/reference/diagrams/#usage) shows diagrams fine in light and dark mode.
1.0
diagram text doesn't show up on docs in dark mode - ## To Reproduce Steps to reproduce the behavior: 1. Go to https://docs.calitp.org/benefits/#enrollment-process 2. Toggle to dark mode 3. See the diagram text ## Expected behavior A clear and concise description of what you expected to happen. ## Screenshots Light mode: <img width="690" alt="Light mode, where diagram text is visible" src="https://user-images.githubusercontent.com/86842/179783175-e3f0bedf-2fa7-4c6f-b617-24bcd18f0650.png"> Dark mode: <img width="695" alt="Dark mode, where diagram text isn't visible" src="https://user-images.githubusercontent.com/86842/179783181-a2500b98-ee51-4ab8-957a-beec530679ec.png"> Seems the issue is the text is the same color as the background: <img width="966" alt="Screenshot of diagram with the browser inspector open, showing the text color as grey" src="https://user-images.githubusercontent.com/86842/179783643-377153b6-688a-4af4-a115-39f367896d7d.png"> ## Desktop - OS: macOS 12.4 - Browser: Firefox - Version: 102.0.1 ## Additional context Interestingly, the [theme documentation](https://squidfunk.github.io/mkdocs-material/reference/diagrams/#usage) shows diagrams fine in light and dark mode.
non_test
diagram text doesn t show up on docs in dark mode to reproduce steps to reproduce the behavior go to toggle to dark mode see the diagram text expected behavior a clear and concise description of what you expected to happen screenshots light mode img width alt light mode where diagram text is visible src dark mode img width alt dark mode where diagram text isn t visible src seems the issue is the text is the same color as the background img width alt screenshot of diagram with the browser inspector open showing the text color as grey src desktop os macos browser firefox version additional context interestingly the shows diagrams fine in light and dark mode
0
227,269
18,054,253,867
IssuesEvent
2021-09-20 05:21:07
logicmoo/logicmoo_workspace
https://api.github.com/repos/logicmoo/logicmoo_workspace
opened
logicmoo.pfc.test.sanity_base.RETRY_INHERITANCE_07 JUnit
Test_9999 logicmoo.pfc.test.sanity_base unit_test RETRY_INHERITANCE_07 Passing
(cd /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base ; timeout --foreground --preserve-status -s SIGKILL -k 10s 10s swipl -x /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-clif retry_inheritance_07.pfc) % ISSUE: https://github.com/logicmoo/logicmoo_workspace/issues/ % EDIT: https://github.com/logicmoo/logicmoo_workspace/edit/master/packs_sys/pfc/t/sanity_base/retry_inheritance_07.pfc % JENKINS: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/RETRY_INHERITANCE_07/logicmoo_pfc_test_sanity_base_RETRY_INHERITANCE_07_JUnit/ % ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ARETRY_INHERITANCE_07 ``` %~ init_phase(after_load) %~ init_phase(restore_state) % running('/var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/retry_inheritance_07.pfc'), %~ this_test_might_need( :-( use_module( library(logicmoo_plarkc)))) :- if((pfc_test_feature(localMt,X=1),X==1)). :- endif. :- if((pfc_test_feature(mt,X=1),X==1)). :- endif. :- set_prolog_flag(retry_undefined, kb_shared). :- \+ a. % ISSUE: https://github.com/logicmoo/logicmoo_workspace/issues/ % EDIT: https://github.com/logicmoo/logicmoo_workspace/edit/master/packs_sys/pfc/t/sanity_base/retry_inheritance_07.pfc % JENKINS: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/RETRY_INHERITANCE_07/logicmoo_pfc_test_sanity_base_RETRY_INHERITANCE_07_JUnit/ % ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ARETRY_INHERITANCE_07 %~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/retry_inheritance_07.pfc:20 %~ mpred_test("Test_0001_Line_0000__naf_A",baseKB:(\+a)) %~ make_dynamic_here(baseKB,a) /*~ %~ mpred_test("Test_0001_Line_0000__naf_A",baseKB:(\+a)) passed=info(why_was_true(baseKB:(\+a))) no_proof_for(\+a). no_proof_for(\+a). no_proof_for(\+a). 
name = 'logicmoo.pfc.test.sanity_base.RETRY_INHERITANCE_07-Test_0001_Line_0000__naf_A'. JUNIT_CLASSNAME = 'logicmoo.pfc.test.sanity_base.RETRY_INHERITANCE_07'. JUNIT_CMD = 'timeout --foreground --preserve-status -s SIGKILL -k 10s 10s swipl -x /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-clif retry_inheritance_07.pfc'. % saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-pfc-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.RETRY_INHERITANCE_07-Test_0001_Line_0000__naf_A-junit.xml ~*/ %~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/retry_inheritance_07.pfc:27 %~ unused(no_junit_results) Test_0001_Line_0000__naf_A result = passed. %~ test_completed_exit(64) ``` totalTime=1.000 SUCCESS: /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-junit-minor -k retry_inheritance_07.pfc (returned 64) Add_LABELS='' Rem_LABELS='Skipped,Errors,Warnings,Overtime,Skipped,Skipped'
3.0
logicmoo.pfc.test.sanity_base.RETRY_INHERITANCE_07 JUnit - (cd /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base ; timeout --foreground --preserve-status -s SIGKILL -k 10s 10s swipl -x /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-clif retry_inheritance_07.pfc) % ISSUE: https://github.com/logicmoo/logicmoo_workspace/issues/ % EDIT: https://github.com/logicmoo/logicmoo_workspace/edit/master/packs_sys/pfc/t/sanity_base/retry_inheritance_07.pfc % JENKINS: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/RETRY_INHERITANCE_07/logicmoo_pfc_test_sanity_base_RETRY_INHERITANCE_07_JUnit/ % ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ARETRY_INHERITANCE_07 ``` %~ init_phase(after_load) %~ init_phase(restore_state) % running('/var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/retry_inheritance_07.pfc'), %~ this_test_might_need( :-( use_module( library(logicmoo_plarkc)))) :- if((pfc_test_feature(localMt,X=1),X==1)). :- endif. :- if((pfc_test_feature(mt,X=1),X==1)). :- endif. :- set_prolog_flag(retry_undefined, kb_shared). :- \+ a. % ISSUE: https://github.com/logicmoo/logicmoo_workspace/issues/ % EDIT: https://github.com/logicmoo/logicmoo_workspace/edit/master/packs_sys/pfc/t/sanity_base/retry_inheritance_07.pfc % JENKINS: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/RETRY_INHERITANCE_07/logicmoo_pfc_test_sanity_base_RETRY_INHERITANCE_07_JUnit/ % ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3ARETRY_INHERITANCE_07 %~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/retry_inheritance_07.pfc:20 %~ mpred_test("Test_0001_Line_0000__naf_A",baseKB:(\+a)) %~ make_dynamic_here(baseKB,a) /*~ %~ mpred_test("Test_0001_Line_0000__naf_A",baseKB:(\+a)) passed=info(why_was_true(baseKB:(\+a))) no_proof_for(\+a). 
no_proof_for(\+a). no_proof_for(\+a). name = 'logicmoo.pfc.test.sanity_base.RETRY_INHERITANCE_07-Test_0001_Line_0000__naf_A'. JUNIT_CLASSNAME = 'logicmoo.pfc.test.sanity_base.RETRY_INHERITANCE_07'. JUNIT_CMD = 'timeout --foreground --preserve-status -s SIGKILL -k 10s 10s swipl -x /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-clif retry_inheritance_07.pfc'. % saving_junit: /var/lib/jenkins/workspace/logicmoo_workspace/test_results/jenkins/Report-logicmoo-pfc-test-sanity_base-vSTARv0vSTARvvDOTvvSTARv-Units-logicmoo.pfc.test.sanity_base.RETRY_INHERITANCE_07-Test_0001_Line_0000__naf_A-junit.xml ~*/ %~ /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/retry_inheritance_07.pfc:27 %~ unused(no_junit_results) Test_0001_Line_0000__naf_A result = passed. %~ test_completed_exit(64) ``` totalTime=1.000 SUCCESS: /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-junit-minor -k retry_inheritance_07.pfc (returned 64) Add_LABELS='' Rem_LABELS='Skipped,Errors,Warnings,Overtime,Skipped,Skipped'
test
logicmoo pfc test sanity base retry inheritance junit cd var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base timeout foreground preserve status s sigkill k swipl x var lib jenkins workspace logicmoo workspace bin lmoo clif retry inheritance pfc issue edit jenkins issue search init phase after load init phase restore state running var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base retry inheritance pfc this test might need use module library logicmoo plarkc if pfc test feature localmt x x endif if pfc test feature mt x x endif set prolog flag retry undefined kb shared a issue edit jenkins issue search var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base retry inheritance pfc mpred test test line naf a basekb a make dynamic here basekb a mpred test test line naf a basekb a passed info why was true basekb a no proof for a no proof for a no proof for a name logicmoo pfc test sanity base retry inheritance test line naf a junit classname logicmoo pfc test sanity base retry inheritance junit cmd timeout foreground preserve status s sigkill k swipl x var lib jenkins workspace logicmoo workspace bin lmoo clif retry inheritance pfc saving junit var lib jenkins workspace logicmoo workspace test results jenkins report logicmoo pfc test sanity base units logicmoo pfc test sanity base retry inheritance test line naf a junit xml var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base retry inheritance pfc unused no junit results test line naf a result passed test completed exit totaltime success var lib jenkins workspace logicmoo workspace bin lmoo junit minor k retry inheritance pfc returned add labels rem labels skipped errors warnings overtime skipped skipped
1
215,303
16,663,494,611
IssuesEvent
2021-06-06 19:06:19
summercms/sc-security-module
https://api.github.com/repos/summercms/sc-security-module
closed
Backend dashboard to handle missing operating system entries
Dashboard FINSIHED Firewall Priority: High Testing - Passed bug 🐛
### Description: - [x] Backend dashboard to handle missing operating system entries.
1.0
Backend dashboard to handle missing operating system entries - ### Description: - [x] Backend dashboard to handle missing operating system entries.
test
backend dashboard to handle missing operating system entries description backend dashboard to handle missing operating system entries
1
98,034
12,287,253,936
IssuesEvent
2020-05-09 11:17:36
reach4help/reach4help
https://api.github.com/repos/reach4help/reach4help
reopened
UI of CAV Flow View #4: Map
UI Design
Since the CAV UX is ready, we are able to start building the UI of CAV View "Timeline". Referred by the attached image or this [link](https://www.figma.com/file/OiU7Tl4k1YFfqZQXxLaRrl/Reach4Help?node-id=1055%3A235) ![Screen Shot 2020-04-22 at 8.57.56 PM.png](https://images.zenhubusercontent.com/5e9f44c44d7b4947cc6b5e0a/13b5f872-2146-4753-9e71-42ed8fe82253) <!--zenhub info: do not edit anything after this line, it will be automatically changed--> -------- ### [ZenHub Information](https://app.zenhub.com/workspaces/reach4help-5e8dcbfb14ac087f410cbabb/issues/reach4help/reach4help/460) *This information is updated automatically. To modify it, please use ZenHub.* **Belonging to Epics:** * **[EPIC]** [#459 - CAV Flow View #4: Map](https://github.com/reach4help/reach4help/issues/459) <!--zenhub info end-->
1.0
UI of CAV Flow View #4: Map - Since the CAV UX is ready, we are able to start building the UI of CAV View "Timeline". Referred by the attached image or this [link](https://www.figma.com/file/OiU7Tl4k1YFfqZQXxLaRrl/Reach4Help?node-id=1055%3A235) ![Screen Shot 2020-04-22 at 8.57.56 PM.png](https://images.zenhubusercontent.com/5e9f44c44d7b4947cc6b5e0a/13b5f872-2146-4753-9e71-42ed8fe82253) <!--zenhub info: do not edit anything after this line, it will be automatically changed--> -------- ### [ZenHub Information](https://app.zenhub.com/workspaces/reach4help-5e8dcbfb14ac087f410cbabb/issues/reach4help/reach4help/460) *This information is updated automatically. To modify it, please use ZenHub.* **Belonging to Epics:** * **[EPIC]** [#459 - CAV Flow View #4: Map](https://github.com/reach4help/reach4help/issues/459) <!--zenhub info end-->
non_test
ui of cav flow view map since the cav ux is ready we are able to start building the ui of cav view timeline referred by the attached image or this this information is updated automatically to modify it please use zenhub belonging to epics
0
178,451
13,780,338,729
IssuesEvent
2020-10-08 14:48:55
hashgraph/hedera-services
https://api.github.com/repos/hashgraph/hedera-services
closed
Create new test methods to cover payer records to state
Test Development enhancement
**Summary** In `0.7.0` each `MerkleAccount` has a new `FCQueue` with the records of all transactions it paid for in the last 3min of consensus time, **including duplicates**. (In particular, these records are used to classify duplicates.) Existing tests will not cover the system's management of duplicate (or unclassifiable) records.
1.0
Create new test methods to cover payer records to state - **Summary** In `0.7.0` each `MerkleAccount` has a new `FCQueue` with the records of all transactions it paid for in the last 3min of consensus time, **including duplicates**. (In particular, these records are used to classify duplicates.) Existing tests will not cover the system's management of duplicate (or unclassifiable) records.
test
create new test methods to cover payer records to state summary in each merkleaccount has a new fcqueue with the records of all transactions it paid for in the last of consensus time including duplicates in particular these records are used to classify duplicates existing tests will not cover the system s management of duplicate or unclassifiable records
1
51,691
7,723,572,261
IssuesEvent
2018-05-24 12:52:31
PowerShell/xPSDesiredStateConfiguration
https://api.github.com/repos/PowerShell/xPSDesiredStateConfiguration
closed
xPSDesiredStateConfiguration: Add Codecov badges for both master and dev branch
documentation in progress
I suggest we should add a badges for both branches and a [section Branches](https://github.com/PowerShell/SqlServerDsc#branches).
1.0
xPSDesiredStateConfiguration: Add Codecov badges for both master and dev branch - I suggest we should add a badges for both branches and a [section Branches](https://github.com/PowerShell/SqlServerDsc#branches).
non_test
xpsdesiredstateconfiguration add codecov badges for both master and dev branch i suggest we should add a badges for both branches and a
0
267,506
23,304,306,244
IssuesEvent
2022-08-07 19:49:41
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
opened
roachtest: loqrecovery/workload=tpcc/rangeSize=16mb failed
C-test-failure O-robot O-roachtest branch-master
roachtest.loqrecovery/workload=tpcc/rangeSize=16mb [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/5997534?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/5997534?buildTab=artifacts#/loqrecovery/workload=tpcc/rangeSize=16mb) on master @ [a7c91f06d8ee0fa2096bcd626f689009024947bb](https://github.com/cockroachdb/cockroach/commits/a7c91f06d8ee0fa2096bcd626f689009024947bb): ``` test artifacts and logs in: /artifacts/loqrecovery/workload=tpcc/rangeSize=16mb/run_1 test_runner.go:1027,test_runner.go:926: test timed out (0s) ``` <p>Parameters: <code>ROACHTEST_cloud=gce</code> , <code>ROACHTEST_cpu=4</code> , <code>ROACHTEST_ssd=0</code> </p> <details><summary>Help</summary> <p> See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md) See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7) </p> </details> /cc @cockroachdb/kv-triage <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*loqrecovery/workload=tpcc/rangeSize=16mb.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub>
2.0
roachtest: loqrecovery/workload=tpcc/rangeSize=16mb failed - roachtest.loqrecovery/workload=tpcc/rangeSize=16mb [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/5997534?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/5997534?buildTab=artifacts#/loqrecovery/workload=tpcc/rangeSize=16mb) on master @ [a7c91f06d8ee0fa2096bcd626f689009024947bb](https://github.com/cockroachdb/cockroach/commits/a7c91f06d8ee0fa2096bcd626f689009024947bb): ``` test artifacts and logs in: /artifacts/loqrecovery/workload=tpcc/rangeSize=16mb/run_1 test_runner.go:1027,test_runner.go:926: test timed out (0s) ``` <p>Parameters: <code>ROACHTEST_cloud=gce</code> , <code>ROACHTEST_cpu=4</code> , <code>ROACHTEST_ssd=0</code> </p> <details><summary>Help</summary> <p> See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md) See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7) </p> </details> /cc @cockroachdb/kv-triage <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*loqrecovery/workload=tpcc/rangeSize=16mb.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub>
test
roachtest loqrecovery workload tpcc rangesize failed roachtest loqrecovery workload tpcc rangesize with on master test artifacts and logs in artifacts loqrecovery workload tpcc rangesize run test runner go test runner go test timed out parameters roachtest cloud gce roachtest cpu roachtest ssd help see see cc cockroachdb kv triage
1
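The roachtest row above reports a failure line of the form `test_runner.go:1027,test_runner.go:926: test timed out (0s)`. As a hedged sketch (a helper of my own, not part of roachtest tooling), the reported timeout can be pulled out of such a line with a small regex:

```python
import re

# Hypothetical helper (mine, not part of roachtest): extract the
# reported timeout from a failure line like the one in the row above.
TIMEOUT_RE = re.compile(r"test timed out \((\d+)s\)")

def parse_timeout(log_line):
    """Return the timeout in seconds if the line reports a timeout, else None."""
    m = TIMEOUT_RE.search(log_line)
    return int(m.group(1)) if m else None

line = "test_runner.go:1027,test_runner.go:926: test timed out (0s)"
print(parse_timeout(line))  # -> 0
```

A `0s` result, as in this row, suggests the timeout was reported without a usable duration rather than an actual zero-second budget.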
196,190
14,839,515,563
IssuesEvent
2021-01-16 01:04:34
backend-br/vagas
https://api.github.com/repos/backend-br/vagas
closed
[100% REMOTE] Sr. Java Developer @Mazzatech
AWS Alocado CLT DevOps Docker Especialista Java Kubernetes Linux NodeJS Remoto Rest SOAP SQL Spring Stale Testes Unitários
## Our company The professional will be hired by Mazzatech, an IT consultancy with more than 10 years in this market, to work on-site at a large investment bank located in São Paulo. ## Job description You will join the IT Digital team, which sits within the System Development/Digital pillar and is responsible for developing the Client Portal (Internet Banking), a tool aimed at this bank's clients whose goal is to provide information about their investments and transactions, digital onboarding, and access to products and services. The team is also responsible for other client communication channels, such as the Institutional Portal, the B2B Administrative Area, and mobile solutions. Activities related to the position: - Work alongside the Development team, which uses technologies such as Java, EJB, REST/SOAP WebServices, Node.js, Docker, Kubernetes, AWS. - Work with a microservices-oriented architecture and a DevOps model. - Develop and expand the product pipeline of BTG's Client Portal and solutions for the independent-agents platform. - Work on projects aimed at increasing and adding flexibility to the fixed-income products available on the Digital platform. - Reduce the back-office team's operational workload while increasing financial inflows. ## Location The company is in the Faria Lima area, São Paulo. After the pandemic, it is up to the professional to return to the office or remain 100% remote. ## Requirements Technical skills: Java Spring Boot SQL REST/SOAP Nice to have: Unit testing (JUnit) Jenkins Kubernetes Amazon AWS experience Experience with microservices architecture NodeJS Basic Linux ## Benefits - Health plan - Life insurance - BRL 500.00 meal voucher per month - Dental care - Profit sharing (PLR) ## Hiring CLT regime ## How to apply Please send an email to [email protected] with your CV attached - subject line: Vaga Java Sr.
## Average feedback time We usually send feedback within 5 days after each stage. Contact e-mail in case of no reply: [email protected] ## Labels #### Allocation - Remote #### Regime - CLT #### Level - Senior - Specialist
1.0
[100% REMOTE] Sr. Java Developer @Mazzatech - ## Our company The professional will be hired by Mazzatech, an IT consultancy with more than 10 years in this market, to work on-site at a large investment bank located in São Paulo. ## Job description You will join the IT Digital team, which sits within the System Development/Digital pillar and is responsible for developing the Client Portal (Internet Banking), a tool aimed at this bank's clients whose goal is to provide information about their investments and transactions, digital onboarding, and access to products and services. The team is also responsible for other client communication channels, such as the Institutional Portal, the B2B Administrative Area, and mobile solutions. Activities related to the position: - Work alongside the Development team, which uses technologies such as Java, EJB, REST/SOAP WebServices, Node.js, Docker, Kubernetes, AWS. - Work with a microservices-oriented architecture and a DevOps model. - Develop and expand the product pipeline of BTG's Client Portal and solutions for the independent-agents platform. - Work on projects aimed at increasing and adding flexibility to the fixed-income products available on the Digital platform. - Reduce the back-office team's operational workload while increasing financial inflows. ## Location The company is in the Faria Lima area, São Paulo. After the pandemic, it is up to the professional to return to the office or remain 100% remote.
## Requirements Technical skills: Java Spring Boot SQL REST/SOAP Nice to have: Unit testing (JUnit) Jenkins Kubernetes Amazon AWS experience Experience with microservices architecture NodeJS Basic Linux ## Benefits - Health plan - Life insurance - BRL 500.00 meal voucher per month - Dental care - Profit sharing (PLR) ## Hiring CLT regime ## How to apply Please send an email to [email protected] with your CV attached - subject line: Vaga Java Sr. ## Average feedback time We usually send feedback within 5 days after each stage. Contact e-mail in case of no reply: [email protected] ## Labels #### Allocation - Remote #### Regime - CLT #### Level - Senior - Specialist
test
sr java developer mazzatech nossa empresa profissional será contratado pela mazzatech uma consultoria de ti com mais de anos de atuação neste mercado para atuar alocado em grande banco de investimentos situado em são paulo descrição da vaga atuará na equipe it digital que está inserida dentro do pilar de system development digital e é responsável por desenvolver o portal de clientes internet banking que é uma ferramenta voltada para os clientes deste banco e tem como objetivo disponibilizar informações sobre seus investimentos transações onboarding digital e acesso a produtos e serviços também é responsável por outros canais de comunicação com os clientes como o portal institucional área administrativa e soluções mobile atividades relacionadas à vaga atuação junto à equipe de desenvolvimento que utiliza em seu trabalho algumas tecnologias como java ejb webservices rest soap node js docker kubernetes aws trabalhar com aquitetura orientada a microsserviços e modelo devops desenvolver e expandir do pipeline de produtos do portal de clientes do btg e soluções para a plataforma de agentes autônomos atuar nos projetos que consistem em aumentar e flexibilizar produtos de renda fixa disponíveis na plataforma do digital diminuir o trabalho operacional do time de back office e aumentando a captação financeira local empresa fica na região da faria lima são paulo fica a critério do profissional voltar para o escritório após a pandemia ou continuar remoto requisitos conhecimentos técnicos java spring boot sql rest soap conhecimentos desejáveis testes unitários junit jenkins kubernetes experiência amazon aws experiência com arquitetura de microsserviços nodejs linux básico benefícios plano de saúde seguro de vida vr ao mês assistência odontológica plr contratação regime clt como se candidatar por favor envie um email para taiane mazza tech com seu cv anexado enviar no assunto vaga java sr tempo médio de feedbacks costumamos enviar feedbacks em até dias após cada processo e mail 
para contato em caso de não haver resposta taiane mazza tech labels alocação remoto regime clt nível sênior especialista
1
313,927
26,962,811,480
IssuesEvent
2023-02-08 19:36:39
void-linux/void-packages
https://api.github.com/repos/void-linux/void-packages
opened
fuse2fs not included in e2fsprogs
bug needs-testing
### Is this a new report? Yes ### System Info Void 6.1.9_1 x86_64 GenuineIntel uptodate rrrrmmnFFFFFF ### Package(s) Affected e2fsprogs-1.46.5_1 ### Does a report exist for this bug with the project's home (upstream) and/or another distro? It is included in other distros but not Void's: https://github.com/tytso/e2fsprogs/blob/master/misc/fuse2fs.c ### Expected behaviour It would include fuse2fs, which is required for apptainer. ### Actual behaviour Not included. ### Steps to reproduce 1. Install 2. `which fuse2fs` -> not present
1.0
fuse2fs not included in e2fsprogs - ### Is this a new report? Yes ### System Info Void 6.1.9_1 x86_64 GenuineIntel uptodate rrrrmmnFFFFFF ### Package(s) Affected e2fsprogs-1.46.5_1 ### Does a report exist for this bug with the project's home (upstream) and/or another distro? It is included in other distros but not Void's: https://github.com/tytso/e2fsprogs/blob/master/misc/fuse2fs.c ### Expected behaviour It would include fuse2fs, which is required for apptainer. ### Actual behaviour Not included. ### Steps to reproduce 1. Install 2. `which fuse2fs` -> not present
test
not included in is this a new report yes system info void genuineintel uptodate rrrrmmnffffff package s affected does a report exist for this bug with the project s home upstream and or another distro it is included in other distros but not void s expected behaviour it would include which is required for apptainer actual behaviour not included steps to reproduce install which not present
1
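The reproduction steps in the fuse2fs row above use `which fuse2fs` to confirm the binary is absent. A minimal sketch of the PATH lookup `which` performs (the helper name is mine, and the demo runs against a temporary directory so the result does not depend on the host system):

```python
import os
import tempfile

def which(cmd, path_dirs):
    """Minimal sketch of what `which` does: return the first executable
    file named `cmd` found in the given directories, or None."""
    for d in path_dirs:
        candidate = os.path.join(d, cmd)
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            return candidate
    return None

with tempfile.TemporaryDirectory() as d:
    # fuse2fs is absent, mirroring the report
    print(which("fuse2fs", [d]))  # -> None
    # create a fake executable to show the positive case
    fake = os.path.join(d, "fuse2fs")
    with open(fake, "w") as f:
        f.write("#!/bin/sh\n")
    os.chmod(fake, 0o755)
    print(which("fuse2fs", [d]) == fake)  # -> True
```

Python's standard library also ships `shutil.which`, which does the same search over the real `PATH`.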
12,090
14,740,070,008
IssuesEvent
2021-01-07 08:27:51
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
Sarasota - SA Billing - Late Fee Account List
anc-process anp-important ant-bug has attachment
In GitLab by @kdjstudios on Oct 3, 2018, 11:05 [Sarasota.xlsx](/uploads/66633f8d0277b1657e305d94194fa157/Sarasota.xlsx) HD: http://www.servicedesk.answernet.com/profiles/ticket/2018-10-03-63235/conversation
1.0
Sarasota - SA Billing - Late Fee Account List - In GitLab by @kdjstudios on Oct 3, 2018, 11:05 [Sarasota.xlsx](/uploads/66633f8d0277b1657e305d94194fa157/Sarasota.xlsx) HD: http://www.servicedesk.answernet.com/profiles/ticket/2018-10-03-63235/conversation
non_test
sarasota sa billing late fee account list in gitlab by kdjstudios on oct uploads sarasota xlsx hd
0
551,057
16,136,819,307
IssuesEvent
2021-04-29 12:54:51
Heyimlulu/AutoInsuranceProject
https://api.github.com/repos/Heyimlulu/AutoInsuranceProject
closed
ERROR SpringApplication Application run failed
High priority bug
# On pull request #11 (Add coverage table) ## Error logs ``` 21-04-29 Thu 14:41:43.658 ERROR SpringApplication Application run failed org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'jpaAuditingHandler': Cannot resolve reference to bean 'jpaMappingContext' while setting constructor argument; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'jpaMappingContext': Invocation of init method failed; nested exception is java.lang.IllegalStateException: Failed to asynchronously initialize native EntityManagerFactory: java.lang.NoClassDefFoundError: com/fasterxml/classmate/TypeResolver at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:342) at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:113) at org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:690) at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:196) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1358) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1204) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:557) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:517) at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:323) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:226) at 
org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:321) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202) at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:893) at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:879) at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:551) at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:143) at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:758) at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:750) at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397) at org.springframework.boot.SpringApplication.run(SpringApplication.java:315) at org.springframework.boot.SpringApplication.run(SpringApplication.java:1237) at org.springframework.boot.SpringApplication.run(SpringApplication.java:1226) at com.namai.assurance.AssuranceApplication.main(AssuranceApplication.java:12) Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'jpaMappingContext': Invocation of init method failed; nested exception is java.lang.IllegalStateException: Failed to asynchronously initialize native EntityManagerFactory: java.lang.NoClassDefFoundError: com/fasterxml/classmate/TypeResolver at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1796) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:595) at 
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:517) at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:323) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:226) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:321) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202) at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:330) ... 22 common frames omitted Caused by: java.lang.IllegalStateException: Failed to asynchronously initialize native EntityManagerFactory: java.lang.NoClassDefFoundError: com/fasterxml/classmate/TypeResolver at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.getNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:553) at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.invokeProxyMethod(AbstractEntityManagerFactoryBean.java:497) at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean$ManagedEntityManagerFactoryInvocationHandler.invoke(AbstractEntityManagerFactoryBean.java:680) at com.sun.proxy.$Proxy72.getMetamodel(Unknown Source) at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) at java.base/java.util.Iterator.forEachRemaining(Iterator.java:133) at java.base/java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484) at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474) at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913) at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) at 
java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578) at org.springframework.data.jpa.repository.config.JpaMetamodelMappingContextFactoryBean.getMetamodels(JpaMetamodelMappingContextFactoryBean.java:106) at org.springframework.data.jpa.repository.config.JpaMetamodelMappingContextFactoryBean.createInstance(JpaMetamodelMappingContextFactoryBean.java:80) at org.springframework.data.jpa.repository.config.JpaMetamodelMappingContextFactoryBean.createInstance(JpaMetamodelMappingContextFactoryBean.java:44) at org.springframework.beans.factory.config.AbstractFactoryBean.afterPropertiesSet(AbstractFactoryBean.java:142) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1855) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1792) ... 29 common frames omitted Caused by: java.lang.NoClassDefFoundError: com/fasterxml/classmate/TypeResolver at org.hibernate.boot.internal.ClassmateContext.<init>(ClassmateContext.java:16) at org.hibernate.boot.internal.BootstrapContextImpl.<init>(BootstrapContextImpl.java:84) at org.hibernate.boot.internal.MetadataBuilderImpl.<init>(MetadataBuilderImpl.java:123) at org.hibernate.boot.MetadataSources.getMetadataBuilder(MetadataSources.java:141) at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.<init>(EntityManagerFactoryBuilderImpl.java:238) at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.<init>(EntityManagerFactoryBuilderImpl.java:168) at org.springframework.orm.jpa.vendor.SpringHibernateJpaPersistenceProvider.createContainerEntityManagerFactory(SpringHibernateJpaPersistenceProvider.java:52) at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:365) at 
org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.buildNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:391) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) at java.base/java.lang.Thread.run(Thread.java:832) Caused by: java.lang.ClassNotFoundException: com.fasterxml.classmate.TypeResolver at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:606) at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:168) at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522) ... 13 common frames omitted ```
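The chained trace above bottoms out in a `ClassNotFoundException` for `com.fasterxml.classmate.TypeResolver`, i.e. the classmate jar (a transitive dependency of Hibernate) is likely missing from the classpath. As a hedged sketch using only the standard library (the helper is mine, not project code), the actionable root cause of such a log is the deepest `Caused by:` entry:

```python
# Hypothetical helper: the deepest "Caused by:" line in a chained Java
# stack trace is usually the actionable root cause.
def root_cause(log):
    causes = [ln.strip() for ln in log.splitlines()
              if ln.strip().startswith("Caused by:")]
    return causes[-1] if causes else None

log = """org.springframework.beans.factory.BeanCreationException: Error creating bean
Caused by: java.lang.NoClassDefFoundError: com/fasterxml/classmate/TypeResolver
Caused by: java.lang.ClassNotFoundException: com.fasterxml.classmate.TypeResolver"""
print(root_cause(log))
# -> Caused by: java.lang.ClassNotFoundException: com.fasterxml.classmate.TypeResolver
```

Under that reading, the fix in this issue would be ensuring the classmate dependency is present at runtime rather than changing any Spring configuration.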
1.0
ERROR SpringApplication Application run failed - # On pull request #11 (Add coverage table) ## Error logs ``` 21-04-29 Thu 14:41:43.658 ERROR SpringApplication Application run failed org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'jpaAuditingHandler': Cannot resolve reference to bean 'jpaMappingContext' while setting constructor argument; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'jpaMappingContext': Invocation of init method failed; nested exception is java.lang.IllegalStateException: Failed to asynchronously initialize native EntityManagerFactory: java.lang.NoClassDefFoundError: com/fasterxml/classmate/TypeResolver at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:342) at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:113) at org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:690) at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:196) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1358) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1204) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:557) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:517) at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:323) at 
org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:226) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:321) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202) at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:893) at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:879) at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:551) at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:143) at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:758) at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:750) at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397) at org.springframework.boot.SpringApplication.run(SpringApplication.java:315) at org.springframework.boot.SpringApplication.run(SpringApplication.java:1237) at org.springframework.boot.SpringApplication.run(SpringApplication.java:1226) at com.namai.assurance.AssuranceApplication.main(AssuranceApplication.java:12) Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'jpaMappingContext': Invocation of init method failed; nested exception is java.lang.IllegalStateException: Failed to asynchronously initialize native EntityManagerFactory: java.lang.NoClassDefFoundError: com/fasterxml/classmate/TypeResolver at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1796) at 
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:595) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:517) at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:323) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:226) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:321) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202) at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:330) ... 22 common frames omitted Caused by: java.lang.IllegalStateException: Failed to asynchronously initialize native EntityManagerFactory: java.lang.NoClassDefFoundError: com/fasterxml/classmate/TypeResolver at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.getNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:553) at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.invokeProxyMethod(AbstractEntityManagerFactoryBean.java:497) at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean$ManagedEntityManagerFactoryInvocationHandler.invoke(AbstractEntityManagerFactoryBean.java:680) at com.sun.proxy.$Proxy72.getMetamodel(Unknown Source) at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) at java.base/java.util.Iterator.forEachRemaining(Iterator.java:133) at java.base/java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484) at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474) at 
java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913) at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578) at org.springframework.data.jpa.repository.config.JpaMetamodelMappingContextFactoryBean.getMetamodels(JpaMetamodelMappingContextFactoryBean.java:106) at org.springframework.data.jpa.repository.config.JpaMetamodelMappingContextFactoryBean.createInstance(JpaMetamodelMappingContextFactoryBean.java:80) at org.springframework.data.jpa.repository.config.JpaMetamodelMappingContextFactoryBean.createInstance(JpaMetamodelMappingContextFactoryBean.java:44) at org.springframework.beans.factory.config.AbstractFactoryBean.afterPropertiesSet(AbstractFactoryBean.java:142) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1855) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1792) ... 
29 common frames omitted Caused by: java.lang.NoClassDefFoundError: com/fasterxml/classmate/TypeResolver at org.hibernate.boot.internal.ClassmateContext.<init>(ClassmateContext.java:16) at org.hibernate.boot.internal.BootstrapContextImpl.<init>(BootstrapContextImpl.java:84) at org.hibernate.boot.internal.MetadataBuilderImpl.<init>(MetadataBuilderImpl.java:123) at org.hibernate.boot.MetadataSources.getMetadataBuilder(MetadataSources.java:141) at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.<init>(EntityManagerFactoryBuilderImpl.java:238) at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.<init>(EntityManagerFactoryBuilderImpl.java:168) at org.springframework.orm.jpa.vendor.SpringHibernateJpaPersistenceProvider.createContainerEntityManagerFactory(SpringHibernateJpaPersistenceProvider.java:52) at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:365) at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.buildNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:391) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) at java.base/java.lang.Thread.run(Thread.java:832) Caused by: java.lang.ClassNotFoundException: com.fasterxml.classmate.TypeResolver at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:606) at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:168) at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522) ... 13 common frames omitted ```
non_test
error springapplication application run failed on pull request add coverage table error logs thu error springapplication application run failed org springframework beans factory beancreationexception error creating bean with name jpaauditinghandler cannot resolve reference to bean jpamappingcontext while setting constructor argument nested exception is org springframework beans factory beancreationexception error creating bean with name jpamappingcontext invocation of init method failed nested exception is java lang illegalstateexception failed to asynchronously initialize native entitymanagerfactory java lang noclassdeffounderror com fasterxml classmate typeresolver at org springframework beans factory support beandefinitionvalueresolver resolvereference beandefinitionvalueresolver java at org springframework beans factory support beandefinitionvalueresolver resolvevalueifnecessary beandefinitionvalueresolver java at org springframework beans factory support constructorresolver resolveconstructorarguments constructorresolver java at org springframework beans factory support constructorresolver autowireconstructor constructorresolver java at org springframework beans factory support abstractautowirecapablebeanfactory autowireconstructor abstractautowirecapablebeanfactory java at org springframework beans factory support abstractautowirecapablebeanfactory createbeaninstance abstractautowirecapablebeanfactory java at org springframework beans factory support abstractautowirecapablebeanfactory docreatebean abstractautowirecapablebeanfactory java at org springframework beans factory support abstractautowirecapablebeanfactory createbean abstractautowirecapablebeanfactory java at org springframework beans factory support abstractbeanfactory lambda dogetbean abstractbeanfactory java at org springframework beans factory support defaultsingletonbeanregistry getsingleton defaultsingletonbeanregistry java at org springframework beans factory support abstractbeanfactory 
dogetbean abstractbeanfactory java at org springframework beans factory support abstractbeanfactory getbean abstractbeanfactory java at org springframework beans factory support defaultlistablebeanfactory preinstantiatesingletons defaultlistablebeanfactory java at org springframework context support abstractapplicationcontext finishbeanfactoryinitialization abstractapplicationcontext java at org springframework context support abstractapplicationcontext refresh abstractapplicationcontext java at org springframework boot web servlet context servletwebserverapplicationcontext refresh servletwebserverapplicationcontext java at org springframework boot springapplication refresh springapplication java at org springframework boot springapplication refresh springapplication java at org springframework boot springapplication refreshcontext springapplication java at org springframework boot springapplication run springapplication java at org springframework boot springapplication run springapplication java at org springframework boot springapplication run springapplication java at com namai assurance assuranceapplication main assuranceapplication java caused by org springframework beans factory beancreationexception error creating bean with name jpamappingcontext invocation of init method failed nested exception is java lang illegalstateexception failed to asynchronously initialize native entitymanagerfactory java lang noclassdeffounderror com fasterxml classmate typeresolver at org springframework beans factory support abstractautowirecapablebeanfactory initializebean abstractautowirecapablebeanfactory java at org springframework beans factory support abstractautowirecapablebeanfactory docreatebean abstractautowirecapablebeanfactory java at org springframework beans factory support abstractautowirecapablebeanfactory createbean abstractautowirecapablebeanfactory java at org springframework beans factory support abstractbeanfactory lambda dogetbean abstractbeanfactory java 
at org springframework beans factory support defaultsingletonbeanregistry getsingleton defaultsingletonbeanregistry java at org springframework beans factory support abstractbeanfactory dogetbean abstractbeanfactory java at org springframework beans factory support abstractbeanfactory getbean abstractbeanfactory java at org springframework beans factory support beandefinitionvalueresolver resolvereference beandefinitionvalueresolver java common frames omitted caused by java lang illegalstateexception failed to asynchronously initialize native entitymanagerfactory java lang noclassdeffounderror com fasterxml classmate typeresolver at org springframework orm jpa abstractentitymanagerfactorybean getnativeentitymanagerfactory abstractentitymanagerfactorybean java at org springframework orm jpa abstractentitymanagerfactorybean invokeproxymethod abstractentitymanagerfactorybean java at org springframework orm jpa abstractentitymanagerfactorybean managedentitymanagerfactoryinvocationhandler invoke abstractentitymanagerfactorybean java at com sun proxy getmetamodel unknown source at java base java util stream referencepipeline accept referencepipeline java at java base java util iterator foreachremaining iterator java at java base java util spliterators iteratorspliterator foreachremaining spliterators java at java base java util stream abstractpipeline copyinto abstractpipeline java at java base java util stream abstractpipeline wrapandcopyinto abstractpipeline java at java base java util stream reduceops reduceop evaluatesequential reduceops java at java base java util stream abstractpipeline evaluate abstractpipeline java at java base java util stream referencepipeline collect referencepipeline java at org springframework data jpa repository config jpametamodelmappingcontextfactorybean getmetamodels jpametamodelmappingcontextfactorybean java at org springframework data jpa repository config jpametamodelmappingcontextfactorybean createinstance 
jpametamodelmappingcontextfactorybean java at org springframework data jpa repository config jpametamodelmappingcontextfactorybean createinstance jpametamodelmappingcontextfactorybean java at org springframework beans factory config abstractfactorybean afterpropertiesset abstractfactorybean java at org springframework beans factory support abstractautowirecapablebeanfactory invokeinitmethods abstractautowirecapablebeanfactory java at org springframework beans factory support abstractautowirecapablebeanfactory initializebean abstractautowirecapablebeanfactory java common frames omitted caused by java lang noclassdeffounderror com fasterxml classmate typeresolver at org hibernate boot internal classmatecontext classmatecontext java at org hibernate boot internal bootstrapcontextimpl bootstrapcontextimpl java at org hibernate boot internal metadatabuilderimpl metadatabuilderimpl java at org hibernate boot metadatasources getmetadatabuilder metadatasources java at org hibernate jpa boot internal entitymanagerfactorybuilderimpl entitymanagerfactorybuilderimpl java at org hibernate jpa boot internal entitymanagerfactorybuilderimpl entitymanagerfactorybuilderimpl java at org springframework orm jpa vendor springhibernatejpapersistenceprovider createcontainerentitymanagerfactory springhibernatejpapersistenceprovider java at org springframework orm jpa localcontainerentitymanagerfactorybean createnativeentitymanagerfactory localcontainerentitymanagerfactorybean java at org springframework orm jpa abstractentitymanagerfactorybean buildnativeentitymanagerfactory abstractentitymanagerfactorybean java at java base java util concurrent futuretask run futuretask java at java base java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java base java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java base java lang thread run thread java caused by java lang classnotfoundexception com fasterxml classmate typeresolver at java base jdk 
internal loader builtinclassloader loadclass builtinclassloader java at java base jdk internal loader classloaders appclassloader loadclass classloaders java at java base java lang classloader loadclass classloader java common frames omitted
0
67,738
21,096,879,119
IssuesEvent
2022-04-04 11:10:04
vector-im/element-web
https://api.github.com/repos/vector-im/element-web
opened
Joined room shows preview instead of timeline
T-Defect
### Steps to reproduce 1. Create a private and encrypted room 2. Send a message to it 3. Open develop in a new tab ### Outcome #### What did you expect? See room #### What happened instead? See "Room can't be previewed. Do you want to join it? Join the descussion" dialog ![Screenshot from 2022-04-04 12-06-44](https://user-images.githubusercontent.com/51663/161531415-0db617b0-c19f-44a4-aed6-b1f30d13c125.png) ### Operating system _No response_ ### Browser information Chromium 98.0.4758.102 (Official Build) Arch Linux (64-bit) ### URL for webapp develop.element.io ### Application version Element version: b1a60b25b4c8-react-13a51654e782-js-71b7521f4223 Olm version: 3.2.8 ### Homeserver matrix.org ### Will you send logs? Yes
1.0
Joined room shows preview instead of timeline - ### Steps to reproduce 1. Create a private and encrypted room 2. Send a message to it 3. Open develop in a new tab ### Outcome #### What did you expect? See room #### What happened instead? See "Room can't be previewed. Do you want to join it? Join the descussion" dialog ![Screenshot from 2022-04-04 12-06-44](https://user-images.githubusercontent.com/51663/161531415-0db617b0-c19f-44a4-aed6-b1f30d13c125.png) ### Operating system _No response_ ### Browser information Chromium 98.0.4758.102 (Official Build) Arch Linux (64-bit) ### URL for webapp develop.element.io ### Application version Element version: b1a60b25b4c8-react-13a51654e782-js-71b7521f4223 Olm version: 3.2.8 ### Homeserver matrix.org ### Will you send logs? Yes
non_test
joined room shows preview instead of timeline steps to reproduce create a private and encrypted room send a message to it open develop in a new tab outcome what did you expect see room what happened instead see room can t be previewed do you want to join it join the descussion dialog operating system no response browser information chromium official build arch linux bit url for webapp develop element io application version element version react js olm version homeserver matrix org will you send logs yes
0
254,171
21,767,044,825
IssuesEvent
2022-05-13 04:01:01
TheRenegadeCoder/sample-programs
https://api.github.com/repos/TheRenegadeCoder/sample-programs
closed
Add Longest Word Testing
enhancement tests
To request tests for a new project, please fill out the following: Project name: Longest Word Project link: https://sampleprograms.io/projects/longest-word/ > The project link must be a link to a project on the [Sample Programs Website project list][sample-programs-project-list]. > If you would like to add a new project to the Sample Programs Website, > please first make a pull request to the [Sample Programs Website][sample-programs-website]. [sample-programs-website]: https://github.com/TheRenegadeCoder/sample-programs-website [sample-programs-project-list]: https://sampleprograms.io/projects/
1.0
Add Longest Word Testing - To request tests for a new project, please fill out the following: Project name: Longest Word Project link: https://sampleprograms.io/projects/longest-word/ > The project link must be a link to a project on the [Sample Programs Website project list][sample-programs-project-list]. > If you would like to add a new project to the Sample Programs Website, > please first make a pull request to the [Sample Programs Website][sample-programs-website]. [sample-programs-website]: https://github.com/TheRenegadeCoder/sample-programs-website [sample-programs-project-list]: https://sampleprograms.io/projects/
test
add longest word testing to request tests for a new project please fill out the following project name longest word project link the project link must be a link to a project on the if you would like to add a new project to the sample programs website please first make a pull request to the
1
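For illustration only — the record above merely requests tests for the "Longest Word" sample project — a minimal Python sketch of the longest-word task. The function name and the tie-breaking rule (first longest word wins) are assumptions, not taken from the project itself:

```python
def longest_word(sentence: str) -> str:
    """Return the first longest whitespace-separated word in a sentence.

    Assumed behavior: ties are broken by first occurrence, and an
    empty input yields an empty string.
    """
    words = sentence.split()
    if not words:
        return ""
    # max() with key=len keeps the first word of maximal length on ties.
    return max(words, key=len)


print(longest_word("May the force be with you"))  # -> force
```

A test suite for the project would pin down exactly these edge cases (empty input, ties) before implementations in many languages are compared.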
12,853
2,722,937,608
IssuesEvent
2015-04-14 08:59:09
BlackCodec/tint2
https://api.github.com/repos/BlackCodec/tint2
closed
Battery remaining shows garbage
auto-migrated Component-Battery Priority-Medium Type-Defect
``` What steps will reproduce the problem? It can be reproduced on a laptop. What is the expected output? What do you see instead? Negatives numbers in % (or numbers over 100) and wrong time left (sometimes even negative time) https://dl.dropbox.com/u/71236259/2012-11-23-185407_300x61_scrot.png https://dl.dropbox.com/u/71236259/2012-11-23-185433_320x55_scrot.png I can't manage to find any logical explanation for these numbers. Most of the time they're random. What version of the product are you using? On what operating system? Crunchbang waldorf $ tint2 --version tint2 version 0.11-svn ``` Original issue reported on code.google.com by `[email protected]` on 23 Nov 2012 at 5:18
1.0
Battery remaining shows garbage - ``` What steps will reproduce the problem? It can be reproduced on a laptop. What is the expected output? What do you see instead? Negatives numbers in % (or numbers over 100) and wrong time left (sometimes even negative time) https://dl.dropbox.com/u/71236259/2012-11-23-185407_300x61_scrot.png https://dl.dropbox.com/u/71236259/2012-11-23-185433_320x55_scrot.png I can't manage to find any logical explanation for these numbers. Most of the time they're random. What version of the product are you using? On what operating system? Crunchbang waldorf $ tint2 --version tint2 version 0.11-svn ``` Original issue reported on code.google.com by `[email protected]` on 23 Nov 2012 at 5:18
non_test
battery remaining shows garbage what steps will reproduce the problem it can be reproduced on a laptop what is the expected output what do you see instead negatives numbers in or numbers over and wrong time left sometimes even negative time i can t manage to find any logical explanation for these numbers most of the time they re random what version of the product are you using on what operating system crunchbang waldorf version version svn original issue reported on code google com by sysa gmail com on nov at
0
15,373
5,108,711,356
IssuesEvent
2017-01-05 18:32:46
BlackSourceLabs/BlackNectar-Service
https://api.github.com/repos/BlackSourceLabs/BlackNectar-Service
closed
Update SQL to use POSTGIS Functions
code enhancement
Right now distances are calculated in a primitive way doing trigonometric calculations. Now that we have access to a Geo-Spatial Database courtesy BlackSourceLabs/BlackNectar-Service#13 & BlackSourceLabs/BlackNectar-Service#12, we can use `ST_DWithin` and `ST_Distance` to calculate and order stores by distance.
1.0
Update SQL to use POSTGIS Functions - Right now distances are calculated in a primitive way doing trigonometric calculations. Now that we have access to a Geo-Spatial Database courtesy BlackSourceLabs/BlackNectar-Service#13 & BlackSourceLabs/BlackNectar-Service#12, we can use `ST_DWithin` and `ST_Distance` to calculate and order stores by distance.
non_test
update sql to use postgis functions right now distances are calculated in a primitive way doing trigonometric calculations now that we have access to a geo spatial database courtesy blacksourcelabs blacknectar service blacksourcelabs blacknectar service we can use st dwithin and st distance to calculate and order stores by distance
0
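The change described in the record above — ordering stores by distance with PostGIS instead of hand-rolled trigonometry — can be sketched roughly as follows. The table and column names (`stores`, `location`) are assumptions; `ST_DWithin` and `ST_Distance` over `geography` values measure in meters, and `ST_DWithin` can use a spatial index while `ST_Distance` orders the survivors:

```python
def nearby_stores_query(radius_meters: float) -> str:
    """Build a PostGIS query for a hypothetical schema stores(name, location geography).

    ST_DWithin filters to stores within the radius (index-friendly),
    then ST_Distance orders the remaining rows by actual distance.
    The %s placeholders are longitude/latitude bind parameters.
    """
    return (
        "SELECT name, ST_Distance(location, ST_MakePoint(%s, %s)::geography) AS meters "
        "FROM stores "
        "WHERE ST_DWithin(location, ST_MakePoint(%s, %s)::geography, {r}) "
        "ORDER BY meters"
    ).format(r=radius_meters)


print("ST_DWithin" in nearby_stores_query(5000.0))  # -> True
```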
11,965
3,244,467,169
IssuesEvent
2015-10-16 02:28:33
IngSoft2CMS/CMS
https://api.github.com/repos/IngSoft2CMS/CMS
reopened
Create section (Backend)
QA - Test
Implement the controller using REST that allows creating an instance of the Sección model in the database.
1.0
Create section (Backend) - Implement the controller using REST that allows creating an instance of the Sección model in the database.
test
create section backend implement the controller using rest that allows creating an instance of the sección model in the database
1
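The record above asks for a REST controller that creates a Section instance. As a framework-agnostic sketch only — the in-memory store and payload shape are assumptions, standing in for the real ORM and request body:

```python
def create_section(store: dict, payload: dict) -> dict:
    """Simulate POST /sections: assign an id, persist, return the created resource."""
    new_id = max(store, default=0) + 1  # next integer id over existing keys
    record = {"id": new_id, **payload}
    store[new_id] = record
    return record


db = {}
print(create_section(db, {"name": "home"}))  # -> {'id': 1, 'name': 'home'}
```

A real controller would return HTTP 201 with the created resource as its body.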
276,774
24,017,594,409
IssuesEvent
2022-09-15 03:21:31
ApeWorX/ape
https://api.github.com/repos/ApeWorX/ape
closed
docker images need to be tested for running properly before pushing image to dockerhub
category: testing
### Elevator pitch: ape docker image was failing when being ran and we were not able to know this because it was building fine. we should fix this. ### Value: prevents bad docker images from being pushed ### Dependencies: n/a ### Design approach: test docker image after building and make sure it runs successfully before pushing detailed here: https://github.com/ApeWorX/ape/pull/891#issuecomment-1190387831 ### Task list: * [ ] Tasks go here ### Estimated completion date: ### Design review: <!-- 1-2 people needed for signoff --> Do not signoff unless: - 1) agreed the tasks and design approach will achieve acceptance, and - 2) the work can be completed by one person within the SLA. Design reviewers should consider simpler approaches to achieve goals. (Please leave a comment to sign off)
1.0
docker images need to be tested for running properly before pushing image to dockerhub - ### Elevator pitch: ape docker image was failing when being ran and we were not able to know this because it was building fine. we should fix this. ### Value: prevents bad docker images from being pushed ### Dependencies: n/a ### Design approach: test docker image after building and make sure it runs successfully before pushing detailed here: https://github.com/ApeWorX/ape/pull/891#issuecomment-1190387831 ### Task list: * [ ] Tasks go here ### Estimated completion date: ### Design review: <!-- 1-2 people needed for signoff --> Do not signoff unless: - 1) agreed the tasks and design approach will achieve acceptance, and - 2) the work can be completed by one person within the SLA. Design reviewers should consider simpler approaches to achieve goals. (Please leave a comment to sign off)
test
docker images need to be tested for running properly before pushing image to dockerhub elevator pitch ape docker image was failing when being ran and we were not able to know this because it was building fine we should fix this value prevents bad docker images from being pushed dependencies n a design approach test docker image after building and make sure it runs successfully before pushing detailed here task list tasks go here estimated completion date design review do not signoff unless agreed the tasks and design approach will achieve acceptance and the work can be completed by one person within the sla design reviewers should consider simpler approaches to achieve goals please leave a comment to sign off
1
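The idea in the record above — run the freshly built image and only push if it actually starts — can be sketched with a generic command smoke test. The `docker run …` invocation named in the docstring is a hypothetical example argument, not the project's actual CI command:

```python
import subprocess


def smoke_test(cmd: list, timeout: float = 30.0) -> bool:
    """Return True if cmd exits with status 0 within the timeout.

    In CI this would be called with something like
    ["docker", "run", "--rm", "myorg/ape", "--version"] (hypothetical tag)
    before any `docker push` is attempted.
    """
    try:
        return subprocess.run(cmd, timeout=timeout).returncode == 0
    except (subprocess.TimeoutExpired, FileNotFoundError):
        return False


print(smoke_test(["echo", "image runs"]))  # -> True
```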
48,215
7,390,333,231
IssuesEvent
2018-03-16 11:59:21
refinery-platform/refinery-platform
https://api.github.com/repos/refinery-platform/refinery-platform
closed
Tutorials don't work if the relevant component in Satori is minimized
bug documentation satori
* Specific code commit: 2a3c825183c423d388c391f52a18e13d5caa2925 * Version of the web browser and OS: ff macos * Environment where the error occurred: test production ### Steps to reproduce 1. Minimize "List Graph" 2. and then try to go through the corresponding tutorial ### Observed behavior ![screen shot 2017-09-13 at 3 05 20 pm](https://user-images.githubusercontent.com/730388/30395647-ddc64a64-9894-11e7-863f-ab2259c94217.png)
1.0
Tutorials don't work if the relevant component in Satori is minimized - * Specific code commit: 2a3c825183c423d388c391f52a18e13d5caa2925 * Version of the web browser and OS: ff macos * Environment where the error occurred: test production ### Steps to reproduce 1. Minimize "List Graph" 2. and then try to go through the corresponding tutorial ### Observed behavior ![screen shot 2017-09-13 at 3 05 20 pm](https://user-images.githubusercontent.com/730388/30395647-ddc64a64-9894-11e7-863f-ab2259c94217.png)
non_test
tutorials don t work if the relevant component in satori is minimized specific code commit version of the web browser and os ff macos environment where the error occurred test production steps to reproduce minimize list graph and then try to go through the corresponding tutorial observed behavior
0
77,630
7,582,666,997
IssuesEvent
2018-04-25 05:49:03
pravega/pravega
https://api.github.com/repos/pravega/pravega
closed
Implement a system test which kills pravega segmentstore during any operation
area/testing priority/P1
**Problem description** The task is to implement a test case which tests SSS failover, during all the operations which might corrupt the data such as appending, sealing , merging .. **Problem location** System Tests **Suggestions for an improvement**
1.0
Implement a system test which kills pravega segmentstore during any operation - **Problem description** The task is to implement a test case which tests SSS failover, during all the operations which might corrupt the data such as appending, sealing , merging .. **Problem location** System Tests **Suggestions for an improvement**
test
implement a system test which kills pravega segmentstore during any operation problem description the task is to implement a test case which tests sss failover during all the operations which might corrupt the data such as appending sealing merging problem location system tests suggestions for an improvement
1
208,470
15,891,283,272
IssuesEvent
2021-04-10 18:37:16
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
opened
sql/tests: TestRandomSyntaxSelect failed
C-test-failure O-robot branch-release-20.2
[(sql/tests).TestRandomSyntaxSelect failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2870034&tab=buildLog) on [release-20.2@31d3f451066eb1c23398dd862f2fd6a9a12ccf7c](https://github.com/cockroachdb/cockroach/commits/31d3f451066eb1c23398dd862f2fd6a9a12ccf7c): Fatal error: ``` panic: runtime error: invalid memory address or nil pointer dereference [signal SIGSEGV: segmentation violation code=0x1 addr=0x18 pc=0x1b48bf9] ``` Stack: ``` goroutine 3593613 [running]: github.com/cockroachdb/cockroach/pkg/sql/parser.(*lexer).populateErrorDetails(0xc000ed6c30) /go/src/github.com/cockroachdb/cockroach/pkg/sql/parser/lexer.go:214 +0x699 github.com/cockroachdb/cockroach/pkg/sql/parser.(*lexer).setErr(0xc000ed6c30, 0x0, 0x0) /go/src/github.com/cockroachdb/cockroach/pkg/sql/parser/lexer.go:194 +0x5c github.com/cockroachdb/cockroach/pkg/sql/parser.setErr(0x5f374c0, 0xc000ed6c30, 0x0, 0x0, 0x8) sql-gen.y:39 +0x50 github.com/cockroachdb/cockroach/pkg/sql/parser.(*sqlParserImpl).Parse(0xc000ed6c98, 0x5f374c0, 0xc000ed6c30, 0x0) sql-gen.y:10030 +0x6ca65 github.com/cockroachdb/cockroach/pkg/sql/parser.(*Parser).parse(0xc000ed6c00, 0x2, 0xc00c43ea20, 0x2c, 0xc00131f200, 0xb, 0x10, 0x803f2c0, 0x0, 0x0, ...) /go/src/github.com/cockroachdb/cockroach/pkg/sql/parser/parse.go:176 +0x175 github.com/cockroachdb/cockroach/pkg/sql/parser.(*Parser).parseWithDepth(0xc000ed6c00, 0x1, 0xc00c43ea20, 0x2d, 0x803f2c0, 0x0, 0x0, 0x0, 0x0, 0x0) /go/src/github.com/cockroachdb/cockroach/pkg/sql/parser/parse.go:156 +0x1b0 github.com/cockroachdb/cockroach/pkg/sql/parser.Parse(...) 
/go/src/github.com/cockroachdb/cockroach/pkg/sql/parser/parse.go:225 github.com/cockroachdb/cockroach/pkg/sql/tests_test.verifyFormat(0xc00c43ea20, 0x2d, 0x0, 0x73) /go/src/github.com/cockroachdb/cockroach/pkg/sql/tests/rsg_test.go:57 +0x7c github.com/cockroachdb/cockroach/pkg/sql/tests_test.(*verifyFormatDB).exec(0xc00d335050, 0xc0056ded80, 0x5f7ee00, 0xc02055e500, 0xc00c43ea20, 0x2d, 0x0, 0x0) /go/src/github.com/cockroachdb/cockroach/pkg/sql/tests/rsg_test.go:125 +0x73 github.com/cockroachdb/cockroach/pkg/sql/tests_test.TestRandomSyntaxSelect.func2(0x5f7ee00, 0xc02055e500, 0xc00d335050, 0xc0352f8dc0, 0x5ef4900, 0xc001e04c00) /go/src/github.com/cockroachdb/cockroach/pkg/sql/tests/rsg_test.go:253 +0x1f9 github.com/cockroachdb/cockroach/pkg/sql/tests_test.testRandomSyntax.func3(0x5f7ee00, 0xc02055e500, 0x7, 0x5905c00, 0x5905cc8) /go/src/github.com/cockroachdb/cockroach/pkg/sql/tests/rsg_test.go:745 +0xc7 github.com/cockroachdb/cockroach/pkg/util/ctxgroup.GroupWorkers.func1(0x5f7ee00, 0xc02055e500, 0x5f7ee40, 0xc000074120) /go/src/github.com/cockroachdb/cockroach/pkg/util/ctxgroup/ctxgroup.go:175 +0x42 github.com/cockroachdb/cockroach/pkg/util/ctxgroup.Group.GoCtx.func1(0xc030979580, 0x15) /go/src/github.com/cockroachdb/cockroach/pkg/util/ctxgroup/ctxgroup.go:166 +0x3a golang.org/x/sync/errgroup.(*Group).Go.func1(0xc008832090, 0xc008832210) /go/src/github.com/cockroachdb/cockroach/vendor/golang.org/x/sync/errgroup/errgroup.go:57 +0x59 created by golang.org/x/sync/errgroup.(*Group).Go /go/src/github.com/cockroachdb/cockroach/vendor/golang.org/x/sync/errgroup/errgroup.go:54 +0x66 ``` <details><summary>Log preceding fatal error</summary><p> ``` === RUN TestRandomSyntaxSelect test_log_scope.go:158: test logs captured to: /go/src/github.com/cockroachdb/cockroach/artifacts/logTestRandomSyntaxSelect175685025 test_log_scope.go:63: use -show-logs to present logs inline rsg_test.go:728: 5s of 5m0s: 856 executions, 24 successful rsg_test.go:728: 10s of 5m0s: 1550 executions, 35 
successful rsg_test.go:728: 15s of 5m0s: 2230 executions, 54 successful rsg_test.go:728: 20s of 5m0s: 2909 executions, 73 successful rsg_test.go:728: 25s of 5m0s: 3548 executions, 84 successful rsg_test.go:728: 30s of 5m0s: 4173 executions, 103 successful rsg_test.go:728: 35s of 5m0s: 4762 executions, 116 successful rsg_test.go:728: 40s of 5m0s: 5344 executions, 126 successful rsg_test.go:728: 45s of 5m0s: 5948 executions, 135 successful rsg_test.go:728: 50s of 5m0s: 6498 executions, 142 successful rsg_test.go:728: 55s of 5m0s: 7085 executions, 159 successful rsg_test.go:728: 1m0s of 5m0s: 7645 executions, 173 successful rsg_test.go:728: 1m5s of 5m0s: 8181 executions, 183 successful rsg_test.go:728: 1m10s of 5m0s: 8688 executions, 197 successful rsg_test.go:728: 1m15s of 5m0s: 9233 executions, 212 successful rsg_test.go:728: 1m20s of 5m0s: 9770 executions, 223 successful rsg_test.go:728: 1m25s of 5m0s: 10279 executions, 232 successful rsg_test.go:728: 1m30s of 5m0s: 10798 executions, 247 successful rsg_test.go:728: 1m35s of 5m0s: 11364 executions, 258 successful rsg_test.go:728: 1m40s of 5m0s: 11898 executions, 272 successful rsg_test.go:728: 1m45s of 5m0s: 12417 executions, 277 successful ``` </p></details> <details><summary>More</summary><p> ``` make stressrace TESTS=TestRandomSyntaxSelect PKG=./pkg/sql/tests TESTTIMEOUT=5m STRESSFLAGS='-timeout 5m' 2>&1 ``` [See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2ATestRandomSyntaxSelect.%2A&sort=title&restgroup=false&display=lastcommented+project) <sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
1.0
sql/tests: TestRandomSyntaxSelect failed - [(sql/tests).TestRandomSyntaxSelect failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2870034&tab=buildLog) on [release-20.2@31d3f451066eb1c23398dd862f2fd6a9a12ccf7c](https://github.com/cockroachdb/cockroach/commits/31d3f451066eb1c23398dd862f2fd6a9a12ccf7c): Fatal error: ``` panic: runtime error: invalid memory address or nil pointer dereference [signal SIGSEGV: segmentation violation code=0x1 addr=0x18 pc=0x1b48bf9] ``` Stack: ``` goroutine 3593613 [running]: github.com/cockroachdb/cockroach/pkg/sql/parser.(*lexer).populateErrorDetails(0xc000ed6c30) /go/src/github.com/cockroachdb/cockroach/pkg/sql/parser/lexer.go:214 +0x699 github.com/cockroachdb/cockroach/pkg/sql/parser.(*lexer).setErr(0xc000ed6c30, 0x0, 0x0) /go/src/github.com/cockroachdb/cockroach/pkg/sql/parser/lexer.go:194 +0x5c github.com/cockroachdb/cockroach/pkg/sql/parser.setErr(0x5f374c0, 0xc000ed6c30, 0x0, 0x0, 0x8) sql-gen.y:39 +0x50 github.com/cockroachdb/cockroach/pkg/sql/parser.(*sqlParserImpl).Parse(0xc000ed6c98, 0x5f374c0, 0xc000ed6c30, 0x0) sql-gen.y:10030 +0x6ca65 github.com/cockroachdb/cockroach/pkg/sql/parser.(*Parser).parse(0xc000ed6c00, 0x2, 0xc00c43ea20, 0x2c, 0xc00131f200, 0xb, 0x10, 0x803f2c0, 0x0, 0x0, ...) /go/src/github.com/cockroachdb/cockroach/pkg/sql/parser/parse.go:176 +0x175 github.com/cockroachdb/cockroach/pkg/sql/parser.(*Parser).parseWithDepth(0xc000ed6c00, 0x1, 0xc00c43ea20, 0x2d, 0x803f2c0, 0x0, 0x0, 0x0, 0x0, 0x0) /go/src/github.com/cockroachdb/cockroach/pkg/sql/parser/parse.go:156 +0x1b0 github.com/cockroachdb/cockroach/pkg/sql/parser.Parse(...) 
/go/src/github.com/cockroachdb/cockroach/pkg/sql/parser/parse.go:225 github.com/cockroachdb/cockroach/pkg/sql/tests_test.verifyFormat(0xc00c43ea20, 0x2d, 0x0, 0x73) /go/src/github.com/cockroachdb/cockroach/pkg/sql/tests/rsg_test.go:57 +0x7c github.com/cockroachdb/cockroach/pkg/sql/tests_test.(*verifyFormatDB).exec(0xc00d335050, 0xc0056ded80, 0x5f7ee00, 0xc02055e500, 0xc00c43ea20, 0x2d, 0x0, 0x0) /go/src/github.com/cockroachdb/cockroach/pkg/sql/tests/rsg_test.go:125 +0x73 github.com/cockroachdb/cockroach/pkg/sql/tests_test.TestRandomSyntaxSelect.func2(0x5f7ee00, 0xc02055e500, 0xc00d335050, 0xc0352f8dc0, 0x5ef4900, 0xc001e04c00) /go/src/github.com/cockroachdb/cockroach/pkg/sql/tests/rsg_test.go:253 +0x1f9 github.com/cockroachdb/cockroach/pkg/sql/tests_test.testRandomSyntax.func3(0x5f7ee00, 0xc02055e500, 0x7, 0x5905c00, 0x5905cc8) /go/src/github.com/cockroachdb/cockroach/pkg/sql/tests/rsg_test.go:745 +0xc7 github.com/cockroachdb/cockroach/pkg/util/ctxgroup.GroupWorkers.func1(0x5f7ee00, 0xc02055e500, 0x5f7ee40, 0xc000074120) /go/src/github.com/cockroachdb/cockroach/pkg/util/ctxgroup/ctxgroup.go:175 +0x42 github.com/cockroachdb/cockroach/pkg/util/ctxgroup.Group.GoCtx.func1(0xc030979580, 0x15) /go/src/github.com/cockroachdb/cockroach/pkg/util/ctxgroup/ctxgroup.go:166 +0x3a golang.org/x/sync/errgroup.(*Group).Go.func1(0xc008832090, 0xc008832210) /go/src/github.com/cockroachdb/cockroach/vendor/golang.org/x/sync/errgroup/errgroup.go:57 +0x59 created by golang.org/x/sync/errgroup.(*Group).Go /go/src/github.com/cockroachdb/cockroach/vendor/golang.org/x/sync/errgroup/errgroup.go:54 +0x66 ``` <details><summary>Log preceding fatal error</summary><p> ``` === RUN TestRandomSyntaxSelect test_log_scope.go:158: test logs captured to: /go/src/github.com/cockroachdb/cockroach/artifacts/logTestRandomSyntaxSelect175685025 test_log_scope.go:63: use -show-logs to present logs inline rsg_test.go:728: 5s of 5m0s: 856 executions, 24 successful rsg_test.go:728: 10s of 5m0s: 1550 executions, 35 
successful rsg_test.go:728: 15s of 5m0s: 2230 executions, 54 successful rsg_test.go:728: 20s of 5m0s: 2909 executions, 73 successful rsg_test.go:728: 25s of 5m0s: 3548 executions, 84 successful rsg_test.go:728: 30s of 5m0s: 4173 executions, 103 successful rsg_test.go:728: 35s of 5m0s: 4762 executions, 116 successful rsg_test.go:728: 40s of 5m0s: 5344 executions, 126 successful rsg_test.go:728: 45s of 5m0s: 5948 executions, 135 successful rsg_test.go:728: 50s of 5m0s: 6498 executions, 142 successful rsg_test.go:728: 55s of 5m0s: 7085 executions, 159 successful rsg_test.go:728: 1m0s of 5m0s: 7645 executions, 173 successful rsg_test.go:728: 1m5s of 5m0s: 8181 executions, 183 successful rsg_test.go:728: 1m10s of 5m0s: 8688 executions, 197 successful rsg_test.go:728: 1m15s of 5m0s: 9233 executions, 212 successful rsg_test.go:728: 1m20s of 5m0s: 9770 executions, 223 successful rsg_test.go:728: 1m25s of 5m0s: 10279 executions, 232 successful rsg_test.go:728: 1m30s of 5m0s: 10798 executions, 247 successful rsg_test.go:728: 1m35s of 5m0s: 11364 executions, 258 successful rsg_test.go:728: 1m40s of 5m0s: 11898 executions, 272 successful rsg_test.go:728: 1m45s of 5m0s: 12417 executions, 277 successful ``` </p></details> <details><summary>More</summary><p> ``` make stressrace TESTS=TestRandomSyntaxSelect PKG=./pkg/sql/tests TESTTIMEOUT=5m STRESSFLAGS='-timeout 5m' 2>&1 ``` [See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2ATestRandomSyntaxSelect.%2A&sort=title&restgroup=false&display=lastcommented+project) <sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
test
sql tests testrandomsyntaxselect failed on fatal error panic runtime error invalid memory address or nil pointer dereference stack goroutine github com cockroachdb cockroach pkg sql parser lexer populateerrordetails go src github com cockroachdb cockroach pkg sql parser lexer go github com cockroachdb cockroach pkg sql parser lexer seterr go src github com cockroachdb cockroach pkg sql parser lexer go github com cockroachdb cockroach pkg sql parser seterr sql gen y github com cockroachdb cockroach pkg sql parser sqlparserimpl parse sql gen y github com cockroachdb cockroach pkg sql parser parser parse go src github com cockroachdb cockroach pkg sql parser parse go github com cockroachdb cockroach pkg sql parser parser parsewithdepth go src github com cockroachdb cockroach pkg sql parser parse go github com cockroachdb cockroach pkg sql parser parse go src github com cockroachdb cockroach pkg sql parser parse go github com cockroachdb cockroach pkg sql tests test verifyformat go src github com cockroachdb cockroach pkg sql tests rsg test go github com cockroachdb cockroach pkg sql tests test verifyformatdb exec go src github com cockroachdb cockroach pkg sql tests rsg test go github com cockroachdb cockroach pkg sql tests test testrandomsyntaxselect go src github com cockroachdb cockroach pkg sql tests rsg test go github com cockroachdb cockroach pkg sql tests test testrandomsyntax go src github com cockroachdb cockroach pkg sql tests rsg test go github com cockroachdb cockroach pkg util ctxgroup groupworkers go src github com cockroachdb cockroach pkg util ctxgroup ctxgroup go github com cockroachdb cockroach pkg util ctxgroup group goctx go src github com cockroachdb cockroach pkg util ctxgroup ctxgroup go golang org x sync errgroup group go go src github com cockroachdb cockroach vendor golang org x sync errgroup errgroup go created by golang org x sync errgroup group go go src github com cockroachdb cockroach vendor golang org x sync errgroup errgroup go log 
preceding fatal error run testrandomsyntaxselect test log scope go test logs captured to go src github com cockroachdb cockroach artifacts test log scope go use show logs to present logs inline rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful more make stressrace tests testrandomsyntaxselect pkg pkg sql tests testtimeout stressflags timeout powered by
1
124,284
10,303,091,001
IssuesEvent
2019-08-28 19:20:29
approvals/ApprovalTests.cpp
https://api.github.com/repos/approvals/ApprovalTests.cpp
closed
Tidy up the contents of the testsuite directories
tests tidying
I intend the structure to be as per #17 - probably with the addition of a `documentation/` directory, which will contain any tests that exist only for the purpose of creating snippets for documentation. I'll also need to move the corresponding files in `approval_tests/` to the same sub-directory in the new location. I'll do this with `git mv` because moving non-source files in CLion doesn't retain that history of the move...
1.0
Tidy up the contents of the testsuite directories - I intend the structure to be as per #17 - probably with the addition of a `documentation/` directory, which will contain any tests that exist only for the purpose of creating snippets for documentation. I'll also need to move the corresponding files in `approval_tests/` to the same sub-directory in the new location. I'll do this with `git mv` because moving non-source files in CLion doesn't retain that history of the move...
test
tidy up the contents of the testsuite directories i intend the structure to be as per probably with the addition of a documentation directory which will contain any tests that exist only for the purpose of creating snippets for documentation i ll also need to move the corresponding files in approval tests to the same sub directory in the new location i ll do this with git mv because moving non source files in clion doesn t retain that history of the move
1
84,740
7,931,619,684
IssuesEvent
2018-07-07 02:31:38
byaka/VombatiDB
https://api.github.com/repos/byaka/VombatiDB
closed
Проверить оверхед `DBSearch_simple.query()` перед `iterIndex()`
testing
Размер тестовой базы оказался таким, что `iterIndex()` по ней выполняется ровно столькоже, сколько занимает компиляция базовых запросов. Нужно увеличить размер базы на порядок и будет ясно, является ли оверхед константным, или линейным.
1.0
Проверить оверхед `DBSearch_simple.query()` перед `iterIndex()` - Размер тестовой базы оказался таким, что `iterIndex()` по ней выполняется ровно столькоже, сколько занимает компиляция базовых запросов. Нужно увеличить размер базы на порядок и будет ясно, является ли оверхед константным, или линейным.
test
проверить оверхед dbsearch simple query перед iterindex размер тестовой базы оказался таким что iterindex по ней выполняется ровно столькоже сколько занимает компиляция базовых запросов нужно увеличить размер базы на порядок и будет ясно является ли оверхед константным или линейным
1
26,886
4,259,730,409
IssuesEvent
2016-07-11 12:12:24
fossasia/open-event-orga-server
https://api.github.com/repos/fossasia/open-event-orga-server
closed
Test PUT API with single fields as payload
Rest-API testing
Related issue - #1436 PUT API can be tested with single fields as the payload. Then the response of the API can be checked to make sure that only the field included in the payload has changed and rest all fields are untouched. So basically what I am saying is - ```bash d1 = GET item UPDATE item # just one field d2 = GET item assert d1 - d2 == only_field_included_in_step_2 # difference between d1 and d2 ```
1.0
Test PUT API with single fields as payload - Related issue - #1436 PUT API can be tested with single fields as the payload. Then the response of the API can be checked to make sure that only the field included in the payload has changed and rest all fields are untouched. So basically what I am saying is - ```bash d1 = GET item UPDATE item # just one field d2 = GET item assert d1 - d2 == only_field_included_in_step_2 # difference between d1 and d2 ```
test
test put api with single fields as payload related issue put api can be tested with single fields as the payload then the response of the api can be checked to make sure that only the field included in the payload has changed and rest all fields are untouched so basically what i am saying is bash get item update item just one field get item assert only field included in step difference between and
1
42,449
5,437,791,352
IssuesEvent
2017-03-06 08:28:17
mautic/mautic
https://api.github.com/repos/mautic/mautic
closed
Outlook DPI settings are erased even in Code Mode
Bug Ready To Test
| Q | A | ---| --- | Bug report? | X | Feature request? | | Enhancement? | ## Description: The following Outlook DPI settings are altered upon saving an email even if that email is using the Code Mode. ```html <!--[if gte mso 9]><xml> <o:OfficeDocumentSettings> <o:AllowPNG/> <o:PixelsPerInch>96</o:PixelsPerInch> </o:OfficeDocumentSettings> </xml><![endif]--> ``` It is then replaced with the following which simply displays the number 96 when viewed in Outlook. ```html <!--[if gte mso 9]> 96 <![endif]--> ``` ## If a bug: | Q | A | --- | --- | Mautic version | 2.4.0 | PHP version | 7.0.14 ### Steps to reproduce: 1. Create an email containing the above snippet. 2. Save 3. Preview and view the source code
1.0
Outlook DPI settings are erased even in Code Mode - | Q | A | ---| --- | Bug report? | X | Feature request? | | Enhancement? | ## Description: The following Outlook DPI settings are altered upon saving an email even if that email is using the Code Mode. ```html <!--[if gte mso 9]><xml> <o:OfficeDocumentSettings> <o:AllowPNG/> <o:PixelsPerInch>96</o:PixelsPerInch> </o:OfficeDocumentSettings> </xml><![endif]--> ``` It is then replaced with the following which simply displays the number 96 when viewed in Outlook. ```html <!--[if gte mso 9]> 96 <![endif]--> ``` ## If a bug: | Q | A | --- | --- | Mautic version | 2.4.0 | PHP version | 7.0.14 ### Steps to reproduce: 1. Create an email containing the above snippet. 2. Save 3. Preview and view the source code
test
outlook dpi settings are erased even in code mode q a bug report x feature request enhancement description the following outlook dpi settings are altered upon saving an email even if that email is using the code mode html it is then replaced with the following which simply displays the number when viewed in outlook html if a bug q a mautic version php version steps to reproduce create an email containing the above snippet save preview and view the source code
1
105,038
9,014,887,126
IssuesEvent
2019-02-06 00:04:20
Microsoft/microsoft-ui-xaml
https://api.github.com/repos/Microsoft/microsoft-ui-xaml
opened
Test failures with keyboard/gamepad input
test issue
Flaky tests with keyboard/gamepad input. - RatingControlTests.VerifyThatProgrammaticallyRemovingEngagementResetsValue - ScrollViewerTestsWithInputHelper.VerifyScrollViewerGamePadHorizontalInteraction - TreeViewTests.TreeViewKeyDownRightToLeftTest_ContentMode - TreeViewTests.TreeViewMultiSelectGamepadTest_ContentMode https://dev.azure.com/ms/microsoft-ui-xaml/_build/results?buildId=2157&view=ms.vss-test-web.build-test-results-tab
1.0
Test failures with keyboard/gamepad input - Flaky tests with keyboard/gamepad input. - RatingControlTests.VerifyThatProgrammaticallyRemovingEngagementResetsValue - ScrollViewerTestsWithInputHelper.VerifyScrollViewerGamePadHorizontalInteraction - TreeViewTests.TreeViewKeyDownRightToLeftTest_ContentMode - TreeViewTests.TreeViewMultiSelectGamepadTest_ContentMode https://dev.azure.com/ms/microsoft-ui-xaml/_build/results?buildId=2157&view=ms.vss-test-web.build-test-results-tab
test
test failures with keyboard gamepad input flaky tests with keyboard gamepad input ratingcontroltests verifythatprogrammaticallyremovingengagementresetsvalue scrollviewertestswithinputhelper verifyscrollviewergamepadhorizontalinteraction treeviewtests treeviewkeydownrighttolefttest contentmode treeviewtests treeviewmultiselectgamepadtest contentmode
1
27,473
5,353,949,938
IssuesEvent
2017-02-20 08:18:10
swagger-api/swagger-codegen
https://api.github.com/repos/swagger-api/swagger-codegen
closed
Static HTML generator drops response headers
Client: HTML Feature: Documentation Need community contribution
``` json "responses" : { "500" : { "description" : "Ouch", "headers" : { "X-Request-ID" : { "type" : "string", "description" : "Blah" } } } } ``` This is rendered in the editor, but missing from the static HTML. `swagger-codegen-cli:2.1.5`
1.0
Static HTML generator drops response headers - ``` json "responses" : { "500" : { "description" : "Ouch", "headers" : { "X-Request-ID" : { "type" : "string", "description" : "Blah" } } } } ``` This is rendered in the editor, but missing from the static HTML. `swagger-codegen-cli:2.1.5`
non_test
static html generator drops response headers json responses description ouch headers x request id type string description blah this is rendered in the editor but missing from the static html swagger codegen cli
0
145,762
22,778,202,115
IssuesEvent
2022-07-08 16:30:49
dotnet/efcore
https://api.github.com/repos/dotnet/efcore
closed
Entity is not updated using new context instance from new service provider scope, InMemoryProvider
closed-by-design customer-reported
# Bug Details 1. Register DbContext using In memory provider sc.AddDbContext<TestDbContext>(b => b.UseInMemoryDatabase("test")); 2. Build provider 3. Create a scope1 and get DbContext ctx1 4. Add new item to DbSet in ctx1 and save changes 5. Create scope2 and get new DbContext ctx2 6. Do updated of entity in ctx2 and save 7. Get entity from ctx1 using id Actual: Entity from ctx1 not updated and contains the old value Expected: Entity from ctx1 updated and contains new value # Example ```C# public class UnitTest1 { [Fact] public void Test1() { var sc = new ServiceCollection(); sc.AddDbContext<TestDbContext>(b => b.UseInMemoryDatabase("test")); var sp = sc.BuildServiceProvider(); var scope1 = sp.CreateScope(); var ctx1 = scope1.ServiceProvider.GetService<TestDbContext>(); var id = Guid.NewGuid(); ctx1.Add(new Customer() { Id = id, Name = "init name" }); ctx1.SaveChanges(); var scope2 = sp.CreateScope(); var ctx2 = scope2.ServiceProvider.GetService<TestDbContext>(); var myCustomer = ctx2.Customers.Find(id); myCustomer.Name = "updated"; ctx2.Update(myCustomer); ctx2.SaveChanges(); var updated = ctx1.Customers.Find(id); Assert.Equal("updated", updated.Name); } } public class TestDbContext : DbContext { public TestDbContext(DbContextOptions<TestDbContext> options) : base(options) { } public DbSet<Customer> Customers { get; set; } } public class Customer { public Guid Id { get; set; } = Guid.NewGuid(); public string Name { get; set; } } ``` # Provider and version information EF Core version: Microsoft.EntityFrameworkCore.InMemory : 6.0.6 DependencyInjection: Microsoft.Extensions.DependencyInjection: 6.0.0 Operating system: Windows 11 IDE: Visual Studio Professional 2022 17.1.2
1.0
Entity is not updated using new context instance from new service provider scope, InMemoryProvider - # Bug Details 1. Register DbContext using In memory provider sc.AddDbContext<TestDbContext>(b => b.UseInMemoryDatabase("test")); 2. Build provider 3. Create a scope1 and get DbContext ctx1 4. Add new item to DbSet in ctx1 and save changes 5. Create scope2 and get new DbContext ctx2 6. Do updated of entity in ctx2 and save 7. Get entity from ctx1 using id Actual: Entity from ctx1 not updated and contains the old value Expected: Entity from ctx1 updated and contains new value # Example ```C# public class UnitTest1 { [Fact] public void Test1() { var sc = new ServiceCollection(); sc.AddDbContext<TestDbContext>(b => b.UseInMemoryDatabase("test")); var sp = sc.BuildServiceProvider(); var scope1 = sp.CreateScope(); var ctx1 = scope1.ServiceProvider.GetService<TestDbContext>(); var id = Guid.NewGuid(); ctx1.Add(new Customer() { Id = id, Name = "init name" }); ctx1.SaveChanges(); var scope2 = sp.CreateScope(); var ctx2 = scope2.ServiceProvider.GetService<TestDbContext>(); var myCustomer = ctx2.Customers.Find(id); myCustomer.Name = "updated"; ctx2.Update(myCustomer); ctx2.SaveChanges(); var updated = ctx1.Customers.Find(id); Assert.Equal("updated", updated.Name); } } public class TestDbContext : DbContext { public TestDbContext(DbContextOptions<TestDbContext> options) : base(options) { } public DbSet<Customer> Customers { get; set; } } public class Customer { public Guid Id { get; set; } = Guid.NewGuid(); public string Name { get; set; } } ``` # Provider and version information EF Core version: Microsoft.EntityFrameworkCore.InMemory : 6.0.6 DependencyInjection: Microsoft.Extensions.DependencyInjection: 6.0.0 Operating system: Windows 11 IDE: Visual Studio Professional 2022 17.1.2
non_test
entity is not updated using new context instance from new service provider scope inmemoryprovider bug details register dbcontext using in memory provider sc adddbcontext b b useinmemorydatabase test build provider create a and get dbcontext add new item to dbset in and save changes create and get new dbcontext do updated of entity in and save get entity from using id actual entity from not updated and contains the old value expected entity from updated and contains new value example c public class public void var sc new servicecollection sc adddbcontext b b useinmemorydatabase test var sp sc buildserviceprovider var sp createscope var serviceprovider getservice var id guid newguid add new customer id id name init name savechanges var sp createscope var serviceprovider getservice var mycustomer customers find id mycustomer name updated update mycustomer savechanges var updated customers find id assert equal updated updated name public class testdbcontext dbcontext public testdbcontext dbcontextoptions options base options public dbset customers get set public class customer public guid id get set guid newguid public string name get set provider and version information ef core version microsoft entityframeworkcore inmemory dependencyinjection microsoft extensions dependencyinjection operating system windows ide visual studio professional
0
113,144
17,116,004,167
IssuesEvent
2021-07-11 11:14:28
theHinneh/ridge-condos-
https://api.github.com/repos/theHinneh/ridge-condos-
closed
CVE-2020-28481 (Medium) detected in socket.io-2.1.1.tgz
security vulnerability
## CVE-2020-28481 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>socket.io-2.1.1.tgz</b></p></summary> <p>node.js realtime framework server</p> <p>Library home page: <a href="https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz">https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz</a></p> <p>Path to dependency file: ridge-condos-/package.json</p> <p>Path to vulnerable library: ridge-condos-/node_modules/socket.io/package.json</p> <p> Dependency Hierarchy: - karma-5.0.2.tgz (Root Library) - :x: **socket.io-2.1.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/theHinneh/ridge-condos-/commit/7c1e7fe6dd1bbeac6fe80f7baf08a8f793d23222">7c1e7fe6dd1bbeac6fe80f7baf08a8f793d23222</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The package socket.io before 2.4.0 are vulnerable to Insecure Defaults due to CORS Misconfiguration. All domains are whitelisted by default. <p>Publish Date: 2021-01-19 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28481>CVE-2020-28481</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28481">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28481</a></p> <p>Release Date: 2021-01-19</p> <p>Fix Resolution: 2.4.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-28481 (Medium) detected in socket.io-2.1.1.tgz - ## CVE-2020-28481 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>socket.io-2.1.1.tgz</b></p></summary> <p>node.js realtime framework server</p> <p>Library home page: <a href="https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz">https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz</a></p> <p>Path to dependency file: ridge-condos-/package.json</p> <p>Path to vulnerable library: ridge-condos-/node_modules/socket.io/package.json</p> <p> Dependency Hierarchy: - karma-5.0.2.tgz (Root Library) - :x: **socket.io-2.1.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/theHinneh/ridge-condos-/commit/7c1e7fe6dd1bbeac6fe80f7baf08a8f793d23222">7c1e7fe6dd1bbeac6fe80f7baf08a8f793d23222</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The package socket.io before 2.4.0 are vulnerable to Insecure Defaults due to CORS Misconfiguration. All domains are whitelisted by default. <p>Publish Date: 2021-01-19 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28481>CVE-2020-28481</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28481">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-28481</a></p> <p>Release Date: 2021-01-19</p> <p>Fix Resolution: 2.4.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve medium detected in socket io tgz cve medium severity vulnerability vulnerable library socket io tgz node js realtime framework server library home page a href path to dependency file ridge condos package json path to vulnerable library ridge condos node modules socket io package json dependency hierarchy karma tgz root library x socket io tgz vulnerable library found in head commit a href found in base branch master vulnerability details the package socket io before are vulnerable to insecure defaults due to cors misconfiguration all domains are whitelisted by default publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
94,580
11,887,146,614
IssuesEvent
2020-03-28 00:21:02
MozillaFoundation/Design
https://api.github.com/repos/MozillaFoundation/Design
opened
Graphic for Zoom blog post
design
Kaili has the [copy here](https://docs.google.com/document/d/19fQBQrXR_aau6XmbREueWOQQjMN8kji0Sc91zWS9qmM/edit?ts=5e7e2f8b). The post will need some graphics for the tips, the thumbnail and for social. Natalie will meet up with Kaili and Audrey to discuss!
1.0
Graphic for Zoom blog post - Kaili has the [copy here](https://docs.google.com/document/d/19fQBQrXR_aau6XmbREueWOQQjMN8kji0Sc91zWS9qmM/edit?ts=5e7e2f8b). The post will need some graphics for the tips, the thumbnail and for social. Natalie will meet up with Kaili and Audrey to discuss!
non_test
graphic for zoom blog post kaili has the the post will need some graphics for the tips the thumbnail and for social natalie will meet up with kaili and audrey to discuss
0
23,342
6,419,936,101
IssuesEvent
2017-08-08 22:29:43
xhqiao89/HydroDesktop_test
https://api.github.com/repos/xhqiao89/HydroDesktop_test
closed
Error in download metadata in HydroDesktop ver 1.7
CodePlex
<b>7124119[CodePlex]</b> <br />Dear Developers, When I am trying to download metadata from HydroDesktop ver 1.7, it occurs errors like this : offset and length were out of bounds for the array or count is greater than the number of elements is greater Could you please check it . I am waiting for your response . Thanks and best regards,
1.0
Error in download metadata in HydroDesktop ver 1.7 - <b>7124119[CodePlex]</b> <br />Dear Developers, When I am trying to download metadata from HydroDesktop ver 1.7, it occurs errors like this : offset and length were out of bounds for the array or count is greater than the number of elements is greater Could you please check it . I am waiting for your response . Thanks and best regards,
non_test
error in download metadata in hydrodesktop ver dear developers when i am trying to download metadata from hydrodesktop ver it occurs errors like this offset and length were out of bounds for the array or count is greater than the number of elements is greater could you please check it i am waiting for your response thanks and best regards
0
786,663
27,661,544,131
IssuesEvent
2023-03-12 15:27:37
AY2223S2-CS2113-T12-1/tp
https://api.github.com/repos/AY2223S2-CS2113-T12-1/tp
opened
[Task] Edit expenses and income
type.Task priority.Medium
Add functionality for user to edit existing expense and income. Edits the item at the specified INDEX, where INDEX must be accessible by the respective arrayList. Fiels provided will be updated to the new inputs.
1.0
[Task] Edit expenses and income - Add functionality for user to edit existing expense and income. Edits the item at the specified INDEX, where INDEX must be accessible by the respective arrayList. Fiels provided will be updated to the new inputs.
non_test
edit expenses and income add functionality for user to edit existing expense and income edits the item at the specified index where index must be accessible by the respective arraylist fiels provided will be updated to the new inputs
0
486,184
14,005,636,988
IssuesEvent
2020-10-28 18:45:04
GoogleContainerTools/skaffold
https://api.github.com/repos/GoogleContainerTools/skaffold
closed
Relocate Skaffold debug images from gcr.io/gcp-dev-tools
area/debug kind/todo priority/p0
gcp-dev-tools is not obviously associated with Skaffold See http://go/gcp-dev-tools-acl-pm ---- Short-term plan: - [x] Publish helper images to both gcr.io/k8s-skaffold/skaffold-debug-support and gcr.io/gcp-dev-tools/duct-tape - [ ] Switch Skaffold to retrieve from gcr.io/k8s-skaffold/skaffold-debug-support
1.0
Relocate Skaffold debug images from gcr.io/gcp-dev-tools - gcp-dev-tools is not obviously associated with Skaffold See http://go/gcp-dev-tools-acl-pm ---- Short-term plan: - [x] Publish helper images to both gcr.io/k8s-skaffold/skaffold-debug-support and gcr.io/gcp-dev-tools/duct-tape - [ ] Switch Skaffold to retrieve from gcr.io/k8s-skaffold/skaffold-debug-support
non_test
relocate skaffold debug images from gcr io gcp dev tools gcp dev tools is not obviously associated with skaffold see short term plan publish helper images to both gcr io skaffold skaffold debug support and gcr io gcp dev tools duct tape switch skaffold to retrieve from gcr io skaffold skaffold debug support
0
131,781
18,249,484,925
IssuesEvent
2021-10-02 01:14:28
praneethpanasala/linux
https://api.github.com/repos/praneethpanasala/linux
opened
WS-2021-0334 (High) detected in linuxlinux-4.19.6
security vulnerability
## WS-2021-0334 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.6</b></p></summary> <p> <p>Apache Software Foundation (ASF)</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>linux/net/netfilter/nf_synproxy_core.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>linux/net/netfilter/nf_synproxy_core.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Linux/Kernel in versions v5.13-rc1 to v5.13-rc6 is vulnerable to out of bounds when parsing TCP options <p>Publish Date: 2021-05-31 <p>URL: <a href=https://github.com/gregkh/linux/commit/6defc77d48eff74075b80ad5925061b2fc010d98>WS-2021-0334</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://osv.dev/vulnerability/UVI-2021-1000919">https://osv.dev/vulnerability/UVI-2021-1000919</a></p> <p>Release Date: 2021-05-31</p> <p>Fix Resolution: v5.4.128</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2021-0334 (High) detected in linuxlinux-4.19.6 - ## WS-2021-0334 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.6</b></p></summary> <p> <p>Apache Software Foundation (ASF)</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>linux/net/netfilter/nf_synproxy_core.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>linux/net/netfilter/nf_synproxy_core.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Linux/Kernel in versions v5.13-rc1 to v5.13-rc6 is vulnerable to out of bounds when parsing TCP options <p>Publish Date: 2021-05-31 <p>URL: <a href=https://github.com/gregkh/linux/commit/6defc77d48eff74075b80ad5925061b2fc010d98>WS-2021-0334</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a 
href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://osv.dev/vulnerability/UVI-2021-1000919">https://osv.dev/vulnerability/UVI-2021-1000919</a></p> <p>Release Date: 2021-05-31</p> <p>Fix Resolution: v5.4.128</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
ws high detected in linuxlinux ws high severity vulnerability vulnerable library linuxlinux apache software foundation asf library home page a href found in base branch master vulnerable source files linux net netfilter nf synproxy core c linux net netfilter nf synproxy core c vulnerability details linux kernel in versions to is vulnerable to out of bounds when parsing tcp options publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
13,341
15,801,356,784
IssuesEvent
2021-04-03 04:27:26
ooi-data/RS01SBPD-DP01A-04-FLNTUA102-recovered_wfp-dpc_flnturtd_instrument_recovered
https://api.github.com/repos/ooi-data/RS01SBPD-DP01A-04-FLNTUA102-recovered_wfp-dpc_flnturtd_instrument_recovered
opened
🛑 Processing failed: PermissionError
process
## Overview `PermissionError` found in `processing_task` task during run ended on 2021-04-03T04:27:26.019877. ## Details Flow name: `RS01SBPD-DP01A-04-FLNTUA102-recovered_wfp-dpc_flnturtd_instrument_recovered` Task name: `processing_task` Error type: `PermissionError` Error message: The difference between the request time and the current time is too large. <details> <summary>Traceback</summary> ``` Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 234, in _call_s3 return await method(**additional_kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 154, in _make_api_call raise error_class(parsed_response, operation_name) botocore.exceptions.ClientError: An error occurred (RequestTimeTooSkewed) when calling the PutObject operation: The difference between the request time and the current time is too large. The above exception was the direct cause of the following exception: Traceback (most recent call last): File "/usr/share/miniconda/envs/harvester/lib/python3.8/site-packages/ooi_harvester/processor/pipeline.py", line 71, in processing_task File "/srv/conda/envs/notebook/lib/python3.8/site-packages/ooi_harvester/processor/__init__.py", line 305, in finalize_zarr array_plan.execute() File "/srv/conda/envs/notebook/lib/python3.8/site-packages/rechunker/api.py", line 76, in execute self._executor.execute_plan(self._plan, **kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/rechunker/executors/dask.py", line 24, in execute_plan return plan.compute(**kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/base.py", line 283, in compute (result,) = compute(self, traverse=False, **kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/base.py", line 565, in compute results = schedule(dsk, keys, **kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/threaded.py", line 76, in get results = get_async( 
File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/local.py", line 487, in get_async raise_exception(exc, tb) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/local.py", line 317, in reraise raise exc File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/local.py", line 222, in execute_task result = _execute_task(task, data) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/core.py", line 121, in _execute_task return func(*(_execute_task(a, cache) for a in args)) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/array/core.py", line 3984, in store_chunk return load_store_chunk(x, out, index, lock, return_stored, False) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/array/core.py", line 3971, in load_store_chunk out[index] = x File "/srv/conda/envs/notebook/lib/python3.8/site-packages/zarr/core.py", line 1211, in __setitem__ self.set_basic_selection(selection, value, fields=fields) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/zarr/core.py", line 1306, in set_basic_selection return self._set_basic_selection_nd(selection, value, fields=fields) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/zarr/core.py", line 1597, in _set_basic_selection_nd self._set_selection(indexer, value, fields=fields) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/zarr/core.py", line 1669, in _set_selection self._chunk_setitems(lchunk_coords, lchunk_selection, chunk_values, File "/srv/conda/envs/notebook/lib/python3.8/site-packages/zarr/core.py", line 1861, in _chunk_setitems self.chunk_store.setitems(values) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/mapping.py", line 111, in setitems self.fs.pipe(values) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 121, in wrapper return maybe_sync(func, self, *args, **kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 100, in maybe_sync return 
sync(loop, func, *args, **kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 71, in sync raise exc.with_traceback(tb) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 55, in f result[0] = await future File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 224, in _pipe await asyncio.gather( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 753, in _pipe_file return await self._call_s3( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 252, in _call_s3 raise translate_boto_error(err) from err PermissionError: The difference between the request time and the current time is too large. ``` </details>
1.0
🛑 Processing failed: PermissionError - ## Overview `PermissionError` found in `processing_task` task during run ended on 2021-04-03T04:27:26.019877. ## Details Flow name: `RS01SBPD-DP01A-04-FLNTUA102-recovered_wfp-dpc_flnturtd_instrument_recovered` Task name: `processing_task` Error type: `PermissionError` Error message: The difference between the request time and the current time is too large. <details> <summary>Traceback</summary> ``` Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 234, in _call_s3 return await method(**additional_kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 154, in _make_api_call raise error_class(parsed_response, operation_name) botocore.exceptions.ClientError: An error occurred (RequestTimeTooSkewed) when calling the PutObject operation: The difference between the request time and the current time is too large. The above exception was the direct cause of the following exception: Traceback (most recent call last): File "/usr/share/miniconda/envs/harvester/lib/python3.8/site-packages/ooi_harvester/processor/pipeline.py", line 71, in processing_task File "/srv/conda/envs/notebook/lib/python3.8/site-packages/ooi_harvester/processor/__init__.py", line 305, in finalize_zarr array_plan.execute() File "/srv/conda/envs/notebook/lib/python3.8/site-packages/rechunker/api.py", line 76, in execute self._executor.execute_plan(self._plan, **kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/rechunker/executors/dask.py", line 24, in execute_plan return plan.compute(**kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/base.py", line 283, in compute (result,) = compute(self, traverse=False, **kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/base.py", line 565, in compute results = schedule(dsk, keys, **kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/threaded.py", 
line 76, in get results = get_async( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/local.py", line 487, in get_async raise_exception(exc, tb) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/local.py", line 317, in reraise raise exc File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/local.py", line 222, in execute_task result = _execute_task(task, data) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/core.py", line 121, in _execute_task return func(*(_execute_task(a, cache) for a in args)) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/array/core.py", line 3984, in store_chunk return load_store_chunk(x, out, index, lock, return_stored, False) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/dask/array/core.py", line 3971, in load_store_chunk out[index] = x File "/srv/conda/envs/notebook/lib/python3.8/site-packages/zarr/core.py", line 1211, in __setitem__ self.set_basic_selection(selection, value, fields=fields) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/zarr/core.py", line 1306, in set_basic_selection return self._set_basic_selection_nd(selection, value, fields=fields) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/zarr/core.py", line 1597, in _set_basic_selection_nd self._set_selection(indexer, value, fields=fields) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/zarr/core.py", line 1669, in _set_selection self._chunk_setitems(lchunk_coords, lchunk_selection, chunk_values, File "/srv/conda/envs/notebook/lib/python3.8/site-packages/zarr/core.py", line 1861, in _chunk_setitems self.chunk_store.setitems(values) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/mapping.py", line 111, in setitems self.fs.pipe(values) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 121, in wrapper return maybe_sync(func, self, *args, **kwargs) File 
"/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 100, in maybe_sync return sync(loop, func, *args, **kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 71, in sync raise exc.with_traceback(tb) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 55, in f result[0] = await future File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 224, in _pipe await asyncio.gather( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 753, in _pipe_file return await self._call_s3( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 252, in _call_s3 raise translate_boto_error(err) from err PermissionError: The difference between the request time and the current time is too large. ``` </details>
non_test
🛑 processing failed permissionerror overview permissionerror found in processing task task during run ended on details flow name recovered wfp dpc flnturtd instrument recovered task name processing task error type permissionerror error message the difference between the request time and the current time is too large traceback traceback most recent call last file srv conda envs notebook lib site packages core py line in call return await method additional kwargs file srv conda envs notebook lib site packages aiobotocore client py line in make api call raise error class parsed response operation name botocore exceptions clienterror an error occurred requesttimetooskewed when calling the putobject operation the difference between the request time and the current time is too large the above exception was the direct cause of the following exception traceback most recent call last file usr share miniconda envs harvester lib site packages ooi harvester processor pipeline py line in processing task file srv conda envs notebook lib site packages ooi harvester processor init py line in finalize zarr array plan execute file srv conda envs notebook lib site packages rechunker api py line in execute self executor execute plan self plan kwargs file srv conda envs notebook lib site packages rechunker executors dask py line in execute plan return plan compute kwargs file srv conda envs notebook lib site packages dask base py line in compute result compute self traverse false kwargs file srv conda envs notebook lib site packages dask base py line in compute results schedule dsk keys kwargs file srv conda envs notebook lib site packages dask threaded py line in get results get async file srv conda envs notebook lib site packages dask local py line in get async raise exception exc tb file srv conda envs notebook lib site packages dask local py line in reraise raise exc file srv conda envs notebook lib site packages dask local py line in execute task result execute task task data file 
srv conda envs notebook lib site packages dask core py line in execute task return func execute task a cache for a in args file srv conda envs notebook lib site packages dask array core py line in store chunk return load store chunk x out index lock return stored false file srv conda envs notebook lib site packages dask array core py line in load store chunk out x file srv conda envs notebook lib site packages zarr core py line in setitem self set basic selection selection value fields fields file srv conda envs notebook lib site packages zarr core py line in set basic selection return self set basic selection nd selection value fields fields file srv conda envs notebook lib site packages zarr core py line in set basic selection nd self set selection indexer value fields fields file srv conda envs notebook lib site packages zarr core py line in set selection self chunk setitems lchunk coords lchunk selection chunk values file srv conda envs notebook lib site packages zarr core py line in chunk setitems self chunk store setitems values file srv conda envs notebook lib site packages fsspec mapping py line in setitems self fs pipe values file srv conda envs notebook lib site packages fsspec asyn py line in wrapper return maybe sync func self args kwargs file srv conda envs notebook lib site packages fsspec asyn py line in maybe sync return sync loop func args kwargs file srv conda envs notebook lib site packages fsspec asyn py line in sync raise exc with traceback tb file srv conda envs notebook lib site packages fsspec asyn py line in f result await future file srv conda envs notebook lib site packages fsspec asyn py line in pipe await asyncio gather file srv conda envs notebook lib site packages core py line in pipe file return await self call file srv conda envs notebook lib site packages core py line in call raise translate boto error err from err permissionerror the difference between the request time and the current time is too large
0
337,639
30,252,631,414
IssuesEvent
2023-07-06 22:05:03
Bears-R-Us/arkouda
https://api.github.com/repos/Bears-R-Us/arkouda
closed
`message_test.py` Conversion for new test framework
Testing
Part of #2499 Configure `message_test.py` to work in new testing framework
1.0
`message_test.py` Conversion for new test framework - Part of #2499 Configure `message_test.py` to work in new testing framework
test
message test py conversion for new test framework part of configure message test py to work in new testing framework
1
245,548
20,776,890,561
IssuesEvent
2022-03-16 11:19:57
theupdateframework/python-tuf
https://api.github.com/repos/theupdateframework/python-tuf
closed
Provide a way to generate test data with invalid content but valid signatures
question testing
**Description of issue or feature request**: As part of PHP-TUF, we've been building out our own conformance suite of test fixtures. (I've posted before about how we've continued to improve the determinism of the design.) Unfortunately, there's a conflict between easy fixture generation, good security design, and comprehensive testing. It's making it hard for us to test scenarios where the signatures are valid but the signed data is broken in some way: - The current library is mostly (and understandably) designed to provide correctly structured, signed data. - As is best practice for our design, we minimize parsing and usage until after signature validation. - We want to test our robustness against incorrect but signed data. So, we're looking for advice or features -- we're not completely sure. Advice could be in the form of how to leverage `securesystemslib` or the TUF API to generate a signature over invalid metadata. Features could be in the form of a well-defined way to force the high-level TUF library to re-sign invalid data. We've also considered adding a bypass to PHP-TUF to allow a fixture to be used in a way that skips signature validation before usage, but this seems undesirable in terms of making our fixtures maximally re-usable by other implementations.
1.0
Provide a way to generate test data with invalid content but valid signatures - **Description of issue or feature request**: As part of PHP-TUF, we've been building out our own conformance suite of test fixtures. (I've posted before about how we've continued to improve the determinism of the design.) Unfortunately, there's a conflict between easy fixture generation, good security design, and comprehensive testing. It's making it hard for us to test scenarios where the signatures are valid but the signed data is broken in some way: - The current library is mostly (and understandably) designed to provide correctly structured, signed data. - As is best practice for our design, we minimize parsing and usage until after signature validation. - We want to test our robustness against incorrect but signed data. So, we're looking for advice or features -- we're not completely sure. Advice could be in the form of how to leverage `securesystemslib` or the TUF API to generate a signature over invalid metadata. Features could be in the form of a well-defined way to force the high-level TUF library to re-sign invalid data. We've also considered adding a bypass to PHP-TUF to allow a fixture to be used in a way that skips signature validation before usage, but this seems undesirable in terms of making our fixtures maximally re-usable by other implementations.
test
provide a way to generate test data with invalid content but valid signatures description of issue or feature request as part of php tuf we ve been building out our own conformance suite of test fixtures i ve posted before about how we ve continued to improve the determinism of the design unfortunately there s a conflict between easy fixture generation good security design and comprehensive testing it s making it hard for us to test scenarios where the signatures are valid but the signed data is broken in some way the current library is mostly and understandably designed to provide correctly structured signed data as is best practice for our design we minimize parsing and usage until after signature validation we want to test our robustness against incorrect but signed data so we re looking for advice or features we re not completely sure advice could be in the form of how to leverage securesystemslib or the tuf api to generate a signature over invalid metadata features could be in the form of a well defined way to force the high level tuf library to re sign invalid data we ve also considered adding a bypass to php tuf to allow a fixture to be used in a way that skips signature validation before usage but this seems undesirable in terms of making our fixtures maximally re usable by other implementations
1
79,833
3,547,791,631
IssuesEvent
2016-01-20 11:23:48
xcat2/xcat-core
https://api.github.com/repos/xcat2/xcat-core
opened
The postscript confignetwork can create one link aggregation device only
component:postscripts priority:normal type:bug
Below is the nicdevices and nictypes attributes of node ``c910f02c03p23``. [root@c910f02c03p27 ~]# lsdef c910f02c03p23 -i nicdevices,nictypes Object name: c910f02c03p23 nicdevices.bond0=eth2|eth3 nicdevices.bond1=eth1|eth4 nictypes.eth4=ethernet nictypes.eth2=ethernet nictypes.eth3=ethernet nictypes.bond0=bond nictypes.bond1=bond nictypes.eth1=ethernet Only one link aggregation device can be defined. All the network adapters belong to the remain link aggregation devices are wrongly configured to join the first link aggregation device. # updatenode c910f02c03p23 confignetwork c910f02c03p23: xcatdsklspost: updating VPD database c910f02c03p23: xcatdsklspost: downloaded postscripts successfully c910f02c03p23: Wed Jan 20 06:14:08 EST 2016 Running postscript: confignetwork c910f02c03p23: [I]: NetworkManager is inactive. c910f02c03p23: [I]: All valid nics and device list: c910f02c03p23: [I]: bond0 eth2@eth3 c910f02c03p23: [I]: bond1 eth1@eth4 c910f02c03p23: ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ c910f02c03p23: configure nic and its device : bond0 eth2@eth3 c910f02c03p23: [I]: create_bond_interface ifname=bond0 slave_ports=eth2,eth3 c910f02c03p23: [I]: Pickup xcatnet, "", from NICNETWORKS for interface "bond0". 
c910f02c03p23: [I]: ip link set bond0 down c910f02c03p23: [I]: [bond.down] >> 5: bond0: <BROADCAST,MULTICAST,MASTER> mtu 1500 qdisc noqueue state DOWN mode DEFAULT c910f02c03p23: [I]: [bond.down] >> link/ether 16:3f:4d:b4:fe:04 brd ff:ff:ff:ff:ff:ff c910f02c03p23: [I]: [bond.slavesAft] >> eth1 eth2 c910f02c03p23: [I]: create_persistent_ifcfg ifname=eth2 inattrs=ONBOOT=yes,USERCTL=no,TYPE=Ethernet,SLAVE=yes,MASTER=bond0,BOOTPROTO=none c910f02c03p23: ['ifcfg-eth2'] c910f02c03p23: [I]: >> DEVICE="eth2" c910f02c03p23: [I]: >> BOOTPROTO="none" c910f02c03p23: [I]: >> NAME="eth2" c910f02c03p23: [I]: >> ONBOOT="yes" c910f02c03p23: [I]: >> USERCTL="no" c910f02c03p23: [I]: >> TYPE="Ethernet" c910f02c03p23: [I]: >> SLAVE="yes" c910f02c03p23: [I]: >> MASTER="bond0" c910f02c03p23: Cannot find device "eth3" c910f02c03p23: [I]: ip link set eth3 down c910f02c03p23: Device "eth3" does not exist. c910f02c03p23: ./nicutils.sh: line 1270: echo: write error: No such device c910f02c03p23: [I]: create_persistent_ifcfg ifname=eth3 inattrs=ONBOOT=yes,USERCTL=no,TYPE=Ethernet,SLAVE=yes,MASTER=bond0,BOOTPROTO=none c910f02c03p23: ['ifcfg-eth3'] c910f02c03p23: [I]: >> DEVICE="eth3" c910f02c03p23: [I]: >> BOOTPROTO="none" c910f02c03p23: [I]: >> NAME="eth3" c910f02c03p23: [I]: >> ONBOOT="yes" c910f02c03p23: [I]: >> USERCTL="no" c910f02c03p23: [I]: >> TYPE="Ethernet" c910f02c03p23: [I]: >> SLAVE="yes" c910f02c03p23: [I]: >> MASTER="bond0" c910f02c03p23: [I]: [bond.slavesNew] >> eth1 eth2 c910f02c03p23: [I]: ip link set bond0 up c910f02c03p23: [I]: State of "bond0" was "DOWN" instead of expected "UP". Wait 0 of 200 with interval 1. 
c910f02c03p23: [I]: [ip.link] >> 5: bond0: <BROADCAST,MULTICAST,MASTER,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP mode DEFAULT c910f02c03p23: [I]: [ip.link] >> link/ether 16:3f:4d:b4:fe:04 brd ff:ff:ff:ff:ff:ff c910f02c03p23: [I]: create_persistent_ifcfg ifname=bond0 xcatnet= inattrs=ONBOOT=yes,USERCTL=no,TYPE=Bond,BONDING_MASTER=yes,BONDING_OPTS='mode=802.3ad miimon=100',BOOTPROTO=none,DHCLIENTARGS='-timeout 200' c910f02c03p23: ['ifcfg-bond0'] c910f02c03p23: [I]: >> DEVICE="bond0" c910f02c03p23: [I]: >> BOOTPROTO="none" c910f02c03p23: [I]: >> NAME="bond0" c910f02c03p23: [I]: >> BONDING_MASTER="yes" c910f02c03p23: [I]: >> ONBOOT="yes" c910f02c03p23: [I]: >> USERCTL="no" c910f02c03p23: [I]: >> TYPE="Bond" c910f02c03p23: [I]: >> BONDING_OPTS="mode=802.3ad miimon=100" c910f02c03p23: [I]: >> DHCLIENTARGS="-timeout 200" c910f02c03p23: ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ c910f02c03p23: configure nic and its device : bond1 eth1@eth4 c910f02c03p23: [I]: create_bond_interface ifname=bond0 slave_ports=eth1,eth4 c910f02c03p23: [I]: Pickup xcatnet, "", from NICNETWORKS for interface "bond0". 
c910f02c03p23: [I]: ip link set bond0 down c910f02c03p23: [I]: [bond.down] >> 5: bond0: <BROADCAST,MULTICAST,MASTER> mtu 1500 qdisc noqueue state DOWN mode DEFAULT c910f02c03p23: [I]: [bond.down] >> link/ether 16:3f:4d:b4:fe:04 brd ff:ff:ff:ff:ff:ff c910f02c03p23: [I]: [bond.slavesAft] >> eth1 eth2 c910f02c03p23: [I]: create_persistent_ifcfg ifname=eth1 inattrs=ONBOOT=yes,USERCTL=no,TYPE=Ethernet,SLAVE=yes,MASTER=bond0,BOOTPROTO=none c910f02c03p23: ['ifcfg-eth1'] c910f02c03p23: [I]: >> DEVICE="eth1" c910f02c03p23: [I]: >> BOOTPROTO="none" c910f02c03p23: [I]: >> NAME="eth1" c910f02c03p23: [I]: >> ONBOOT="yes" c910f02c03p23: [I]: >> USERCTL="no" c910f02c03p23: [I]: >> TYPE="Ethernet" c910f02c03p23: [I]: >> SLAVE="yes" c910f02c03p23: [I]: >> MASTER="bond0" c910f02c03p23: Cannot find device "eth4" c910f02c03p23: [I]: ip link set eth4 down c910f02c03p23: Device "eth4" does not exist. c910f02c03p23: ./nicutils.sh: line 1270: echo: write error: No such device c910f02c03p23: [I]: create_persistent_ifcfg ifname=eth4 inattrs=ONBOOT=yes,USERCTL=no,TYPE=Ethernet,SLAVE=yes,MASTER=bond0,BOOTPROTO=none c910f02c03p23: ['ifcfg-eth4'] c910f02c03p23: [I]: >> DEVICE="eth4" c910f02c03p23: [I]: >> BOOTPROTO="none" c910f02c03p23: [I]: >> NAME="eth4" c910f02c03p23: [I]: >> ONBOOT="yes" c910f02c03p23: [I]: >> USERCTL="no" c910f02c03p23: [I]: >> TYPE="Ethernet" c910f02c03p23: [I]: >> SLAVE="yes" c910f02c03p23: [I]: >> MASTER="bond0" c910f02c03p23: [I]: [bond.slavesNew] >> eth1 eth2 c910f02c03p23: [I]: ip link set bond0 up c910f02c03p23: [I]: [ip.link] >> 5: bond0: <BROADCAST,MULTICAST,MASTER,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP mode DEFAULT c910f02c03p23: [I]: [ip.link] >> link/ether 16:3f:4d:b4:fe:04 brd ff:ff:ff:ff:ff:ff c910f02c03p23: [I]: create_persistent_ifcfg ifname=bond0 xcatnet= inattrs=ONBOOT=yes,USERCTL=no,TYPE=Bond,BONDING_MASTER=yes,BONDING_OPTS='mode=802.3ad miimon=100',BOOTPROTO=none,DHCLIENTARGS='-timeout 200' c910f02c03p23: ['ifcfg-bond0'] c910f02c03p23: [I]: >> 
DEVICE="bond0" c910f02c03p23: [I]: >> BOOTPROTO="none" c910f02c03p23: [I]: >> NAME="bond0" c910f02c03p23: [I]: >> BONDING_MASTER="yes" c910f02c03p23: [I]: >> ONBOOT="yes" c910f02c03p23: [I]: >> USERCTL="no" c910f02c03p23: [I]: >> TYPE="Bond" c910f02c03p23: [I]: >> BONDING_OPTS="mode=802.3ad miimon=100" c910f02c03p23: [I]: >> DHCLIENTARGS="-timeout 200" c910f02c03p23: [I]: State of "bond0" was "DOWN" instead of expected "UP". Wait 0 of 200 with interval 1. c910f02c03p23: Postscript: confignetwork exited with code 0 c910f02c03p23: Running of postscripts has completed. Below are the network configuration files for all the four network adapters. As you can see here, all adapters goes to ``bond0``. Which is incorrect. # head -n 999 /etc/sysconfig/network-scripts/ifcfg-eth{1,2,3,4} ==> /etc/sysconfig/network-scripts/ifcfg-eth1 <== DEVICE="eth1" BOOTPROTO="none" NAME="eth1" ONBOOT="yes" USERCTL="no" TYPE="Ethernet" SLAVE="yes" MASTER="bond0" ==> /etc/sysconfig/network-scripts/ifcfg-eth2 <== DEVICE="eth2" BOOTPROTO="none" NAME="eth2" ONBOOT="yes" USERCTL="no" TYPE="Ethernet" SLAVE="yes" MASTER="bond0" ==> /etc/sysconfig/network-scripts/ifcfg-eth3 <== DEVICE="eth3" BOOTPROTO="none" NAME="eth3" ONBOOT="yes" USERCTL="no" TYPE="Ethernet" SLAVE="yes" MASTER="bond0" ==> /etc/sysconfig/network-scripts/ifcfg-eth4 <== DEVICE="eth4" BOOTPROTO="none" NAME="eth4" ONBOOT="yes" USERCTL="no" TYPE="Ethernet" SLAVE="yes" MASTER="bond0"
1.0
The postscript confignetwork can create one link aggregation device only - Below is the nicdevices and nictypes attributes of node ``c910f02c03p23``. [root@c910f02c03p27 ~]# lsdef c910f02c03p23 -i nicdevices,nictypes Object name: c910f02c03p23 nicdevices.bond0=eth2|eth3 nicdevices.bond1=eth1|eth4 nictypes.eth4=ethernet nictypes.eth2=ethernet nictypes.eth3=ethernet nictypes.bond0=bond nictypes.bond1=bond nictypes.eth1=ethernet Only one link aggregation device can be defined. All the network adapters belong to the remain link aggregation devices are wrongly configured to join the first link aggregation device. # updatenode c910f02c03p23 confignetwork c910f02c03p23: xcatdsklspost: updating VPD database c910f02c03p23: xcatdsklspost: downloaded postscripts successfully c910f02c03p23: Wed Jan 20 06:14:08 EST 2016 Running postscript: confignetwork c910f02c03p23: [I]: NetworkManager is inactive. c910f02c03p23: [I]: All valid nics and device list: c910f02c03p23: [I]: bond0 eth2@eth3 c910f02c03p23: [I]: bond1 eth1@eth4 c910f02c03p23: ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ c910f02c03p23: configure nic and its device : bond0 eth2@eth3 c910f02c03p23: [I]: create_bond_interface ifname=bond0 slave_ports=eth2,eth3 c910f02c03p23: [I]: Pickup xcatnet, "", from NICNETWORKS for interface "bond0". 
c910f02c03p23: [I]: ip link set bond0 down c910f02c03p23: [I]: [bond.down] >> 5: bond0: <BROADCAST,MULTICAST,MASTER> mtu 1500 qdisc noqueue state DOWN mode DEFAULT c910f02c03p23: [I]: [bond.down] >> link/ether 16:3f:4d:b4:fe:04 brd ff:ff:ff:ff:ff:ff c910f02c03p23: [I]: [bond.slavesAft] >> eth1 eth2 c910f02c03p23: [I]: create_persistent_ifcfg ifname=eth2 inattrs=ONBOOT=yes,USERCTL=no,TYPE=Ethernet,SLAVE=yes,MASTER=bond0,BOOTPROTO=none c910f02c03p23: ['ifcfg-eth2'] c910f02c03p23: [I]: >> DEVICE="eth2" c910f02c03p23: [I]: >> BOOTPROTO="none" c910f02c03p23: [I]: >> NAME="eth2" c910f02c03p23: [I]: >> ONBOOT="yes" c910f02c03p23: [I]: >> USERCTL="no" c910f02c03p23: [I]: >> TYPE="Ethernet" c910f02c03p23: [I]: >> SLAVE="yes" c910f02c03p23: [I]: >> MASTER="bond0" c910f02c03p23: Cannot find device "eth3" c910f02c03p23: [I]: ip link set eth3 down c910f02c03p23: Device "eth3" does not exist. c910f02c03p23: ./nicutils.sh: line 1270: echo: write error: No such device c910f02c03p23: [I]: create_persistent_ifcfg ifname=eth3 inattrs=ONBOOT=yes,USERCTL=no,TYPE=Ethernet,SLAVE=yes,MASTER=bond0,BOOTPROTO=none c910f02c03p23: ['ifcfg-eth3'] c910f02c03p23: [I]: >> DEVICE="eth3" c910f02c03p23: [I]: >> BOOTPROTO="none" c910f02c03p23: [I]: >> NAME="eth3" c910f02c03p23: [I]: >> ONBOOT="yes" c910f02c03p23: [I]: >> USERCTL="no" c910f02c03p23: [I]: >> TYPE="Ethernet" c910f02c03p23: [I]: >> SLAVE="yes" c910f02c03p23: [I]: >> MASTER="bond0" c910f02c03p23: [I]: [bond.slavesNew] >> eth1 eth2 c910f02c03p23: [I]: ip link set bond0 up c910f02c03p23: [I]: State of "bond0" was "DOWN" instead of expected "UP". Wait 0 of 200 with interval 1. 
c910f02c03p23: [I]: [ip.link] >> 5: bond0: <BROADCAST,MULTICAST,MASTER,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP mode DEFAULT c910f02c03p23: [I]: [ip.link] >> link/ether 16:3f:4d:b4:fe:04 brd ff:ff:ff:ff:ff:ff c910f02c03p23: [I]: create_persistent_ifcfg ifname=bond0 xcatnet= inattrs=ONBOOT=yes,USERCTL=no,TYPE=Bond,BONDING_MASTER=yes,BONDING_OPTS='mode=802.3ad miimon=100',BOOTPROTO=none,DHCLIENTARGS='-timeout 200' c910f02c03p23: ['ifcfg-bond0'] c910f02c03p23: [I]: >> DEVICE="bond0" c910f02c03p23: [I]: >> BOOTPROTO="none" c910f02c03p23: [I]: >> NAME="bond0" c910f02c03p23: [I]: >> BONDING_MASTER="yes" c910f02c03p23: [I]: >> ONBOOT="yes" c910f02c03p23: [I]: >> USERCTL="no" c910f02c03p23: [I]: >> TYPE="Bond" c910f02c03p23: [I]: >> BONDING_OPTS="mode=802.3ad miimon=100" c910f02c03p23: [I]: >> DHCLIENTARGS="-timeout 200" c910f02c03p23: ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ c910f02c03p23: configure nic and its device : bond1 eth1@eth4 c910f02c03p23: [I]: create_bond_interface ifname=bond0 slave_ports=eth1,eth4 c910f02c03p23: [I]: Pickup xcatnet, "", from NICNETWORKS for interface "bond0". 
c910f02c03p23: [I]: ip link set bond0 down c910f02c03p23: [I]: [bond.down] >> 5: bond0: <BROADCAST,MULTICAST,MASTER> mtu 1500 qdisc noqueue state DOWN mode DEFAULT c910f02c03p23: [I]: [bond.down] >> link/ether 16:3f:4d:b4:fe:04 brd ff:ff:ff:ff:ff:ff c910f02c03p23: [I]: [bond.slavesAft] >> eth1 eth2 c910f02c03p23: [I]: create_persistent_ifcfg ifname=eth1 inattrs=ONBOOT=yes,USERCTL=no,TYPE=Ethernet,SLAVE=yes,MASTER=bond0,BOOTPROTO=none c910f02c03p23: ['ifcfg-eth1'] c910f02c03p23: [I]: >> DEVICE="eth1" c910f02c03p23: [I]: >> BOOTPROTO="none" c910f02c03p23: [I]: >> NAME="eth1" c910f02c03p23: [I]: >> ONBOOT="yes" c910f02c03p23: [I]: >> USERCTL="no" c910f02c03p23: [I]: >> TYPE="Ethernet" c910f02c03p23: [I]: >> SLAVE="yes" c910f02c03p23: [I]: >> MASTER="bond0" c910f02c03p23: Cannot find device "eth4" c910f02c03p23: [I]: ip link set eth4 down c910f02c03p23: Device "eth4" does not exist. c910f02c03p23: ./nicutils.sh: line 1270: echo: write error: No such device c910f02c03p23: [I]: create_persistent_ifcfg ifname=eth4 inattrs=ONBOOT=yes,USERCTL=no,TYPE=Ethernet,SLAVE=yes,MASTER=bond0,BOOTPROTO=none c910f02c03p23: ['ifcfg-eth4'] c910f02c03p23: [I]: >> DEVICE="eth4" c910f02c03p23: [I]: >> BOOTPROTO="none" c910f02c03p23: [I]: >> NAME="eth4" c910f02c03p23: [I]: >> ONBOOT="yes" c910f02c03p23: [I]: >> USERCTL="no" c910f02c03p23: [I]: >> TYPE="Ethernet" c910f02c03p23: [I]: >> SLAVE="yes" c910f02c03p23: [I]: >> MASTER="bond0" c910f02c03p23: [I]: [bond.slavesNew] >> eth1 eth2 c910f02c03p23: [I]: ip link set bond0 up c910f02c03p23: [I]: [ip.link] >> 5: bond0: <BROADCAST,MULTICAST,MASTER,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP mode DEFAULT c910f02c03p23: [I]: [ip.link] >> link/ether 16:3f:4d:b4:fe:04 brd ff:ff:ff:ff:ff:ff c910f02c03p23: [I]: create_persistent_ifcfg ifname=bond0 xcatnet= inattrs=ONBOOT=yes,USERCTL=no,TYPE=Bond,BONDING_MASTER=yes,BONDING_OPTS='mode=802.3ad miimon=100',BOOTPROTO=none,DHCLIENTARGS='-timeout 200' c910f02c03p23: ['ifcfg-bond0'] c910f02c03p23: [I]: >> 
DEVICE="bond0" c910f02c03p23: [I]: >> BOOTPROTO="none" c910f02c03p23: [I]: >> NAME="bond0" c910f02c03p23: [I]: >> BONDING_MASTER="yes" c910f02c03p23: [I]: >> ONBOOT="yes" c910f02c03p23: [I]: >> USERCTL="no" c910f02c03p23: [I]: >> TYPE="Bond" c910f02c03p23: [I]: >> BONDING_OPTS="mode=802.3ad miimon=100" c910f02c03p23: [I]: >> DHCLIENTARGS="-timeout 200" c910f02c03p23: [I]: State of "bond0" was "DOWN" instead of expected "UP". Wait 0 of 200 with interval 1. c910f02c03p23: Postscript: confignetwork exited with code 0 c910f02c03p23: Running of postscripts has completed. Below are the network configuration files for all the four network adapters. As you can see here, all adapters goes to ``bond0``. Which is incorrect. # head -n 999 /etc/sysconfig/network-scripts/ifcfg-eth{1,2,3,4} ==> /etc/sysconfig/network-scripts/ifcfg-eth1 <== DEVICE="eth1" BOOTPROTO="none" NAME="eth1" ONBOOT="yes" USERCTL="no" TYPE="Ethernet" SLAVE="yes" MASTER="bond0" ==> /etc/sysconfig/network-scripts/ifcfg-eth2 <== DEVICE="eth2" BOOTPROTO="none" NAME="eth2" ONBOOT="yes" USERCTL="no" TYPE="Ethernet" SLAVE="yes" MASTER="bond0" ==> /etc/sysconfig/network-scripts/ifcfg-eth3 <== DEVICE="eth3" BOOTPROTO="none" NAME="eth3" ONBOOT="yes" USERCTL="no" TYPE="Ethernet" SLAVE="yes" MASTER="bond0" ==> /etc/sysconfig/network-scripts/ifcfg-eth4 <== DEVICE="eth4" BOOTPROTO="none" NAME="eth4" ONBOOT="yes" USERCTL="no" TYPE="Ethernet" SLAVE="yes" MASTER="bond0"
non_test
the postscript confignetwork can create one link aggregation device only below is the nicdevices and nictypes attributes of node lsdef i nicdevices nictypes object name nicdevices nicdevices nictypes ethernet nictypes ethernet nictypes ethernet nictypes bond nictypes bond nictypes ethernet only one link aggregation device can be defined all the network adapters belong to the remain link aggregation devices are wrongly configured to join the first link aggregation device updatenode confignetwork xcatdsklspost updating vpd database xcatdsklspost downloaded postscripts successfully wed jan est running postscript confignetwork networkmanager is inactive all valid nics and device list configure nic and its device create bond interface ifname slave ports pickup xcatnet from nicnetworks for interface ip link set down mtu qdisc noqueue state down mode default link ether fe brd ff ff ff ff ff ff create persistent ifcfg ifname inattrs onboot yes userctl no type ethernet slave yes master bootproto none device bootproto none name onboot yes userctl no type ethernet slave yes master cannot find device ip link set down device does not exist nicutils sh line echo write error no such device create persistent ifcfg ifname inattrs onboot yes userctl no type ethernet slave yes master bootproto none device bootproto none name onboot yes userctl no type ethernet slave yes master ip link set up state of was down instead of expected up wait of with interval mtu qdisc noqueue state up mode default link ether fe brd ff ff ff ff ff ff create persistent ifcfg ifname xcatnet inattrs onboot yes userctl no type bond bonding master yes bonding opts mode miimon bootproto none dhclientargs timeout device bootproto none name bonding master yes onboot yes userctl no type bond bonding opts mode miimon dhclientargs timeout configure nic and its device create bond interface ifname slave ports pickup xcatnet from nicnetworks for interface ip link set down mtu qdisc noqueue state down mode default link 
ether fe brd ff ff ff ff ff ff create persistent ifcfg ifname inattrs onboot yes userctl no type ethernet slave yes master bootproto none device bootproto none name onboot yes userctl no type ethernet slave yes master cannot find device ip link set down device does not exist nicutils sh line echo write error no such device create persistent ifcfg ifname inattrs onboot yes userctl no type ethernet slave yes master bootproto none device bootproto none name onboot yes userctl no type ethernet slave yes master ip link set up mtu qdisc noqueue state up mode default link ether fe brd ff ff ff ff ff ff create persistent ifcfg ifname xcatnet inattrs onboot yes userctl no type bond bonding master yes bonding opts mode miimon bootproto none dhclientargs timeout device bootproto none name bonding master yes onboot yes userctl no type bond bonding opts mode miimon dhclientargs timeout state of was down instead of expected up wait of with interval postscript confignetwork exited with code running of postscripts has completed below are the network configuration files for all the four network adapters as you can see here all adapters goes to which is incorrect head n etc sysconfig network scripts ifcfg eth etc sysconfig network scripts ifcfg device bootproto none name onboot yes userctl no type ethernet slave yes master etc sysconfig network scripts ifcfg device bootproto none name onboot yes userctl no type ethernet slave yes master etc sysconfig network scripts ifcfg device bootproto none name onboot yes userctl no type ethernet slave yes master etc sysconfig network scripts ifcfg device bootproto none name onboot yes userctl no type ethernet slave yes master
0
334,965
10,147,420,667
IssuesEvent
2019-08-05 10:31:30
mozilla/addons-server
https://api.github.com/repos/mozilla/addons-server
closed
log add-on guid reuse to our logging pipeline
component: devhub priority: p4 state: pull request ready
from https://github.com/mozilla/addons-server/pull/11824/ @EnTeQuAk said: > Just an improvement idea: log add-on guid reuse to our logging pipeline, so that maybe in the future it can be part of malicious detection etc.
1.0
log add-on guid reuse to our logging pipeline - from https://github.com/mozilla/addons-server/pull/11824/ @EnTeQuAk said: > Just an improvement idea: log add-on guid reuse to our logging pipeline, so that maybe in the future it can be part of malicious detection etc.
non_test
log add on guid reuse to our logging pipeline from entequak said just an improvement idea log add on guid reuse to our logging pipeline so that maybe in the future it can be part of malicious detection etc
0
218,513
16,994,665,084
IssuesEvent
2021-07-01 03:56:11
blynkkk/blynk_Issues
https://api.github.com/repos/blynkkk/blynk_Issues
closed
Upgrade flair still shows after subscribing
billing iOS ready to test
I paid for a subscription today, but I still see the icon/button flair for the upgrade call-to-action (this is why I upgraded, to get rid of these... and also to use the controls). ![image](https://user-images.githubusercontent.com/313427/123843871-42e2eb00-d90a-11eb-8871-f51617762089.png)
1.0
Upgrade flair still shows after subscribing - I paid for a subscription today, but I still see the icon/button flair for the upgrade call-to-action (this is why I upgraded, to get rid of these... and also to use the controls). ![image](https://user-images.githubusercontent.com/313427/123843871-42e2eb00-d90a-11eb-8871-f51617762089.png)
test
upgrade flair still shows after subscribing i paid for a subscription today but i still see the icon button flair for the upgrade call to action this is why i upgraded to get rid of these and also to use the controls
1
136,369
11,047,445,446
IssuesEvent
2019-12-09 18:59:24
dexpenses/dexpenses-extract
https://api.github.com/repos/dexpenses/dexpenses-extract
closed
Implement test receipt normal/hog-tierpark-sababurg-long-cash
enhancement test-data
Receipt to implement: ![normal/hog-tierpark-sababurg-long-cash](https://firebasestorage.googleapis.com/v0/b/dexpenses-207219-test-images/o/normal%2Fhog-tierpark-sababurg-long-cash.JPEG?alt=media "normal/hog-tierpark-sababurg-long-cash")
1.0
Implement test receipt normal/hog-tierpark-sababurg-long-cash - Receipt to implement: ![normal/hog-tierpark-sababurg-long-cash](https://firebasestorage.googleapis.com/v0/b/dexpenses-207219-test-images/o/normal%2Fhog-tierpark-sababurg-long-cash.JPEG?alt=media "normal/hog-tierpark-sababurg-long-cash")
test
implement test receipt normal hog tierpark sababurg long cash receipt to implement normal hog tierpark sababurg long cash
1
336,830
30,223,814,654
IssuesEvent
2023-07-05 21:46:37
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
sql/tests: TestRandomSyntaxSelect failed
C-test-failure O-robot T-sql-foundations branch-release-22.1
sql/tests.TestRandomSyntaxSelect [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=4583041&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=4583041&tab=artifacts#/) on release-22.1 @ [a1c1879e01ceee79a81693c67a1dba184b5fc1b1](https://github.com/cockroachdb/cockroach/commits/a1c1879e01ceee79a81693c67a1dba184b5fc1b1): ``` rsg_test.go:755: 1m20s of 5m0s: 9855 executions, 211 successful rsg_test.go:755: 1m25s of 5m0s: 10414 executions, 222 successful rsg_test.go:755: 1m30s of 5m0s: 10982 executions, 234 successful rsg_test.go:755: 1m35s of 5m0s: 11508 executions, 239 successful rsg_test.go:755: 1m40s of 5m0s: 12014 executions, 250 successful rsg_test.go:755: 1m45s of 5m0s: 12537 executions, 259 successful rsg_test.go:755: 1m50s of 5m0s: 13027 executions, 269 successful rsg_test.go:755: 1m55s of 5m0s: 13567 executions, 291 successful rsg_test.go:755: 2m0s of 5m0s: 14042 executions, 299 successful rsg_test.go:755: 2m5s of 5m0s: 14546 executions, 314 successful rsg_test.go:755: 2m10s of 5m0s: 15070 executions, 324 successful rsg_test.go:755: 2m15s of 5m0s: 15564 executions, 334 successful rsg_test.go:755: 2m20s of 5m0s: 16062 executions, 345 successful rsg_test.go:755: 2m25s of 5m0s: 16594 executions, 354 successful rsg_test.go:755: 2m30s of 5m0s: 17090 executions, 365 successful rsg_test.go:755: 2m35s of 5m0s: 17590 executions, 380 successful rsg_test.go:755: 2m40s of 5m0s: 18140 executions, 397 successful rsg_test.go:755: 2m45s of 5m0s: 18642 executions, 413 successful rsg_test.go:755: 2m50s of 5m0s: 19130 executions, 427 successful rsg_test.go:755: 2m55s of 5m0s: 19641 executions, 440 successful rsg_test.go:755: 3m0s of 5m0s: 20148 executions, 450 successful rsg_test.go:755: 3m5s of 5m0s: 20634 executions, 463 successful rsg_test.go:755: 3m10s of 5m0s: 21145 executions, 475 successful rsg_test.go:755: 3m15s of 5m0s: 21631 executions, 483 successful rsg_test.go:755: 3m20s of 5m0s: 22115 executions, 492 successful 
rsg_test.go:755: 3m25s of 5m0s: 22603 executions, 511 successful rsg_test.go:755: 3m30s of 5m0s: 23094 executions, 524 successful rsg_test.go:755: 3m35s of 5m0s: 23581 executions, 543 successful rsg_test.go:755: 3m40s of 5m0s: 24050 executions, 546 successful rsg_test.go:755: 3m45s of 5m0s: 24526 executions, 555 successful rsg_test.go:755: 3m50s of 5m0s: 24993 executions, 566 successful rsg_test.go:755: 3m55s of 5m0s: 25478 executions, 578 successful rsg_test.go:755: 4m0s of 5m0s: 25975 executions, 589 successful rsg_test.go:755: 4m5s of 5m0s: 26457 executions, 601 successful rsg_test.go:755: 4m10s of 5m0s: 26937 executions, 612 successful rsg_test.go:755: 4m15s of 5m0s: 27426 executions, 625 successful rsg_test.go:755: 4m20s of 5m0s: 27894 executions, 637 successful rsg_test.go:755: 4m25s of 5m0s: 28346 executions, 646 successful rsg_test.go:755: 4m30s of 5m0s: 28784 executions, 653 successful rsg_test.go:755: 4m35s of 5m0s: 29271 executions, 667 successful rsg_test.go:755: 4m40s of 5m0s: 29769 executions, 682 successful rsg_test.go:755: 4m45s of 5m0s: 30250 executions, 695 successful rsg_test.go:755: 4m50s of 5m0s: 30694 executions, 703 successful rsg_test.go:755: 4m55s of 5m0s: 31187 executions, 716 successful rsg_test.go:755: 5m0s of 5m0s: 31613 executions, 725 successful rsg_test.go:791: 31620 executions, 725 successful rsg_test.go:799: cannot parse output of Format: sql="SELECT EXISTS ( ( ( WITH RECURSIVE EXPIRATION AS MATERIALIZED ( BACKUP POINTM INTO LATEST IN 'string' WITH REVISION_HISTORY ) , BOX2D AS NOT MATERIALIZED ( RESTORE SYSTEM USERS FROM ( 'string' , 'string' , POINT ) ) TABLE error ORDER BY INDEX FAMILY . ident . 
ident @ FAMILY DESC ) ) ) AS BOOLEAN FROM ident ", formattedSQL="SELECT EXISTS (((WITH RECURSIVE expiration AS MATERIALIZED (BACKUP TABLE pointm INTO LATEST IN 'string' WITH revision_history), \"box2d\" AS NOT MATERIALIZED (RESTORE TABLE FROM ('string', 'string', 'point')) TABLE error ORDER BY INDEX \"family\".ident.ident@family DESC))) AS boolean FROM ident": at or near "from": syntax error rsg_test.go:275: -- test log scope end -- test logs left over in: /go/src/github.com/cockroachdb/cockroach/artifacts/logTestRandomSyntaxSelect3445841199 --- FAIL: TestRandomSyntaxSelect (300.45s) ``` <details><summary>Help</summary> <p> See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM) </p> </details> /cc @cockroachdb/sql-experience <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestRandomSyntaxSelect.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> Jira issue: CRDB-13820
1.0
sql/tests: TestRandomSyntaxSelect failed - sql/tests.TestRandomSyntaxSelect [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=4583041&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=4583041&tab=artifacts#/) on release-22.1 @ [a1c1879e01ceee79a81693c67a1dba184b5fc1b1](https://github.com/cockroachdb/cockroach/commits/a1c1879e01ceee79a81693c67a1dba184b5fc1b1): ``` rsg_test.go:755: 1m20s of 5m0s: 9855 executions, 211 successful rsg_test.go:755: 1m25s of 5m0s: 10414 executions, 222 successful rsg_test.go:755: 1m30s of 5m0s: 10982 executions, 234 successful rsg_test.go:755: 1m35s of 5m0s: 11508 executions, 239 successful rsg_test.go:755: 1m40s of 5m0s: 12014 executions, 250 successful rsg_test.go:755: 1m45s of 5m0s: 12537 executions, 259 successful rsg_test.go:755: 1m50s of 5m0s: 13027 executions, 269 successful rsg_test.go:755: 1m55s of 5m0s: 13567 executions, 291 successful rsg_test.go:755: 2m0s of 5m0s: 14042 executions, 299 successful rsg_test.go:755: 2m5s of 5m0s: 14546 executions, 314 successful rsg_test.go:755: 2m10s of 5m0s: 15070 executions, 324 successful rsg_test.go:755: 2m15s of 5m0s: 15564 executions, 334 successful rsg_test.go:755: 2m20s of 5m0s: 16062 executions, 345 successful rsg_test.go:755: 2m25s of 5m0s: 16594 executions, 354 successful rsg_test.go:755: 2m30s of 5m0s: 17090 executions, 365 successful rsg_test.go:755: 2m35s of 5m0s: 17590 executions, 380 successful rsg_test.go:755: 2m40s of 5m0s: 18140 executions, 397 successful rsg_test.go:755: 2m45s of 5m0s: 18642 executions, 413 successful rsg_test.go:755: 2m50s of 5m0s: 19130 executions, 427 successful rsg_test.go:755: 2m55s of 5m0s: 19641 executions, 440 successful rsg_test.go:755: 3m0s of 5m0s: 20148 executions, 450 successful rsg_test.go:755: 3m5s of 5m0s: 20634 executions, 463 successful rsg_test.go:755: 3m10s of 5m0s: 21145 executions, 475 successful rsg_test.go:755: 3m15s of 5m0s: 21631 executions, 483 successful rsg_test.go:755: 3m20s of 
5m0s: 22115 executions, 492 successful rsg_test.go:755: 3m25s of 5m0s: 22603 executions, 511 successful rsg_test.go:755: 3m30s of 5m0s: 23094 executions, 524 successful rsg_test.go:755: 3m35s of 5m0s: 23581 executions, 543 successful rsg_test.go:755: 3m40s of 5m0s: 24050 executions, 546 successful rsg_test.go:755: 3m45s of 5m0s: 24526 executions, 555 successful rsg_test.go:755: 3m50s of 5m0s: 24993 executions, 566 successful rsg_test.go:755: 3m55s of 5m0s: 25478 executions, 578 successful rsg_test.go:755: 4m0s of 5m0s: 25975 executions, 589 successful rsg_test.go:755: 4m5s of 5m0s: 26457 executions, 601 successful rsg_test.go:755: 4m10s of 5m0s: 26937 executions, 612 successful rsg_test.go:755: 4m15s of 5m0s: 27426 executions, 625 successful rsg_test.go:755: 4m20s of 5m0s: 27894 executions, 637 successful rsg_test.go:755: 4m25s of 5m0s: 28346 executions, 646 successful rsg_test.go:755: 4m30s of 5m0s: 28784 executions, 653 successful rsg_test.go:755: 4m35s of 5m0s: 29271 executions, 667 successful rsg_test.go:755: 4m40s of 5m0s: 29769 executions, 682 successful rsg_test.go:755: 4m45s of 5m0s: 30250 executions, 695 successful rsg_test.go:755: 4m50s of 5m0s: 30694 executions, 703 successful rsg_test.go:755: 4m55s of 5m0s: 31187 executions, 716 successful rsg_test.go:755: 5m0s of 5m0s: 31613 executions, 725 successful rsg_test.go:791: 31620 executions, 725 successful rsg_test.go:799: cannot parse output of Format: sql="SELECT EXISTS ( ( ( WITH RECURSIVE EXPIRATION AS MATERIALIZED ( BACKUP POINTM INTO LATEST IN 'string' WITH REVISION_HISTORY ) , BOX2D AS NOT MATERIALIZED ( RESTORE SYSTEM USERS FROM ( 'string' , 'string' , POINT ) ) TABLE error ORDER BY INDEX FAMILY . ident . 
ident @ FAMILY DESC ) ) ) AS BOOLEAN FROM ident ", formattedSQL="SELECT EXISTS (((WITH RECURSIVE expiration AS MATERIALIZED (BACKUP TABLE pointm INTO LATEST IN 'string' WITH revision_history), \"box2d\" AS NOT MATERIALIZED (RESTORE TABLE FROM ('string', 'string', 'point')) TABLE error ORDER BY INDEX \"family\".ident.ident@family DESC))) AS boolean FROM ident": at or near "from": syntax error rsg_test.go:275: -- test log scope end -- test logs left over in: /go/src/github.com/cockroachdb/cockroach/artifacts/logTestRandomSyntaxSelect3445841199 --- FAIL: TestRandomSyntaxSelect (300.45s) ``` <details><summary>Help</summary> <p> See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM) </p> </details> /cc @cockroachdb/sql-experience <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestRandomSyntaxSelect.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> Jira issue: CRDB-13820
test
sql tests testrandomsyntaxselect failed sql tests testrandomsyntaxselect with on release rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go executions successful rsg test go cannot parse output of format sql select exists with recursive expiration as materialized backup pointm into latest in string with revision history as not materialized restore system users from 
string string point table error order by index family ident ident family desc as boolean from ident formattedsql select exists with recursive expiration as materialized backup table pointm into latest in string with revision history as not materialized restore table from string string point table error order by index family ident ident family desc as boolean from ident at or near from syntax error rsg test go test log scope end test logs left over in go src github com cockroachdb cockroach artifacts fail testrandomsyntaxselect help see also cc cockroachdb sql experience jira issue crdb
1
61,674
6,746,380,592
IssuesEvent
2017-10-21 01:22:30
fossasia/badgeyay
https://api.github.com/repos/fossasia/badgeyay
closed
Fix Travis tests
hacktoberfest priority test-and-quality
**I'm submitting a ...** - [X] bug report **Current behavior:** <!-- How the bug manifests. --> Currently, travis is failing. https://travis-ci.org/fossasia/badgeyay/builds/278248341?utm_source=github_status&utm_medium=notification **Expected behavior:** <!-- Behavior would be without the bug. --> Travis should run all the tests #147 and succeed. **Steps to reproduce:** Please create a pull request to work on this issue and submit commits which fix the errors.
1.0
Fix Travis tests - **I'm submitting a ...** - [X] bug report **Current behavior:** <!-- How the bug manifests. --> Currently, travis is failing. https://travis-ci.org/fossasia/badgeyay/builds/278248341?utm_source=github_status&utm_medium=notification **Expected behavior:** <!-- Behavior would be without the bug. --> Travis should run all the tests #147 and succeed. **Steps to reproduce:** Please create a pull request to work on this issue and submit commits which fix the errors.
test
fix travis tests i m submitting a bug report current behavior currently travis is failing expected behavior travis should run all the tests and succeed steps to reproduce please create a pull request to work on this issue and submit commits which fix the errors
1
328,266
28,110,319,376
IssuesEvent
2023-03-31 06:33:35
void-linux/void-packages
https://api.github.com/repos/void-linux/void-packages
opened
wine 32bit: couldn't be able to find ntdll.so
bug needs-testing
### Is this a new report? Yes ### System Info Void 6.1.21_1 x86_64 GenuineIntel uptodate rrmFFF ### Package(s) Affected wine-32bit-8.4_1 ### Does a report exist for this bug with the project's home (upstream) and/or another distro? _No response_ ### Expected behaviour 32-bit version of wine should run just fine. ### Actual behaviour Launching wine (32-bit) literally throws that ntdll.so couldn't be found on its path. Running wine (64-bit) with its usual name, wine64 which runs just fine. #### Message wine (32-bit) ``` wine: could not load ntdll.so: /usr/libexec/wine/../../lib32/wine/i386-unix/ntdll.so: cannot open shared object file: No such file or directory ``` ### Steps to reproduce 1. Enable 32-bit repositories in Void. Usually with (`xbps-install -Sy void-repo-multilib`) 2. Install wine with its package name, wine-32bit with (`xbps-install -Sy wine-32bit`) 3. Open a terminal and run wine (which will now address 32-bit version of wine)
1.0
wine 32bit: couldn't be able to find ntdll.so - ### Is this a new report? Yes ### System Info Void 6.1.21_1 x86_64 GenuineIntel uptodate rrmFFF ### Package(s) Affected wine-32bit-8.4_1 ### Does a report exist for this bug with the project's home (upstream) and/or another distro? _No response_ ### Expected behaviour 32-bit version of wine should run just fine. ### Actual behaviour Launching wine (32-bit) literally throws that ntdll.so couldn't be found on its path. Running wine (64-bit) with its usual name, wine64 which runs just fine. #### Message wine (32-bit) ``` wine: could not load ntdll.so: /usr/libexec/wine/../../lib32/wine/i386-unix/ntdll.so: cannot open shared object file: No such file or directory ``` ### Steps to reproduce 1. Enable 32-bit repositories in Void. Usually with (`xbps-install -Sy void-repo-multilib`) 2. Install wine with its package name, wine-32bit with (`xbps-install -Sy wine-32bit`) 3. Open a terminal and run wine (which will now address 32-bit version of wine)
test
wine couldn t be able to find ntdll so is this a new report yes system info void genuineintel uptodate rrmfff package s affected wine does a report exist for this bug with the project s home upstream and or another distro no response expected behaviour bit version of wine should run just fine actual behaviour launching wine bit literally throws that ntdll so couldn t be found on its path running wine bit with its usual name which runs just fine message wine bit wine could not load ntdll so usr libexec wine wine unix ntdll so cannot open shared object file no such file or directory steps to reproduce enable bit repositories in void usually with xbps install sy void repo multilib install wine with its package name wine with xbps install sy wine open a terminal and run wine which will now address bit version of wine
1
170,436
13,187,086,977
IssuesEvent
2020-08-13 02:18:28
KMcNickel/HH-DMX-Console-Hardware
https://api.github.com/repos/KMcNickel/HH-DMX-Console-Hardware
closed
PWRON pull-up needs to be removed/modified
bodge tested
Removing the pull up prevents the capacitance from re-enabling the LTC3101 after the MCU shuts off. If a shorter button press is desired to turn the unit on, the PWRON pin can be shorted to the RESET pin and a pull-up can be added. HOWEVER, an investigation must be done to determine if there will be an issue with the SWD reset connection. One thought is this: If the PWRON is entirely dependent on the MCU, the power button must be held down during programming (SWD or USB). If the PWRON is tied to reset, the button will only need to be held during SWD programming, the unit should stay on by itself during USB programming.
1.0
PWRON pull-up needs to be removed/modified - Removing the pull up prevents the capacitance from re-enabling the LTC3101 after the MCU shuts off. If a shorter button press is desired to turn the unit on, the PWRON pin can be shorted to the RESET pin and a pull-up can be added. HOWEVER, an investigation must be done to determine if there will be an issue with the SWD reset connection. One thought is this: If the PWRON is entirely dependent on the MCU, the power button must be held down during programming (SWD or USB). If the PWRON is tied to reset, the button will only need to be held during SWD programming, the unit should stay on by itself during USB programming.
test
pwron pull up needs to be removed modified removing the pull up prevents the capacitance from re enabling the after the mcu shuts off if a shorter button press is desired to turn the unit on the pwron pin can be shorted to the reset pin and a pull up can be added however an investigation must be done to determine if there will be an issue with the swd reset connection one thought is this if the pwron is entirely dependent on the mcu the power button must be held down during programming swd or usb if the pwron is tied to reset the button will only need to be held during swd programming the unit should stay on by itself during usb programming
1
260,745
27,784,709,106
IssuesEvent
2023-03-17 01:30:38
michaeldotson/home-inventory-vue-app
https://api.github.com/repos/michaeldotson/home-inventory-vue-app
opened
CVE-2023-28155 (Medium) detected in request-2.88.0.tgz
Mend: dependency security vulnerability
## CVE-2023-28155 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>request-2.88.0.tgz</b></p></summary> <p>Simplified HTTP request client.</p> <p>Library home page: <a href="https://registry.npmjs.org/request/-/request-2.88.0.tgz">https://registry.npmjs.org/request/-/request-2.88.0.tgz</a></p> <p>Path to dependency file: /home-inventory-vue-app/package.json</p> <p>Path to vulnerable library: /node_modules/request/package.json</p> <p> Dependency Hierarchy: - cli-plugin-babel-3.5.1.tgz (Root Library) - cli-shared-utils-3.5.1.tgz - :x: **request-2.88.0.tgz** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> ** UNSUPPORTED WHEN ASSIGNED ** The Request package through 2.88.1 for Node.js allows a bypass of SSRF mitigations via an attacker-controller server that does a cross-protocol redirect (HTTP to HTTPS, or HTTPS to HTTP). NOTE: This vulnerability only affects products that are no longer supported by the maintainer. <p>Publish Date: 2023-03-16 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-28155>CVE-2023-28155</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2023-28155 (Medium) detected in request-2.88.0.tgz - ## CVE-2023-28155 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>request-2.88.0.tgz</b></p></summary> <p>Simplified HTTP request client.</p> <p>Library home page: <a href="https://registry.npmjs.org/request/-/request-2.88.0.tgz">https://registry.npmjs.org/request/-/request-2.88.0.tgz</a></p> <p>Path to dependency file: /home-inventory-vue-app/package.json</p> <p>Path to vulnerable library: /node_modules/request/package.json</p> <p> Dependency Hierarchy: - cli-plugin-babel-3.5.1.tgz (Root Library) - cli-shared-utils-3.5.1.tgz - :x: **request-2.88.0.tgz** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> ** UNSUPPORTED WHEN ASSIGNED ** The Request package through 2.88.1 for Node.js allows a bypass of SSRF mitigations via an attacker-controller server that does a cross-protocol redirect (HTTP to HTTPS, or HTTPS to HTTP). NOTE: This vulnerability only affects products that are no longer supported by the maintainer. <p>Publish Date: 2023-03-16 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-28155>CVE-2023-28155</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve medium detected in request tgz cve medium severity vulnerability vulnerable library request tgz simplified http request client library home page a href path to dependency file home inventory vue app package json path to vulnerable library node modules request package json dependency hierarchy cli plugin babel tgz root library cli shared utils tgz x request tgz vulnerable library vulnerability details unsupported when assigned the request package through for node js allows a bypass of ssrf mitigations via an attacker controller server that does a cross protocol redirect http to https or https to http note this vulnerability only affects products that are no longer supported by the maintainer publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href step up your open source security game with mend
0
52,194
13,211,405,555
IssuesEvent
2020-08-15 22:54:50
icecube-trac/tix4
https://api.github.com/repos/icecube-trac/tix4
opened
[photospline] Modern C++ Issues (Trac #1831)
Incomplete Migration Migrated from Trac combo reconstruction defect
<details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1831">https://code.icecube.wisc.edu/projects/icecube/ticket/1831</a>, reported by olivasand owned by jvansanten</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:12:38", "_ts": "1550067158057333", "description": "A few of the bots complain about linking photospline after the cpp11 switch was thrown.\n\nhttp://builds.icecube.wisc.edu/builders/OS%20X%20El%20Capitan/builds/851/steps/compile/logs/stdio", "reporter": "olivas", "cc": "", "resolution": "fixed", "time": "2016-08-19T18:31:51", "component": "combo reconstruction", "summary": "[photospline] Modern C++ Issues", "priority": "blocker", "keywords": "", "milestone": "", "owner": "jvansanten", "type": "defect" } ``` </p> </details>
1.0
[photospline] Modern C++ Issues (Trac #1831) - <details> <summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1831">https://code.icecube.wisc.edu/projects/icecube/ticket/1831</a>, reported by olivasand owned by jvansanten</em></summary> <p> ```json { "status": "closed", "changetime": "2019-02-13T14:12:38", "_ts": "1550067158057333", "description": "A few of the bots complain about linking photospline after the cpp11 switch was thrown.\n\nhttp://builds.icecube.wisc.edu/builders/OS%20X%20El%20Capitan/builds/851/steps/compile/logs/stdio", "reporter": "olivas", "cc": "", "resolution": "fixed", "time": "2016-08-19T18:31:51", "component": "combo reconstruction", "summary": "[photospline] Modern C++ Issues", "priority": "blocker", "keywords": "", "milestone": "", "owner": "jvansanten", "type": "defect" } ``` </p> </details>
non_test
modern c issues trac migrated from json status closed changetime ts description a few of the bots complain about linking photospline after the switch was thrown n n reporter olivas cc resolution fixed time component combo reconstruction summary modern c issues priority blocker keywords milestone owner jvansanten type defect
0
39,457
5,234,226,939
IssuesEvent
2017-01-30 15:09:45
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
github.com/cockroachdb/cockroach/pkg/kv: TestMultiRangeBoundedBatchDelRangeOverlappingKeys failed under stress
Robot test-failure
SHA: https://github.com/cockroachdb/cockroach/commits/4c53128707d07d268833ff5ccb5acee9d8720544 Parameters: ``` COCKROACH_PROPOSER_EVALUATED_KV=false TAGS=deadlock GOFLAGS= ``` Stress build found a failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=133235&tab=buildLog ``` W170130 11:00:14.309983 3087265 server/status/runtime.go:116 Could not parse build timestamp: parsing time "" as "2006/01/02 15:04:05": cannot parse "" as "2006" I170130 11:00:14.311208 3087265 server/config.go:456 1 storage engine initialized I170130 11:00:14.313891 3087265 server/node.go:444 [n?] store [n0,s0] not bootstrapped I170130 11:00:14.318787 3087776 storage/replica.go:4339 [n?,s1,r1/1:/M{in-ax},@c42825f200] gossip not initialized I170130 11:00:14.321214 3087265 server/node.go:373 [n?] **** cluster 05c0a6e5-392d-4cbe-ac01-3c068475f88e has been created I170130 11:00:14.321275 3087265 server/node.go:374 [n?] **** add additional nodes by specifying --join=127.0.0.1:35914 I170130 11:00:14.326024 3087265 storage/store.go:1255 [n1] [n1,s1]: failed initial metrics computation: [n1,s1]: system config not yet available I170130 11:00:14.326206 3087265 server/node.go:457 [n1] initialized store [n1,s1]: {Capacity:536870912 Available:536870912 RangeCount:1 LeaseCount:0} I170130 11:00:14.326315 3087265 server/node.go:342 [n1] node ID 1 initialized I170130 11:00:14.326414 3087265 gossip/gossip.go:293 [n1] NodeDescriptor set to node_id:1 address:<network_field:"tcp" address_field:"127.0.0.1:35914" > attrs:<> locality:<> I170130 11:00:14.326753 3087265 storage/stores.go:296 [n1] read 0 node addresses from persistent storage I170130 11:00:14.326866 3087265 server/node.go:589 [n1] connecting to gossip network to verify cluster ID... 
I170130 11:00:14.329342 3087265 server/node.go:613 [n1] node connected via gossip and verified as part of cluster "05c0a6e5-392d-4cbe-ac01-3c068475f88e" I170130 11:00:14.329422 3087265 server/node.go:392 [n1] node=1: started with [[]=] engine(s) and attributes [] I170130 11:00:14.329493 3087265 sql/executor.go:332 [n1] creating distSQLPlanner with address {tcp 127.0.0.1:35914} I170130 11:00:14.331381 3087265 server/server.go:629 [n1] starting https server at 127.0.0.1:40989 I170130 11:00:14.331454 3087265 server/server.go:630 [n1] starting grpc/postgres server at 127.0.0.1:35914 I170130 11:00:14.331508 3087265 server/server.go:631 [n1] advertising CockroachDB node at 127.0.0.1:35914 I170130 11:00:14.355226 3088612 sql/event_log.go:95 [n1] Event: "node_join", target: 1, info: {Descriptor:{NodeID:1 Address:{NetworkField:tcp AddressField:127.0.0.1:35914} Attrs: Locality:} ClusterID:05c0a6e5-392d-4cbe-ac01-3c068475f88e StartedAt:1485774014329395083} I170130 11:00:14.391460 3087265 sql/event_log.go:95 [n1] Event: "alter_table", target: 12, info: {TableName:eventlog Statement:ALTER TABLE system.eventlog ALTER COLUMN uniqueID SET DEFAULT uuid_v4() User:node MutationID:0 CascadeDroppedViews:[]} I170130 11:00:14.437169 3087265 server/server.go:686 [n1] done ensuring all necessary migrations have run I170130 11:00:14.437232 3087265 server/server.go:688 [n1] serving sql connections I170130 11:00:24.552383 3088741 vendor/google.golang.org/grpc/transport/http2_server.go:320 transport: http2Server.HandleStreams failed to read frame: read tcp 127.0.0.1:35914->127.0.0.1:41301: use of closed network connection I170130 11:00:24.552483 3089195 vendor/google.golang.org/grpc/transport/http2_client.go:1123 transport: http2Client.notifyError got notified that the client transport was broken EOF. test_server_shim.go:133: had 1 ranges at startup, expected 6 ```
1.0
github.com/cockroachdb/cockroach/pkg/kv: TestMultiRangeBoundedBatchDelRangeOverlappingKeys failed under stress - SHA: https://github.com/cockroachdb/cockroach/commits/4c53128707d07d268833ff5ccb5acee9d8720544 Parameters: ``` COCKROACH_PROPOSER_EVALUATED_KV=false TAGS=deadlock GOFLAGS= ``` Stress build found a failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=133235&tab=buildLog ``` W170130 11:00:14.309983 3087265 server/status/runtime.go:116 Could not parse build timestamp: parsing time "" as "2006/01/02 15:04:05": cannot parse "" as "2006" I170130 11:00:14.311208 3087265 server/config.go:456 1 storage engine initialized I170130 11:00:14.313891 3087265 server/node.go:444 [n?] store [n0,s0] not bootstrapped I170130 11:00:14.318787 3087776 storage/replica.go:4339 [n?,s1,r1/1:/M{in-ax},@c42825f200] gossip not initialized I170130 11:00:14.321214 3087265 server/node.go:373 [n?] **** cluster 05c0a6e5-392d-4cbe-ac01-3c068475f88e has been created I170130 11:00:14.321275 3087265 server/node.go:374 [n?] **** add additional nodes by specifying --join=127.0.0.1:35914 I170130 11:00:14.326024 3087265 storage/store.go:1255 [n1] [n1,s1]: failed initial metrics computation: [n1,s1]: system config not yet available I170130 11:00:14.326206 3087265 server/node.go:457 [n1] initialized store [n1,s1]: {Capacity:536870912 Available:536870912 RangeCount:1 LeaseCount:0} I170130 11:00:14.326315 3087265 server/node.go:342 [n1] node ID 1 initialized I170130 11:00:14.326414 3087265 gossip/gossip.go:293 [n1] NodeDescriptor set to node_id:1 address:<network_field:"tcp" address_field:"127.0.0.1:35914" > attrs:<> locality:<> I170130 11:00:14.326753 3087265 storage/stores.go:296 [n1] read 0 node addresses from persistent storage I170130 11:00:14.326866 3087265 server/node.go:589 [n1] connecting to gossip network to verify cluster ID... 
I170130 11:00:14.329342 3087265 server/node.go:613 [n1] node connected via gossip and verified as part of cluster "05c0a6e5-392d-4cbe-ac01-3c068475f88e" I170130 11:00:14.329422 3087265 server/node.go:392 [n1] node=1: started with [[]=] engine(s) and attributes [] I170130 11:00:14.329493 3087265 sql/executor.go:332 [n1] creating distSQLPlanner with address {tcp 127.0.0.1:35914} I170130 11:00:14.331381 3087265 server/server.go:629 [n1] starting https server at 127.0.0.1:40989 I170130 11:00:14.331454 3087265 server/server.go:630 [n1] starting grpc/postgres server at 127.0.0.1:35914 I170130 11:00:14.331508 3087265 server/server.go:631 [n1] advertising CockroachDB node at 127.0.0.1:35914 I170130 11:00:14.355226 3088612 sql/event_log.go:95 [n1] Event: "node_join", target: 1, info: {Descriptor:{NodeID:1 Address:{NetworkField:tcp AddressField:127.0.0.1:35914} Attrs: Locality:} ClusterID:05c0a6e5-392d-4cbe-ac01-3c068475f88e StartedAt:1485774014329395083} I170130 11:00:14.391460 3087265 sql/event_log.go:95 [n1] Event: "alter_table", target: 12, info: {TableName:eventlog Statement:ALTER TABLE system.eventlog ALTER COLUMN uniqueID SET DEFAULT uuid_v4() User:node MutationID:0 CascadeDroppedViews:[]} I170130 11:00:14.437169 3087265 server/server.go:686 [n1] done ensuring all necessary migrations have run I170130 11:00:14.437232 3087265 server/server.go:688 [n1] serving sql connections I170130 11:00:24.552383 3088741 vendor/google.golang.org/grpc/transport/http2_server.go:320 transport: http2Server.HandleStreams failed to read frame: read tcp 127.0.0.1:35914->127.0.0.1:41301: use of closed network connection I170130 11:00:24.552483 3089195 vendor/google.golang.org/grpc/transport/http2_client.go:1123 transport: http2Client.notifyError got notified that the client transport was broken EOF. test_server_shim.go:133: had 1 ranges at startup, expected 6 ```
test
github com cockroachdb cockroach pkg kv testmultirangeboundedbatchdelrangeoverlappingkeys failed under stress sha parameters cockroach proposer evaluated kv false tags deadlock goflags stress build found a failed test server status runtime go could not parse build timestamp parsing time as cannot parse as server config go storage engine initialized server node go store not bootstrapped storage replica go gossip not initialized server node go cluster has been created server node go add additional nodes by specifying join storage store go failed initial metrics computation system config not yet available server node go initialized store capacity available rangecount leasecount server node go node id initialized gossip gossip go nodedescriptor set to node id address attrs locality storage stores go read node addresses from persistent storage server node go connecting to gossip network to verify cluster id server node go node connected via gossip and verified as part of cluster server node go node started with engine s and attributes sql executor go creating distsqlplanner with address tcp server server go starting https server at server server go starting grpc postgres server at server server go advertising cockroachdb node at sql event log go event node join target info descriptor nodeid address networkfield tcp addressfield attrs locality clusterid startedat sql event log go event alter table target info tablename eventlog statement alter table system eventlog alter column uniqueid set default uuid user node mutationid cascadedroppedviews server server go done ensuring all necessary migrations have run server server go serving sql connections vendor google golang org grpc transport server go transport handlestreams failed to read frame read tcp use of closed network connection vendor google golang org grpc transport client go transport notifyerror got notified that the client transport was broken eof test server shim go had ranges at startup expected
1
151,350
12,033,490,315
IssuesEvent
2020-04-13 14:20:57
magento/magento2-phpstorm-plugin
https://api.github.com/repos/magento/magento2-phpstorm-plugin
opened
Reference navigation. Reference disabled plugin declaration -> plugin declaration
Award: special achievement Award: test coverage reference navigation
### Preconditions (*) * 2 modules in the system: Module_Foo and Module_Bar * Module_Foo di.xml declares a plugin ``` <type name="SomeTarget"> <plugin name="plugin_id" type="SomePlugin" /> </type> ``` * Module_Bar di.xml disables a plugin ``` <type name="SomeTarget"> <plugin name="plugin_id" disabled="true" /> </type> ``` ### Acceptance criteria * Reference navigation element is added to attribute name of the plugin tag with attribute `disabled="true"` to easily navigate form disabling declaration to original declaration, in this case, from `Module_Foo/etc/di.xml` to `Module_Bar/ets/di.xml`. * The functionality is covered by tests
1.0
Reference navigation. Reference disabled plugin declaration -> plugin declaration - ### Preconditions (*) * 2 modules in the system: Module_Foo and Module_Bar * Module_Foo di.xml declares a plugin ``` <type name="SomeTarget"> <plugin name="plugin_id" type="SomePlugin" /> </type> ``` * Module_Bar di.xml disables a plugin ``` <type name="SomeTarget"> <plugin name="plugin_id" disabled="true" /> </type> ``` ### Acceptance criteria * Reference navigation element is added to attribute name of the plugin tag with attribute `disabled="true"` to easily navigate form disabling declaration to original declaration, in this case, from `Module_Foo/etc/di.xml` to `Module_Bar/ets/di.xml`. * The functionality is covered by tests
test
reference navigation reference disabled plugin declaration plugin declaration preconditions modules in the system module foo and module bar module foo di xml declares a plugin module bar di xml disables a plugin acceptance criteria reference navigation element is added to attribute name of the plugin tag with attribute disabled true to easily navigate form disabling declaration to original declaration in this case from module foo etc di xml to module bar ets di xml the functionality is covered by tests
1
556,918
16,494,980,191
IssuesEvent
2021-05-25 09:21:37
SmashMC-Development/Bugs-and-Issues
https://api.github.com/repos/SmashMC-Development/Bugs-and-Issues
closed
Survival /buy command
custom dev high priority survival
**Describe the Bug** Clicking on the items in the GUI for /buy doesn't do anything **To Reproduce** Steps to reproduce the behavior: 1. Go to survival 2. Type /buy 3. Try clicking on the items in the GUI **Servers with the Bug** Survival **Expected behavior** Clicking on an item in the GUI of /buy should give you a link to the corresponding store page **Screenshots** N/A **Additional context** N/A
1.0
Survival /buy command - **Describe the Bug** Clicking on the items in the GUI for /buy doesn't do anything **To Reproduce** Steps to reproduce the behavior: 1. Go to survival 2. Type /buy 3. Try clicking on the items in the GUI **Servers with the Bug** Survival **Expected behavior** Clicking on an item in the GUI of /buy should give you a link to the corresponding store page **Screenshots** N/A **Additional context** N/A
non_test
survival buy command describe the bug clicking on the items in the gui for buy doesn t do anything to reproduce steps to reproduce the behavior go to survival type buy try clicking on the items in the gui servers with the bug survival expected behavior clicking on an item in the gui of buy should give you a link to the corresponding store page screenshots n a additional context n a
0
320,136
27,420,279,357
IssuesEvent
2023-03-01 16:16:05
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
opened
Fix container.test_container_assert_contains
Sub Task Failing Test
| | | |---|---| |tensorflow|<a href="https://github.com/unifyai/ivy/actions/" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="null" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4133150123/jobs/7142729843" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |jax|<a href="https://github.com/unifyai/ivy/actions/runs/4133150123/jobs/7142742966" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
1.0
Fix container.test_container_assert_contains - | | | |---|---| |tensorflow|<a href="https://github.com/unifyai/ivy/actions/" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="null" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4133150123/jobs/7142729843" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |jax|<a href="https://github.com/unifyai/ivy/actions/runs/4133150123/jobs/7142742966" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
test
fix container test container assert contains tensorflow img src torch img src numpy img src jax img src
1
518,971
15,038,223,146
IssuesEvent
2021-02-02 17:12:14
zeebe-io/zeebe
https://api.github.com/repos/zeebe-io/zeebe
closed
Document Prometheus metrics
Priority: Low Type: Docs Type: Maintenance
**Description** In the last months we added a lot of new metrics, which are currently not yet documented. In order to get better understanding and user adoption we should document them.
1.0
Document Prometheus metrics - **Description** In the last months we added a lot of new metrics, which are currently not yet documented. In order to get better understanding and user adoption we should document them.
non_test
document prometheus metrics description in the last months we added a lot of new metrics which are currently not yet documented in order to get better understanding and user adoption we should document them
0
55,584
14,010,029,999
IssuesEvent
2020-10-29 03:57:44
uniquelyparticular/zendesk-magento-m1-request
https://api.github.com/repos/uniquelyparticular/zendesk-magento-m1-request
opened
CVE-2020-7751 (Medium) detected in pathval-1.1.0.tgz
security vulnerability
## CVE-2020-7751 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>pathval-1.1.0.tgz</b></p></summary> <p>Object value retrieval given a string path</p> <p>Library home page: <a href="https://registry.npmjs.org/pathval/-/pathval-1.1.0.tgz">https://registry.npmjs.org/pathval/-/pathval-1.1.0.tgz</a></p> <p>Path to dependency file: zendesk-magento-m1-request/package.json</p> <p>Path to vulnerable library: zendesk-magento-m1-request/node_modules/pathval/package.json</p> <p> Dependency Hierarchy: - chai-4.2.0.tgz (Root Library) - :x: **pathval-1.1.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/uniquelyparticular/zendesk-magento-m1-request/commit/acd61ff3a2c1b077260bcd02ee53c9a913f6889c">acd61ff3a2c1b077260bcd02ee53c9a913f6889c</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects all versions of package pathval. <p>Publish Date: 2020-10-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7751>CVE-2020-7751</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.0</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-7751 (Medium) detected in pathval-1.1.0.tgz - ## CVE-2020-7751 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>pathval-1.1.0.tgz</b></p></summary> <p>Object value retrieval given a string path</p> <p>Library home page: <a href="https://registry.npmjs.org/pathval/-/pathval-1.1.0.tgz">https://registry.npmjs.org/pathval/-/pathval-1.1.0.tgz</a></p> <p>Path to dependency file: zendesk-magento-m1-request/package.json</p> <p>Path to vulnerable library: zendesk-magento-m1-request/node_modules/pathval/package.json</p> <p> Dependency Hierarchy: - chai-4.2.0.tgz (Root Library) - :x: **pathval-1.1.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/uniquelyparticular/zendesk-magento-m1-request/commit/acd61ff3a2c1b077260bcd02ee53c9a913f6889c">acd61ff3a2c1b077260bcd02ee53c9a913f6889c</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects all versions of package pathval. <p>Publish Date: 2020-10-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7751>CVE-2020-7751</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.0</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve medium detected in pathval tgz cve medium severity vulnerability vulnerable library pathval tgz object value retrieval given a string path library home page a href path to dependency file zendesk magento request package json path to vulnerable library zendesk magento request node modules pathval package json dependency hierarchy chai tgz root library x pathval tgz vulnerable library found in head commit a href vulnerability details this affects all versions of package pathval publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required high user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact high for more information on scores click a href step up your open source security game with whitesource
0
192,646
14,623,602,585
IssuesEvent
2020-12-23 03:52:04
github-vet/rangeloop-pointer-findings
https://api.github.com/repos/github-vet/rangeloop-pointer-findings
closed
liqingqiya/readcode-etcd-v3.4.10: src/go.etcd.io/etcd/mvcc/kv_test.go; 23 LoC
fresh small test
Found a possible issue in [liqingqiya/readcode-etcd-v3.4.10](https://www.github.com/liqingqiya/readcode-etcd-v3.4.10) at [src/go.etcd.io/etcd/mvcc/kv_test.go](https://github.com/liqingqiya/readcode-etcd-v3.4.10/blob/2397d5699e11c4eb902e621a2cde587dcb52a385/src/go.etcd.io/etcd/mvcc/kv_test.go#L422-L444) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > range-loop variable tt used in defer or goroutine at line 428 [Click here to see the code in its original context.](https://github.com/liqingqiya/readcode-etcd-v3.4.10/blob/2397d5699e11c4eb902e621a2cde587dcb52a385/src/go.etcd.io/etcd/mvcc/kv_test.go#L422-L444) <details> <summary>Click here to show the 23 line(s) of Go which triggered the analyzer.</summary> ```go for i, tt := range tests { // 创建出一个写事务 txn := s.Write(traceutil.TODO()) done := make(chan struct{}, 1) go func() { // 执行 Put,Delete 操作会被提前拉起的事务阻塞到(被 batchTx 的锁阻塞) tt() done <- struct{}{} }() select { case <-done: t.Fatalf("#%d: operation failed to be blocked", i) case <-time.After(10 * time.Millisecond): } // txn.End 里面才会释放锁,其他更新事务才能继续 txn.End() select { case <-done: case <-time.After(10 * time.Second): testutil.FatalStack(t, fmt.Sprintf("#%d: operation failed to be unblocked", i)) } } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 2397d5699e11c4eb902e621a2cde587dcb52a385
1.0
liqingqiya/readcode-etcd-v3.4.10: src/go.etcd.io/etcd/mvcc/kv_test.go; 23 LoC - Found a possible issue in [liqingqiya/readcode-etcd-v3.4.10](https://www.github.com/liqingqiya/readcode-etcd-v3.4.10) at [src/go.etcd.io/etcd/mvcc/kv_test.go](https://github.com/liqingqiya/readcode-etcd-v3.4.10/blob/2397d5699e11c4eb902e621a2cde587dcb52a385/src/go.etcd.io/etcd/mvcc/kv_test.go#L422-L444) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > range-loop variable tt used in defer or goroutine at line 428 [Click here to see the code in its original context.](https://github.com/liqingqiya/readcode-etcd-v3.4.10/blob/2397d5699e11c4eb902e621a2cde587dcb52a385/src/go.etcd.io/etcd/mvcc/kv_test.go#L422-L444) <details> <summary>Click here to show the 23 line(s) of Go which triggered the analyzer.</summary> ```go for i, tt := range tests { // 创建出一个写事务 txn := s.Write(traceutil.TODO()) done := make(chan struct{}, 1) go func() { // 执行 Put,Delete 操作会被提前拉起的事务阻塞到(被 batchTx 的锁阻塞) tt() done <- struct{}{} }() select { case <-done: t.Fatalf("#%d: operation failed to be blocked", i) case <-time.After(10 * time.Millisecond): } // txn.End 里面才会释放锁,其他更新事务才能继续 txn.End() select { case <-done: case <-time.After(10 * time.Second): testutil.FatalStack(t, fmt.Sprintf("#%d: operation failed to be unblocked", i)) } } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 2397d5699e11c4eb902e621a2cde587dcb52a385
test
liqingqiya readcode etcd src go etcd io etcd mvcc kv test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message range loop variable tt used in defer or goroutine at line click here to show the line s of go which triggered the analyzer go for i tt range tests 创建出一个写事务 txn s write traceutil todo done make chan struct go func 执行 put,delete 操作会被提前拉起的事务阻塞到(被 batchtx 的锁阻塞) tt done struct select case done t fatalf d operation failed to be blocked i case time after time millisecond txn end 里面才会释放锁,其他更新事务才能继续 txn end select case done case time after time second testutil fatalstack t fmt sprintf d operation failed to be unblocked i leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
1
146,890
11,760,848,474
IssuesEvent
2020-03-13 20:31:44
dask/distributed
https://api.github.com/repos/dask/distributed
opened
test_dont_steal_unknown_functions failure
flaky test
xref https://travis-ci.org/github/dask/distributed/jobs/662107052 <details> <summary>Full traceback:</summary> ```python ______________________ test_dont_steal_unknown_functions _______________________ def test_func(): result = None workers = [] with clean(timeout=active_rpc_timeout, **clean_kwargs) as loop: async def coro(): with dask.config.set(config): s = False for i in range(5): try: s, ws = await start_cluster( nthreads, scheduler, loop, security=security, Worker=Worker, scheduler_kwargs=scheduler_kwargs, worker_kwargs=worker_kwargs, ) except Exception as e: logger.error( "Failed to start gen_cluster, retrying", exc_info=True, ) else: workers[:] = ws args = [s] + workers break if s is False: raise Exception("Could not start cluster") if client: c = await Client( s.address, loop=loop, security=security, asynchronous=True, **client_kwargs ) args = [c] + args try: future = func(*args) if timeout: future = asyncio.wait_for(future, timeout) result = await future if s.validate: s.validate_state() finally: if client and c.status not in ("closing", "closed"): await c._close(fast=s.status == "closed") await end_cluster(s, workers) await asyncio.wait_for(cleanup_global_workers(), 1) try: c = await default_client() except ValueError: pass else: await c._close(fast=True) for i in range(5): if all(c.closed() for c in Comm._instances): break else: await asyncio.sleep(0.05) else: L = [c for c in Comm._instances if not c.closed()] Comm._instances.clear() # raise ValueError("Unclosed Comms", L) print("Unclosed Comms", L) return result result = loop.run_sync( > coro, timeout=timeout * 2 if timeout else timeout ) distributed/utils_test.py:957: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../miniconda/envs/test-environment/lib/python3.6/site-packages/tornado/ioloop.py:576: in run_sync return future_cell[0].result() distributed/utils_test.py:927: in coro result = await future 
../../../miniconda/envs/test-environment/lib/python3.6/asyncio/tasks.py:358: in wait_for return fut.result() ../../../miniconda/envs/test-environment/lib/python3.6/site-packages/tornado/gen.py:1147: in run yielded = self.gen.send(value) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ c = <Client: not connected> s = <Scheduler: "tcp://127.0.0.1:43315" processes: 0 cores: 0> a = <Worker: 'tcp://127.0.0.1:35456', 0, closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0> b = <Worker: 'tcp://127.0.0.1:42222', 1, closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0> @gen_cluster(client=True, nthreads=[("127.0.0.1", 1)] * 2) def test_dont_steal_unknown_functions(c, s, a, b): futures = c.map(inc, [1, 2], workers=a.address, allow_other_workers=True) yield wait(futures) > assert len(a.data) == 2, [len(a.data), len(b.data)] E AssertionError: [1, 1] E assert 1 == 2 E + where 1 = len(Buffer<<LRU: 28/5023005081 on dict>, <Func: serialize_bytelist<->deserialize_bytes <File: /home/travis/build/dask/distributed/dask-worker-space/worker-hbrk40rs/storage, mode="a", 0 elements>>>) E + where Buffer<<LRU: 28/5023005081 on dict>, <Func: serialize_bytelist<->deserialize_bytes <File: /home/travis/build/dask/distributed/dask-worker-space/worker-hbrk40rs/storage, mode="a", 0 elements>>> = <Worker: 'tcp://127.0.0.1:35456', 0, running, stored: 1, running: 0/1, ready: 0, comm: 0, waiting: 0>.data distributed/tests/test_steal.py:116: AssertionError ``` </details>
1.0
test_dont_steal_unknown_functions failure - xref https://travis-ci.org/github/dask/distributed/jobs/662107052 <details> <summary>Full traceback:</summary> ```python ______________________ test_dont_steal_unknown_functions _______________________ def test_func(): result = None workers = [] with clean(timeout=active_rpc_timeout, **clean_kwargs) as loop: async def coro(): with dask.config.set(config): s = False for i in range(5): try: s, ws = await start_cluster( nthreads, scheduler, loop, security=security, Worker=Worker, scheduler_kwargs=scheduler_kwargs, worker_kwargs=worker_kwargs, ) except Exception as e: logger.error( "Failed to start gen_cluster, retrying", exc_info=True, ) else: workers[:] = ws args = [s] + workers break if s is False: raise Exception("Could not start cluster") if client: c = await Client( s.address, loop=loop, security=security, asynchronous=True, **client_kwargs ) args = [c] + args try: future = func(*args) if timeout: future = asyncio.wait_for(future, timeout) result = await future if s.validate: s.validate_state() finally: if client and c.status not in ("closing", "closed"): await c._close(fast=s.status == "closed") await end_cluster(s, workers) await asyncio.wait_for(cleanup_global_workers(), 1) try: c = await default_client() except ValueError: pass else: await c._close(fast=True) for i in range(5): if all(c.closed() for c in Comm._instances): break else: await asyncio.sleep(0.05) else: L = [c for c in Comm._instances if not c.closed()] Comm._instances.clear() # raise ValueError("Unclosed Comms", L) print("Unclosed Comms", L) return result result = loop.run_sync( > coro, timeout=timeout * 2 if timeout else timeout ) distributed/utils_test.py:957: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../../miniconda/envs/test-environment/lib/python3.6/site-packages/tornado/ioloop.py:576: in run_sync return future_cell[0].result() distributed/utils_test.py:927: in coro result = await future 
../../../miniconda/envs/test-environment/lib/python3.6/asyncio/tasks.py:358: in wait_for return fut.result() ../../../miniconda/envs/test-environment/lib/python3.6/site-packages/tornado/gen.py:1147: in run yielded = self.gen.send(value) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ c = <Client: not connected> s = <Scheduler: "tcp://127.0.0.1:43315" processes: 0 cores: 0> a = <Worker: 'tcp://127.0.0.1:35456', 0, closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0> b = <Worker: 'tcp://127.0.0.1:42222', 1, closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0> @gen_cluster(client=True, nthreads=[("127.0.0.1", 1)] * 2) def test_dont_steal_unknown_functions(c, s, a, b): futures = c.map(inc, [1, 2], workers=a.address, allow_other_workers=True) yield wait(futures) > assert len(a.data) == 2, [len(a.data), len(b.data)] E AssertionError: [1, 1] E assert 1 == 2 E + where 1 = len(Buffer<<LRU: 28/5023005081 on dict>, <Func: serialize_bytelist<->deserialize_bytes <File: /home/travis/build/dask/distributed/dask-worker-space/worker-hbrk40rs/storage, mode="a", 0 elements>>>) E + where Buffer<<LRU: 28/5023005081 on dict>, <Func: serialize_bytelist<->deserialize_bytes <File: /home/travis/build/dask/distributed/dask-worker-space/worker-hbrk40rs/storage, mode="a", 0 elements>>> = <Worker: 'tcp://127.0.0.1:35456', 0, running, stored: 1, running: 0/1, ready: 0, comm: 0, waiting: 0>.data distributed/tests/test_steal.py:116: AssertionError ``` </details>
test
test dont steal unknown functions failure xref full traceback python test dont steal unknown functions def test func result none workers with clean timeout active rpc timeout clean kwargs as loop async def coro with dask config set config s false for i in range try s ws await start cluster nthreads scheduler loop security security worker worker scheduler kwargs scheduler kwargs worker kwargs worker kwargs except exception as e logger error failed to start gen cluster retrying exc info true else workers ws args workers break if s is false raise exception could not start cluster if client c await client s address loop loop security security asynchronous true client kwargs args args try future func args if timeout future asyncio wait for future timeout result await future if s validate s validate state finally if client and c status not in closing closed await c close fast s status closed await end cluster s workers await asyncio wait for cleanup global workers try c await default client except valueerror pass else await c close fast true for i in range if all c closed for c in comm instances break else await asyncio sleep else l comm instances clear raise valueerror unclosed comms l print unclosed comms l return result result loop run sync coro timeout timeout if timeout else timeout distributed utils test py miniconda envs test environment lib site packages tornado ioloop py in run sync return future cell result distributed utils test py in coro result await future miniconda envs test environment lib asyncio tasks py in wait for return fut result miniconda envs test environment lib site packages tornado gen py in run yielded self gen send value c s a b gen cluster client true nthreads def test dont steal unknown functions c s a b futures c map inc workers a address allow other workers true yield wait futures assert len a data e assertionerror e assert e where len buffer deserialize bytes e where buffer deserialize bytes data distributed tests test steal py 
assertionerror
1
67,184
7,041,794,671
IssuesEvent
2017-12-30 00:52:57
saltstack/salt
https://api.github.com/repos/saltstack/salt
closed
Change bower tests to not depend on external resource
Bug Help Wanted Needs Testcase P3 Platform stale Tests
This test needs to be rewritten so that it does not depend on an external resource: https://jenkins.saltstack.com/job/salt-linode-ubuntu14.04/2602/testReport/junit/integration.states.bower/BowerStateTest/test_bower_installed_from_file/
2.0
Change bower tests to not depend on external resource - This test needs to be rewritten so that it does not depend on an external resource: https://jenkins.saltstack.com/job/salt-linode-ubuntu14.04/2602/testReport/junit/integration.states.bower/BowerStateTest/test_bower_installed_from_file/
test
change bower tests to not depend on external resource this test needs to be rewritten so that it does not depend on an external resource
1
779,369
27,350,675,564
IssuesEvent
2023-02-27 09:20:57
infinitymalle/Project-D0020E
https://api.github.com/repos/infinitymalle/Project-D0020E
closed
Application test case 2: Multiple agents
Medium Priority
Description: Principal sends transferable PoA to agent1, agent1 sends PoA to agent2. Agent 2 validates the PoA Estimated time: 2 h
1.0
Application test case 2: Multiple agents - Description: Principal sends transferable PoA to agent1, agent1 sends PoA to agent2. Agent 2 validates the PoA Estimated time: 2 h
non_test
application test case multiple agents description principal sends transferable poa to sends poa to agent validates the poa estimated time h
0
14,074
2,789,885,649
IssuesEvent
2015-05-08 22:10:15
google/google-visualization-api-issues
https://api.github.com/repos/google/google-visualization-api-issues
closed
Memory leak when using gauges
Priority-Medium Type-Defect
Original [issue 425](https://code.google.com/p/google-visualization-api-issues/issues/detail?id=425) created by orwant on 2010-10-08T09:38:31.000Z: <b>What steps will reproduce the problem? Please provide a link to a</b> <b>demonstration page if at all possible, or attach code.</b> Visiting the below page shows that a memory leak exists in the gauges when they are redrawn. http://code.google.com/apis/visualization/documentation/gallery/gauge.html#Methods <b>What component is this issue related to (PieChart, LineChart, DataTable,</b> <b>Query, etc)?</b> Gauges <b>Are you using the test environment (version 1.1)?</b> <b>(If you are not sure, answer NO)</b> NO <b>What operating system and browser are you using?</b> Windows 7, IE v8.0.7600.16385 &amp; Chrome 6.0.472.63 <b>*********************************************************</b> <b>For developers viewing this issue: please click the 'star' icon to be</b> <b>notified of future changes, and to let us know how many of you are</b> <b>interested in seeing it resolved.</b> <b>*********************************************************</b>
1.0
Memory leak when using gauges - Original [issue 425](https://code.google.com/p/google-visualization-api-issues/issues/detail?id=425) created by orwant on 2010-10-08T09:38:31.000Z: <b>What steps will reproduce the problem? Please provide a link to a</b> <b>demonstration page if at all possible, or attach code.</b> Visiting the below page shows that a memory leak exists in the gauges when they are redrawn. http://code.google.com/apis/visualization/documentation/gallery/gauge.html#Methods <b>What component is this issue related to (PieChart, LineChart, DataTable,</b> <b>Query, etc)?</b> Gauges <b>Are you using the test environment (version 1.1)?</b> <b>(If you are not sure, answer NO)</b> NO <b>What operating system and browser are you using?</b> Windows 7, IE v8.0.7600.16385 &amp; Chrome 6.0.472.63 <b>*********************************************************</b> <b>For developers viewing this issue: please click the 'star' icon to be</b> <b>notified of future changes, and to let us know how many of you are</b> <b>interested in seeing it resolved.</b> <b>*********************************************************</b>
non_test
memory leak when using gauges original created by orwant on what steps will reproduce the problem please provide a link to a demonstration page if at all possible or attach code visiting the below page shows that a memory leak exists in the gauges when they are redrawn what component is this issue related to piechart linechart datatable query etc gauges are you using the test environment version if you are not sure answer no no what operating system and browser are you using windows ie amp chrome for developers viewing this issue please click the star icon to be notified of future changes and to let us know how many of you are interested in seeing it resolved
0
221,977
17,379,989,297
IssuesEvent
2021-07-31 13:58:08
ColoredCow/portal
https://api.github.com/repos/ColoredCow/portal
closed
Issue when creating a new user.
priority : high status : ready to test
When a new user is created, a new employee entry is also created. The following error is showing up. ![image](https://user-images.githubusercontent.com/12053186/46579090-cce71900-ca28-11e8-9101-1582d4720d24.png)
1.0
Issue when creating a new user. - When a new user is created, a new employee entry is also created. The following error is showing up. ![image](https://user-images.githubusercontent.com/12053186/46579090-cce71900-ca28-11e8-9101-1582d4720d24.png)
test
issue when creating a new user when a new user is created a new employee entry is also created the following error is showing up
1
103,452
11,356,902,813
IssuesEvent
2020-01-25 00:40:49
f5devcentral/f5-cloud-failover-extension
https://api.github.com/repos/f5devcentral/f5-cloud-failover-extension
closed
IAM roles not clear
documentation
I'm working with a customer and we're trying to understand the documentation on this. Where it says this: In GCP, go to IAM > Roles and create the member with the following scopes: compute-rw storage-rw cloud-platform Are those roles supposed to be an equivalent to something Google uses here? If so, what roles are those supposed to be equal to? If not, and we're supposed to create them, what roles do we assign?
1.0
IAM roles not clear - I'm working with a customer and we're trying to understand the documentation on this. Where it says this: In GCP, go to IAM > Roles and create the member with the following scopes: compute-rw storage-rw cloud-platform Are those roles supposed to be an equivalent to something Google uses here? If so, what roles are those supposed to be equal to? If not, and we're supposed to create them, what roles do we assign?
non_test
iam roles not clear i m working with a customer and we re trying to understand the documentation on this where it says this in gcp go to iam roles and create the member with the following scopes compute rw storage rw cloud platform are those roles supposed to be an equivalent to something google uses here if so what roles are those supposed to be equal to if not and we re supposed to create them what roles do we assign
0
14,544
3,410,191,004
IssuesEvent
2015-12-04 19:01:54
rancher/rancher
https://api.github.com/repos/rancher/rancher
closed
Metadata does not have information about exposed ports for internal LB
area/metadata kind/bug status/to-test
Server version - v0.47.0 Steps to reproduce the problem: Create a internal LB service. Metadata for this service , does not include any information about exposed ports for internal LB . ports field is empty. We Should have another metadata field for expose ports.
1.0
Metadata does not have information about exposed ports for internal LB - Server version - v0.47.0 Steps to reproduce the problem: Create a internal LB service. Metadata for this service , does not include any information about exposed ports for internal LB . ports field is empty. We Should have another metadata field for expose ports.
test
metadata does not have information about exposed ports for internal lb server version steps to reproduce the problem create a internal lb service metadata for this service does not include any information about exposed ports for internal lb ports field is empty we should have another metadata field for expose ports
1
813,926
30,479,617,564
IssuesEvent
2023-07-17 19:14:13
IBMa/equal-access
https://api.github.com/repos/IBMa/equal-access
closed
[BUG]: Low scans due to an error in invocation, scan button keep spinning.
Bug priority-1 (high) not reproducible
### Project a11y checker extension ### Browser Chrome ### Operating system MacOS ### Description Tested with the lasted master action. When conduction a scan on https://www.ibm.com/able/toolkit/develop/overview The checker gets the following error and scan button keeps spinning. `background.js:2 {type: 'BG_requestScan', dest: {…}, content: 1357278005}content: 1357278005dest: tabId: -1type: "background"[[Prototype]]: Objectconstructor: ƒ Object()hasOwnProperty: ƒ hasOwnProperty()isPrototypeOf: ƒ isPrototypeOf()propertyIsEnumerable: ƒ propertyIsEnumerable()toLocaleString: ƒ toLocaleString()toString: ƒ toString()valueOf: ƒ valueOf()__defineGetter__: ƒ __defineGetter__()__defineSetter__: ƒ __defineSetter__()__lookupGetter__: ƒ __lookupGetter__()__lookupSetter__: ƒ __lookupSetter__()__proto__: (...)get __proto__: ƒ __proto__()set __proto__: ƒ __proto__()type: "BG_requestScan"[[Prototype]]: Object TypeError: Error in invocation of scripting.executeScript(scripting.ScriptInjection injection, optional function callback): Error at parameter 'injection': Error at property 'target': Missing required property 'tabId'. at g (background.js:2:2331956) at background.js:2:2332684 at new Promise (<anonymous>) at background.js:2:2332663 at Generator.next (<anonymous>) at background.js:2:1157876 at new Promise (<anonymous>) at e (background.js:2:1157621) at y (background.js:2:2332616) at D.<anonymous> (background.js:2:2324430)` ![Screen Shot 2023-06-29 at 9 15 29 PM](https://github.com/IBMa/equal-access/assets/62436670/4b93ced2-4985-45e0-95fb-6e6190434bc7) ![Screen Shot 2023-06-29 at 9 13 00 PM](https://github.com/IBMa/equal-access/assets/62436670/01fcb440-7bf5-4ed2-ad8f-a342cdde7a9a) ### Steps to reproduce Load the lates from master (Action). Scan https://www.ibm.com/able/toolkit/develop/overview Open dev tools for checker.
1.0
[BUG]: Low scans due to an error in invocation, scan button keep spinning. - ### Project a11y checker extension ### Browser Chrome ### Operating system MacOS ### Description Tested with the lasted master action. When conduction a scan on https://www.ibm.com/able/toolkit/develop/overview The checker gets the following error and scan button keeps spinning. `background.js:2 {type: 'BG_requestScan', dest: {…}, content: 1357278005}content: 1357278005dest: tabId: -1type: "background"[[Prototype]]: Objectconstructor: ƒ Object()hasOwnProperty: ƒ hasOwnProperty()isPrototypeOf: ƒ isPrototypeOf()propertyIsEnumerable: ƒ propertyIsEnumerable()toLocaleString: ƒ toLocaleString()toString: ƒ toString()valueOf: ƒ valueOf()__defineGetter__: ƒ __defineGetter__()__defineSetter__: ƒ __defineSetter__()__lookupGetter__: ƒ __lookupGetter__()__lookupSetter__: ƒ __lookupSetter__()__proto__: (...)get __proto__: ƒ __proto__()set __proto__: ƒ __proto__()type: "BG_requestScan"[[Prototype]]: Object TypeError: Error in invocation of scripting.executeScript(scripting.ScriptInjection injection, optional function callback): Error at parameter 'injection': Error at property 'target': Missing required property 'tabId'. at g (background.js:2:2331956) at background.js:2:2332684 at new Promise (<anonymous>) at background.js:2:2332663 at Generator.next (<anonymous>) at background.js:2:1157876 at new Promise (<anonymous>) at e (background.js:2:1157621) at y (background.js:2:2332616) at D.<anonymous> (background.js:2:2324430)` ![Screen Shot 2023-06-29 at 9 15 29 PM](https://github.com/IBMa/equal-access/assets/62436670/4b93ced2-4985-45e0-95fb-6e6190434bc7) ![Screen Shot 2023-06-29 at 9 13 00 PM](https://github.com/IBMa/equal-access/assets/62436670/01fcb440-7bf5-4ed2-ad8f-a342cdde7a9a) ### Steps to reproduce Load the lates from master (Action). Scan https://www.ibm.com/able/toolkit/develop/overview Open dev tools for checker.
non_test
low scans due to an error in invocation scan button keep spinning project checker extension browser chrome operating system macos description tested with the lasted master action when conduction a scan on the checker gets the following error and scan button keeps spinning background js type bg requestscan dest … content content tabid background objectconstructor ƒ object hasownproperty ƒ hasownproperty isprototypeof ƒ isprototypeof propertyisenumerable ƒ propertyisenumerable tolocalestring ƒ tolocalestring tostring ƒ tostring valueof ƒ valueof definegetter ƒ definegetter definesetter ƒ definesetter lookupgetter ƒ lookupgetter lookupsetter ƒ lookupsetter proto get proto ƒ proto set proto ƒ proto type bg requestscan object typeerror error in invocation of scripting executescript scripting scriptinjection injection optional function callback error at parameter injection error at property target missing required property tabid at g background js at background js at new promise at background js at generator next at background js at new promise at e background js at y background js at d background js steps to reproduce load the lates from master action scan open dev tools for checker
0
689,137
23,609,275,315
IssuesEvent
2022-08-24 10:57:49
magento/magento2
https://api.github.com/repos/magento/magento2
closed
GraphQL mutation uptdateCartItems return data is incorrect when removing multiple items
Issue: Format is valid Partner: Kensium Solutions LLC Progress: PR in progress Priority: P2 Project: GraphQL PAP
<!--- Please review our guidelines before adding a new issue: https://github.com/magento/magento2/wiki/Issue-reporting-guidelines Fields marked with (*) are required. Please don't remove the template. --> ### Preconditions (*) 1. Magento version: 2.4.0 ### Steps to reproduce (*) <!--- Important: Provide a set of clear steps to reproduce this bug. We can not provide support without clear instructions on how to reproduce. --> 1. Add a set of items to cart (minimum 2) 2. Using the graphQL mutation `updateCartItems` set multiple or all of the item quantities to 0 query ``` mutation { updateCartItems( input: { cart_id: {{cartID}}", cart_items: [ { cart_item_id: 1 quantity: 0 }, { cart_item_id: 2 quantity: 0 }, ] } ) { cart { items { id product { name } quantity } prices { grand_total{ value currency } } } } } ``` ### Expected result (*) 1. Expected cart data where all items that were set to a quantity of 0 to be removed ### Actual result (*) <!--- Tell us what happened instead. Include error messages and issues. --> 1. The cart data that is returned will remove 1 item. The rest of the items will still be in the cart. 2. Even though All items were set to 0. The cart returned with all items minus 1. 3. The Items are still removed from cart, but an extra query to get the cart was required to get the correct cart data. --- Please provide [Severity](https://devdocs.magento.com/guides/v2.3/contributor-guide/contributing.html#backlog) assessment for the Issue as Reporter. This information will help during Confirmation and Issue triage processes. 
- [ ] Severity: **S0** _- Affects critical data or functionality and leaves users without workaround._ - [X] Severity: **S1** _- Affects critical data or functionality and forces users to employ a workaround._ - [ ] Severity: **S2** _- Affects non-critical data or functionality and forces users to employ a workaround._ - [ ] Severity: **S3** _- Affects non-critical data or functionality and does not force users to employ a workaround._ - [ ] Severity: **S4** _- Affects aesthetics, professional look and feel, “quality” or “usability”._
1.0
GraphQL mutation uptdateCartItems return data is incorrect when removing multiple items - <!--- Please review our guidelines before adding a new issue: https://github.com/magento/magento2/wiki/Issue-reporting-guidelines Fields marked with (*) are required. Please don't remove the template. --> ### Preconditions (*) 1. Magento version: 2.4.0 ### Steps to reproduce (*) <!--- Important: Provide a set of clear steps to reproduce this bug. We can not provide support without clear instructions on how to reproduce. --> 1. Add a set of items to cart (minimum 2) 2. Using the graphQL mutation `updateCartItems` set multiple or all of the item quantities to 0 query ``` mutation { updateCartItems( input: { cart_id: {{cartID}}", cart_items: [ { cart_item_id: 1 quantity: 0 }, { cart_item_id: 2 quantity: 0 }, ] } ) { cart { items { id product { name } quantity } prices { grand_total{ value currency } } } } } ``` ### Expected result (*) 1. Expected cart data where all items that were set to a quantity of 0 to be removed ### Actual result (*) <!--- Tell us what happened instead. Include error messages and issues. --> 1. The cart data that is returned will remove 1 item. The rest of the items will still be in the cart. 2. Even though All items were set to 0. The cart returned with all items minus 1. 3. The Items are still removed from cart, but an extra query to get the cart was required to get the correct cart data. --- Please provide [Severity](https://devdocs.magento.com/guides/v2.3/contributor-guide/contributing.html#backlog) assessment for the Issue as Reporter. This information will help during Confirmation and Issue triage processes. 
- [ ] Severity: **S0** _- Affects critical data or functionality and leaves users without workaround._ - [X] Severity: **S1** _- Affects critical data or functionality and forces users to employ a workaround._ - [ ] Severity: **S2** _- Affects non-critical data or functionality and forces users to employ a workaround._ - [ ] Severity: **S3** _- Affects non-critical data or functionality and does not force users to employ a workaround._ - [ ] Severity: **S4** _- Affects aesthetics, professional look and feel, “quality” or “usability”._
non_test
graphql mutation uptdatecartitems return data is incorrect when removing multiple items please review our guidelines before adding a new issue fields marked with are required please don t remove the template preconditions magento version steps to reproduce important provide a set of clear steps to reproduce this bug we can not provide support without clear instructions on how to reproduce add a set of items to cart minimum using the graphql mutation updatecartitems set multiple or all of the item quantities to query mutation updatecartitems input cart id cartid cart items cart item id quantity cart item id quantity cart items id product name quantity prices grand total value currency expected result expected cart data where all items that were set to a quantity of to be removed actual result the cart data that is returned will remove item the rest of the items will still be in the cart even though all items were set to the cart returned with all items minus the items are still removed from cart but an extra query to get the cart was required to get the correct cart data please provide assessment for the issue as reporter this information will help during confirmation and issue triage processes severity affects critical data or functionality and leaves users without workaround severity affects critical data or functionality and forces users to employ a workaround severity affects non critical data or functionality and forces users to employ a workaround severity affects non critical data or functionality and does not force users to employ a workaround severity affects aesthetics professional look and feel “quality” or “usability”
0
723,927
24,912,742,104
IssuesEvent
2022-10-30 02:56:35
AY2223S1-CS2113-T17-1/tp
https://api.github.com/repos/AY2223S1-CS2113-T17-1/tp
opened
Update DG for Non-Functional Requirements & Glossary & Instructions for manual testing
priority.High
Non-Functional Requirements {Give non-functional requirements} Glossary glossary item - Definition Instructions for manual testing {Give instructions on how to do a manual product testing e.g., how to load sample data to be used for testing} This one just copy paste the command summary in UG to here
1.0
Update DG for Non-Functional Requirements & Glossary & Instructions for manual testing - Non-Functional Requirements {Give non-functional requirements} Glossary glossary item - Definition Instructions for manual testing {Give instructions on how to do a manual product testing e.g., how to load sample data to be used for testing} This one just copy paste the command summary in UG to here
non_test
update dg for non functional requirements glossary instructions for manual testing non functional requirements give non functional requirements glossary glossary item definition instructions for manual testing give instructions on how to do a manual product testing e g how to load sample data to be used for testing this one just copy paste the command summary in ug to here
0
5,587
3,251,436,739
IssuesEvent
2015-10-19 09:47:42
VcDevel/Vc
https://api.github.com/repos/VcDevel/Vc
opened
Extract the unit test framework of Vc.
code cleanup enhancement
Vc has a unique unit test framework to ease testing several types with the same generic function with minimal boilerplate. The test framework is also used in another software project and may be of interest to other projects. The test framework itself is a grown hack and needs a good cleanup to become a standalone project.
1.0
Extract the unit test framework of Vc. - Vc has a unique unit test framework to ease testing several types with the same generic function with minimal boilerplate. The test framework is also used in another software project and may be of interest to other projects. The test framework itself is a grown hack and needs a good cleanup to become a standalone project.
non_test
extract the unit test framework of vc vc has a unique unit test framework to ease testing several types with the same generic function with minimal boilerplate the test framework is also used in another software project and may be of interest to other projects the test framework itself is a grown hack and needs a good cleanup to become a standalone project
0
280,201
24,283,770,231
IssuesEvent
2022-09-28 19:54:03
chshersh/iris
https://api.github.com/repos/chshersh/iris
closed
Add test for '--help' option parser
good first issue cli-options test hacktoberfest
It would be nice to test that the `--help` option by default outputs what we want. And it's also nice to see what changes when we change default parsers. So I propose to add a test that parses the `--help` option and compares it with the expected output of this option. The plan is: - [ ] Add new module `Test.Iris.Cli` where the new test will live - [ ] Extract `cmdParserInfo` from `Iris.Env` to `Iris.Cli` - [ ] Write a unit test for the `--help` option using [execParserPure](https://hackage.haskell.org/package/optparse-applicative-0.17.0.0/docs/Options-Applicative-Extra.html#v:execParserPure) from `optparse-applicative`
1.0
Add test for '--help' option parser - It would be nice to test that the `--help` option by default outputs what we want. And it's also nice to see what changes when we change default parsers. So I propose to add a test that parses the `--help` option and compares it with the expected output of this option. The plan is: - [ ] Add new module `Test.Iris.Cli` where the new test will live - [ ] Extract `cmdParserInfo` from `Iris.Env` to `Iris.Cli` - [ ] Write a unit test for the `--help` option using [execParserPure](https://hackage.haskell.org/package/optparse-applicative-0.17.0.0/docs/Options-Applicative-Extra.html#v:execParserPure) from `optparse-applicative`
test
add test for help option parser it would be nice to test that the help option by default outputs what we want and it s also nice to see what changes when we change default parsers so i propose to add a test that parses the help option and compares it with the expected output of this option the plan is add new module test iris cli where the new test will live extract cmdparserinfo from iris env to iris cli write a unit test for the help option using from optparse applicative
1
289,916
25,022,770,209
IssuesEvent
2022-11-04 03:34:28
openmsupply/conforma-server
https://api.github.com/repos/openmsupply/conforma-server
closed
Migration code not always working on server
Bug: development Bug: production Bugs during tests Customer: Fiji Customer: Angola
Okay, I know why this is happening now. (and that's what's causing the History panel crash even though it's technically "fixed") When you import a snapshot that is output by the same version as the server, the migration script doesn't run. This is the expected behaviour. However, due the to problem with some of the generated columns not being restored, we *need* the functions/triggers/etc script to run in order to DROP and recreate those columns. In development this wasn't a problem as the migration script would run every restart regardless, but on the live server it doesn't. The best way to deal with this is just to make sure the "functions/triggers" sql script runs after every snapshot reload regardless. I'll do this now.
1.0
Migration code not always working on server - Okay, I know why this is happening now. (and that's what's causing the History panel crash even though it's technically "fixed") When you import a snapshot that is output by the same version as the server, the migration script doesn't run. This is the expected behaviour. However, due the to problem with some of the generated columns not being restored, we *need* the functions/triggers/etc script to run in order to DROP and recreate those columns. In development this wasn't a problem as the migration script would run every restart regardless, but on the live server it doesn't. The best way to deal with this is just to make sure the "functions/triggers" sql script runs after every snapshot reload regardless. I'll do this now.
test
migration code not always working on server okay i know why this is happening now and that s what s causing the history panel crash even though it s technically fixed when you import a snapshot that is output by the same version as the server the migration script doesn t run this is the expected behaviour however due the to problem with some of the generated columns not being restored we need the functions triggers etc script to run in order to drop and recreate those columns in development this wasn t a problem as the migration script would run every restart regardless but on the live server it doesn t the best way to deal with this is just to make sure the functions triggers sql script runs after every snapshot reload regardless i ll do this now
1
53,978
11,170,600,992
IssuesEvent
2019-12-28 14:19:26
becurrie/titandash
https://api.github.com/repos/becurrie/titandash
opened
Bot Implementation Cleanup
code enhancement
Cleanup the bot and it's implementation, many things are condensed and hard to read. It would be better to be verbose but understandable. This only applies to the core bot files.
1.0
Bot Implementation Cleanup - Cleanup the bot and it's implementation, many things are condensed and hard to read. It would be better to be verbose but understandable. This only applies to the core bot files.
non_test
bot implementation cleanup cleanup the bot and it s implementation many things are condensed and hard to read it would be better to be verbose but understandable this only applies to the core bot files
0
210,541
7,190,740,983
IssuesEvent
2018-02-02 18:21:55
HabitRPG/habitica
https://api.github.com/repos/HabitRPG/habitica
closed
two-handed weapons/shields aren't easily identifiable as such
priority: medium section: Equipment status: issue: in progress type: medium level coding
Most two-handed weapons aren't easily identified as such because they don't have anything in their text to state that they are. We sometimes see bug reports from mobile app users when equipping a two-handed item unequips a shield. There are issues to give notifications about that when equipping them on the apps (https://github.com/HabitRPG/habitica-ios/issues/433 and https://github.com/HabitRPG/habitica-android/issues/688) but we should also include a message in each two-handed item's description on website and apps - i.e., the message should be in the data returned from the API's `content` route. For consistency, the message should be automatically added to the end of the item's description when the item has a true value for the `twoHanded` attribute (e.g., the rancherLasso below). I.e., the PR for this change will NOT change any of the `common/locales/en/` json files (with one exception listed below) but instead will modify the API's code to insert the message. The message should be something like "**Two-handed item.**" We won't use "Two-handed weapon" because future items might have more of a shield-like feel to them. The PR should change `weaponSpecialCandycaneNotes` in `common/locales/en/gear.json` to remove the hard-coded two-handed description. I.e., change this: `A powerful mage's staff. Powerfully DELICIOUS, we mean! Two-handed weapon. Increases Intelligence by <%= int %> and Perception by <%= per %>. Limited Edition 2013-2014 Winter Gear.` to this: `A powerful mage's staff. Powerfully DELICIOUS, we mean! Increases Intelligence by <%= int %> and Perception by <%= per %>. Limited Edition 2013-2014 Winter Gear.` Example of a two-handed item: ``` rancherLasso: { twoHanded: true, text: t('weaponArmoireRancherLassoText'), notes: t('weaponArmoireRancherLassoNotes', { str: 5, per: 5, int: 5 }), value: 100, str: 5, per: 5, int: 5, set: 'rancher', canOwn: ownsItem('weapon_armoire_rancherLasso'), }, ```
1.0
text_combine: two-handed weapons/shields aren't easily identifiable as such - Most two-handed weapons aren't easily identified as such because they don't have anything in their text to state that they are. We sometimes see bug reports from mobile app users when equipping a two-handed item unequips a shield. There are issues to give notifications about that when equipping them on the apps (https://github.com/HabitRPG/habitica-ios/issues/433 and https://github.com/HabitRPG/habitica-android/issues/688) but we should also include a message in each two-handed item's description on website and apps - i.e., the message should be in the data returned from the API's `content` route. For consistency, the message should be automatically added to the end of the item's description when the item has a true value for the `twoHanded` attribute (e.g., the rancherLasso below). I.e., the PR for this change will NOT change any of the `common/locales/en/` json files (with one exception listed below) but instead will modify the API's code to insert the message. The message should be something like "**Two-handed item.**" We won't use "Two-handed weapon" because future items might have more of a shield-like feel to them. The PR should change `weaponSpecialCandycaneNotes` in `common/locales/en/gear.json` to remove the hard-coded two-handed description. I.e., change this: `A powerful mage's staff. Powerfully DELICIOUS, we mean! Two-handed weapon. Increases Intelligence by <%= int %> and Perception by <%= per %>. Limited Edition 2013-2014 Winter Gear.` to this: `A powerful mage's staff. Powerfully DELICIOUS, we mean! Increases Intelligence by <%= int %> and Perception by <%= per %>. Limited Edition 2013-2014 Winter Gear.` Example of a two-handed item: ``` rancherLasso: { twoHanded: true, text: t('weaponArmoireRancherLassoText'), notes: t('weaponArmoireRancherLassoNotes', { str: 5, per: 5, int: 5 }), value: 100, str: 5, per: 5, int: 5, set: 'rancher', canOwn: ownsItem('weapon_armoire_rancherLasso'), }, ```
label: non_test
text: two handed weapons shields aren t easily identifiable as such most two handed weapons aren t easily identified as such because they don t have anything in their text to state that they are we sometimes see bug reports from mobile app users when equipping a two handed item unequips a shield there are issues to give notifications about that when equipping them on the apps and but we should also include a message in each two handed item s description on website and apps i e the message should be in the data returned from the api s content route for consistency the message should be automatically added to the end of the item s description when the item has a true value for the twohanded attribute e g the rancherlasso below i e the pr for this change will not change any of the common locales en json files with one exception listed below but instead will modify the api s code to insert the message the message should be something like two handed item we won t use two handed weapon because future items might have more of a shield like feel to them the pr should change weaponspecialcandycanenotes in common locales en gear json to remove the hard coded two handed description i e change this a powerful mage s staff powerfully delicious we mean two handed weapon increases intelligence by and perception by limited edition winter gear to this a powerful mage s staff powerfully delicious we mean increases intelligence by and perception by limited edition winter gear example of a two handed item rancherlasso twohanded true text t weaponarmoirerancherlassotext notes t weaponarmoirerancherlassonotes str per int value str per int set rancher canown ownsitem weapon armoire rancherlasso
binary_label: 0
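The Habitica record above describes a content-build rule: rather than hard-coding a two-handed notice into each locale string, the API should append a fixed message to an item's notes whenever the item's `twoHanded` attribute is true. Habitica's actual code is JavaScript; the following is only a minimal Python sketch of that pattern, and the names `TWO_HANDED_NOTE` and `append_two_handed_note` are hypothetical, not Habitica's API:

```python
# Fixed suffix appended to every two-handed item's description.
# (Hypothetical constant; the wording follows the issue's suggestion.)
TWO_HANDED_NOTE = "Two-handed item."

def append_two_handed_note(item: dict) -> dict:
    """Return a copy of `item` with the two-handed notice appended
    to its notes when the item is flagged as two-handed."""
    if item.get("twoHanded"):
        notes = item.get("notes", "").rstrip()
        item = {**item, "notes": f"{notes} {TWO_HANDED_NOTE}".strip()}
    return item

lasso = {"twoHanded": True, "notes": "A rancher's lasso."}
print(append_two_handed_note(lasso)["notes"])
# A rancher's lasso. Two-handed item.
```

Keeping the suffix in one place is what lets the PR delete the hard-coded "Two-handed weapon." sentence from `weaponSpecialCandycaneNotes` without losing the notice.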
Unnamed: 0: 97,918
id: 8,673,873,705
type: IssuesEvent
created_at: 2018-11-30 04:45:23
repo: humera987/FXLabs-Test-Automation
repo_url: https://api.github.com/repos/humera987/FXLabs-Test-Automation
action: closed
title: FXLabs Testing : ApiV1RunsIdTestSuiteSummarySearchGetQueryParamPageInvalidDatatype
labels: FXLabs Testing
body: Project : FXLabs Testing Job : UAT Env : UAT Region : US_WEST Result : fail Status Code : 404 Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=NmFlZWVlN2MtOThkNC00NWQzLTk2NWUtMjQ1OWMwNGEwZTE3; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Fri, 30 Nov 2018 04:27:32 GMT]} Endpoint : http://13.56.210.25/api/v1/api/v1/runs/kJOVhHvv/test-suite-summary/search?page=HCMgCW Request : Response : { "timestamp" : "2018-11-30T04:27:32.710+0000", "status" : 404, "error" : "Not Found", "message" : "No message available", "path" : "/api/v1/api/v1/runs/kJOVhHvv/test-suite-summary/search" } Logs : Assertion [@StatusCode != 401] resolved-to [404 != 401] result [Passed]Assertion [@StatusCode != 404] resolved-to [404 != 404] result [Failed] --- FX Bot ---
index: 1.0
text_combine: FXLabs Testing : ApiV1RunsIdTestSuiteSummarySearchGetQueryParamPageInvalidDatatype - Project : FXLabs Testing Job : UAT Env : UAT Region : US_WEST Result : fail Status Code : 404 Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=NmFlZWVlN2MtOThkNC00NWQzLTk2NWUtMjQ1OWMwNGEwZTE3; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Fri, 30 Nov 2018 04:27:32 GMT]} Endpoint : http://13.56.210.25/api/v1/api/v1/runs/kJOVhHvv/test-suite-summary/search?page=HCMgCW Request : Response : { "timestamp" : "2018-11-30T04:27:32.710+0000", "status" : 404, "error" : "Not Found", "message" : "No message available", "path" : "/api/v1/api/v1/runs/kJOVhHvv/test-suite-summary/search" } Logs : Assertion [@StatusCode != 401] resolved-to [404 != 401] result [Passed]Assertion [@StatusCode != 404] resolved-to [404 != 404] result [Failed] --- FX Bot ---
label: test
text: fxlabs testing project fxlabs testing job uat env uat region us west result fail status code headers x content type options x xss protection cache control pragma expires x frame options set cookie content type transfer encoding date endpoint request response timestamp status error not found message no message available path api api runs kjovhhvv test suite summary search logs assertion resolved to result assertion resolved to result fx bot
binary_label: 1
Unnamed: 0: 177,269
id: 13,689,084,146
type: IssuesEvent
created_at: 2020-09-30 12:41:43
repo: kubernetes-csi/csi-driver-smb
repo_url: https://api.github.com/repos/kubernetes-csi/csi-driver-smb
action: closed
title: reenable unit tests on Windows
labels: help wanted test
body: **Is your feature request related to a problem?/Why is this needed** <!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] --> **Describe the solution you'd like in detail** <!-- A clear and concise description of what you want to happen. --> Need to enable following unit tests on Windows since NodeXXX funcs would still run on Windows, should fix the test cases ``` ~/go/src/github.com/kubernetes-csi/csi-driver-smb# grep skipIfTestingOnWindows ./* -R ./pkg/smb/nodeserver_test.go: skipIfTestingOnWindows(t) ./pkg/smb/nodeserver_test.go: skipIfTestingOnWindows(t) ./pkg/smb/nodeserver_test.go: skipIfTestingOnWindows(t) ./pkg/smb/nodeserver_test.go: skipIfTestingOnWindows(t) ./pkg/smb/smb_test.go: skipIfTestingOnWindows(t) ``` **Describe alternatives you've considered** <!-- A clear and concise description of any alternative solutions or features you've considered. --> **Additional context** <!-- Add any other context or screenshots about the feature request here. -->
index: 1.0
text_combine: reenable unit tests on Windows - **Is your feature request related to a problem?/Why is this needed** <!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] --> **Describe the solution you'd like in detail** <!-- A clear and concise description of what you want to happen. --> Need to enable following unit tests on Windows since NodeXXX funcs would still run on Windows, should fix the test cases ``` ~/go/src/github.com/kubernetes-csi/csi-driver-smb# grep skipIfTestingOnWindows ./* -R ./pkg/smb/nodeserver_test.go: skipIfTestingOnWindows(t) ./pkg/smb/nodeserver_test.go: skipIfTestingOnWindows(t) ./pkg/smb/nodeserver_test.go: skipIfTestingOnWindows(t) ./pkg/smb/nodeserver_test.go: skipIfTestingOnWindows(t) ./pkg/smb/smb_test.go: skipIfTestingOnWindows(t) ``` **Describe alternatives you've considered** <!-- A clear and concise description of any alternative solutions or features you've considered. --> **Additional context** <!-- Add any other context or screenshots about the feature request here. -->
label: test
text: reenable unit tests on windows is your feature request related to a problem why is this needed describe the solution you d like in detail need to enable following unit tests on windows since nodexxx funcs would still run on windows should fix the test cases go src github com kubernetes csi csi driver smb grep skipiftestingonwindows r pkg smb nodeserver test go skipiftestingonwindows t pkg smb nodeserver test go skipiftestingonwindows t pkg smb nodeserver test go skipiftestingonwindows t pkg smb nodeserver test go skipiftestingonwindows t pkg smb smb test go skipiftestingonwindows t describe alternatives you ve considered additional context
binary_label: 1
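The csi-driver-smb record above is about removing a `skipIfTestingOnWindows(t)` gate from Go unit tests. The underlying pattern — skipping a test only on a given platform — is generic; here is a minimal Python sketch of it using `unittest`, where the helper name `skip_if_windows` is our own illustrative stand-in for the Go helper from the issue, not part of any real library:

```python
import platform
import unittest

def skip_if_windows(test_func):
    """Skip the decorated test when running on Windows, mirroring the
    intent of the skipIfTestingOnWindows(t) helper quoted in the issue."""
    return unittest.skipIf(platform.system() == "Windows",
                           "not yet supported on Windows")(test_func)

class NodeServerTests(unittest.TestCase):
    @skip_if_windows
    def test_node_stage_volume(self):
        # Placeholder for a real NodeXXX assertion; the issue's point is
        # that such tests should eventually run on Windows too.
        self.assertTrue(True)
```

Re-enabling the tests then amounts to deleting the decorator (or the Go equivalent, the `skipIfTestingOnWindows(t)` call) once the test body itself is made platform-clean.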
Unnamed: 0: 4,499
id: 2,729,760,122
type: IssuesEvent
created_at: 2015-04-16 10:43:01
repo: molgenis/molgenis
repo_url: https://api.github.com/repos/molgenis/molgenis
action: closed
title: chr:pos-pos is not working properly
labels: bug data-vcf dataexplorer release test v15.04
body: To reproduce Browser: SAFARI Plugin: Data Explorer Dataset loaded: molgenis-app/src/test/resources/mutations/test_samples.vcf Action: In the search box put "chr:10050000-10050003" Expected: ![screen shot 2015-04-16 at 12 24 30 pm](https://cloud.githubusercontent.com/assets/5586255/7178971/ae4e3c78-e433-11e4-9615-10393e91cb38.png) Result: ![screen shot 2015-04-16 at 12 26 22 pm](https://cloud.githubusercontent.com/assets/5586255/7179001/e423e8f2-e433-11e4-9127-4b1ed2865d25.png)
index: 1.0
text_combine: chr:pos-pos is not working properly - To reproduce Browser: SAFARI Plugin: Data Explorer Dataset loaded: molgenis-app/src/test/resources/mutations/test_samples.vcf Action: In the search box put "chr:10050000-10050003" Expected: ![screen shot 2015-04-16 at 12 24 30 pm](https://cloud.githubusercontent.com/assets/5586255/7178971/ae4e3c78-e433-11e4-9615-10393e91cb38.png) Result: ![screen shot 2015-04-16 at 12 26 22 pm](https://cloud.githubusercontent.com/assets/5586255/7179001/e423e8f2-e433-11e4-9127-4b1ed2865d25.png)
label: test
text: chr pos pos is not working properly to reproduce browser safari plugin data explorer dataset loaded molgenis app src test resources mutations test samples vcf action in the search box put chr expected result
binary_label: 1
Unnamed: 0: 281,176
id: 24,367,670,353
type: IssuesEvent
created_at: 2022-10-03 16:25:35
repo: Brain-Bones/skeleton
repo_url: https://api.github.com/repos/Brain-Bones/skeleton
action: closed
title: New Component: File Picker Dropzone
labels: ready to test feature request
body: References: https://mantine.dev/others/dropzone/ ![Screen Shot 2022-09-26 at 5 04 07 PM](https://user-images.githubusercontent.com/1509726/192388885-7c475ac1-11b0-4072-9e0e-84b2f1dc090b.png) This will work almost identical to the File Button component: - https://github.com/Brain-Bones/skeleton/issues/201#issuecomment-1254449966 The only difference is this doesn't include button UI, but rather keeps the input, but styles it to look like the visible droppable region shown above. Native browsers allow for drag and drop on input:file natively. Nothing special needed! Please follow our contribution guidelines when implementing any new component! https://skeleton.brainandbonesllc.com/docs/contributions
index: 1.0
text_combine: New Component: File Picker Dropzone - References: https://mantine.dev/others/dropzone/ ![Screen Shot 2022-09-26 at 5 04 07 PM](https://user-images.githubusercontent.com/1509726/192388885-7c475ac1-11b0-4072-9e0e-84b2f1dc090b.png) This will work almost identical to the File Button component: - https://github.com/Brain-Bones/skeleton/issues/201#issuecomment-1254449966 The only difference is this doesn't include button UI, but rather keeps the input, but styles it to look like the visible droppable region shown above. Native browsers allow for drag and drop on input:file natively. Nothing special needed! Please follow our contribution guidelines when implementing any new component! https://skeleton.brainandbonesllc.com/docs/contributions
label: test
text: new component file picker dropzone references this will work almost identical to the file button component the only difference is this doesn t include button ui but rather keeps the input but styles it to look like the visible droppable region shown above native browsers allow for drag and drop on input file natively nothing special needed please follow our contribution guidelines when implementing any new component
binary_label: 1
Unnamed: 0: 33,918
id: 4,865,720,650
type: IssuesEvent
created_at: 2016-11-14 21:34:42
repo: jruby/jruby
repo_url: https://api.github.com/repos/jruby/jruby
action: opened
title: [ruby 2.4] forwardable.rb is broken on non-MRI
labels: JRuby 9000 ruby 2.4 stdlib tests
body: See https://bugs.ruby-lang.org/issues/12938. forwardable.rb now unconditionally uses RubyVM features not supported on any other Ruby impl. This prevents any tests that depend on forwardable (at boot) from loading properly.
index: 1.0
text_combine: [ruby 2.4] forwardable.rb is broken on non-MRI - See https://bugs.ruby-lang.org/issues/12938. forwardable.rb now unconditionally uses RubyVM features not supported on any other Ruby impl. This prevents any tests that depend on forwardable (at boot) from loading properly.
label: test
text: forwardable rb is broken on non mri see forwardable rb now unconditionally uses rubyvm features not supported on any other ruby impl this prevents any tests that depend on forwardable at boot from loading properly
binary_label: 1
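The jruby record above reports stdlib code (forwardable.rb) calling `RubyVM` — a CRuby-only facility — unconditionally, which breaks every other Ruby implementation at load time. The fix class for such bugs is feature detection with a fallback. As a loose analogy only (this is not JRuby's or CRuby's code), here is a Python sketch using `sys._getframe`, a CPython-specific function, in the role `RubyVM` plays in the report; the helper names are our own:

```python
import sys

def call_trace_supported() -> bool:
    """Feature-detect an implementation-specific facility instead of
    assuming it exists. `sys._getframe` is CPython-specific, much as
    RubyVM is CRuby-specific in the forwardable.rb report."""
    return hasattr(sys, "_getframe")

def caller_name(default: str = "<unknown>") -> str:
    """Name of the calling function when the interpreter can tell us,
    otherwise a safe fallback -- so non-CPython interpreters still load."""
    if call_trace_supported():
        return sys._getframe(1).f_code.co_name
    return default
```

Guarding the fast path this way keeps the module importable everywhere, which is exactly what the report says forwardable.rb stopped doing for tests that load it at boot.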