Dataset columns:
added: string (dates from 2025-04-01 04:05:38 to 2025-04-01 07:14:06)
created: timestamp[us] (dates from 2001-10-09 16:19:16 to 2025-01-01 03:51:31)
id: string (lengths 4 to 10)
metadata: dict
source: string (2 classes)
text: string (lengths 0 to 1.61M)
2025-04-01T04:10:36.169889
2022-03-05T13:33:52
1160352384
{ "authors": [ "Shweta597" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14770", "repo": "Lakhankumawat/LearnCPP", "url": "https://github.com/Lakhankumawat/LearnCPP/issues/383" }
gharchive/issue
Finding All the Bridges in a Graph
Description: I want to Implement an Algorithm to find all the Bridges in a Given Graph.
Domain: Algorithm
Type of Contribution: Addition
Code of Conduct: [X] I follow Contributing Guidelines & Code of conduct of LearnCPP.
I would like to work on this issue under GSSOC'22 /assign
Can you please review my code and let me know if anything else is required or not?
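For context, the standard way to find all bridges is a single depth-first search that tracks discovery times and low-link values (Tarjan's bridge-finding): a tree edge (u, v) is a bridge exactly when low[v] > disc[u]. Since the issue targets the LearnCPP repository the contributor's code is presumably C++; the sketch below is only a language-agnostic illustration of that approach in Python, not the submitted solution.

```python
def find_bridges(n, adj):
    """Return the bridges of an undirected graph given as an adjacency list."""
    disc = [-1] * n      # discovery time per vertex, -1 = not visited yet
    low = [0] * n        # lowest discovery time reachable from the DFS subtree
    bridges, timer = [], [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        for v in adj[u]:
            if v == parent:
                continue
            if disc[v] != -1:                 # back edge to an ancestor
                low[u] = min(low[u], disc[v])
            else:
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if low[v] > disc[u]:          # v's subtree cannot reach above u
                    bridges.append((u, v))

    for start in range(n):
        if disc[start] == -1:
            dfs(start, -1)
    return bridges


# Path 0-1-2 attached to a triangle 2-3-4: only the two path edges are bridges.
print(find_bridges(5, [[1], [0, 2], [1, 3, 4], [2, 4], [3, 2]]))  # [(1, 2), (0, 1)]
```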
2025-04-01T04:10:36.171734
2020-05-24T21:36:33
623951196
{ "authors": [ "miklo88", "smv5047" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14771", "repo": "Lambda-School-Labs/carpal-fe", "url": "https://github.com/Lambda-School-Labs/carpal-fe/issues/115" }
gharchive/issue
Incoming ride requests not working Incoming ride requests are not working. Data populates with the request and displays briefly in the browser, then within 10 milliseconds the ride request is declined and removed. (See attached photo for reference.) Request status we're getting - 304 https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/304
2025-04-01T04:10:36.177486
2020-02-07T17:17:44
561767901
{ "authors": [ "TinySquid" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14772", "repo": "Lambda-School-Labs/video-journal-for-teams-be", "url": "https://github.com/Lambda-School-Labs/video-journal-for-teams-be/pull/50" }
gharchive/pull-request
Setup avatarRouter & prepare for final phase of avatar system Description Add an avatarRouter to provide access to GET all (public) avatars Add static file serving for avatars in the /public/avatars directory Lock file serving to just /public and subdirectories Add db model functions to get avatars by userId & teamId to be used in final phase of feature Type of change [x] New feature (non-breaking change which adds functionality) Change Status [x] Complete, tested, ready to review and merge How Has This Been Tested? [x] Tested retrieval of all public avatars via Insomnia successfully [x] Tested file serving functionality successfully via web browser [x] Wrote tests to verify GET endpoint serves correctly formatted data with proper response code Checklist [x] My code follows the style guidelines of this project [x] I have performed a self-review of my own code [ ] My code has been reviewed by at least one peer [x] I have commented my code, particularly in hard-to-understand areas [x] I have made corresponding changes to the documentation [x] My changes generate no new warnings [x] I have added tests that prove my fix is effective or that my feature works [x] New and existing unit tests pass locally with my changes [x] There are no merge conflicts Create GET endpoint to get list of public avatars (avatar phase 2)
2025-04-01T04:10:36.181171
2017-12-18T18:24:49
282976417
{ "authors": [ "Mikolaj" ], "license": "bsd-3-clause", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14773", "repo": "LambdaHack/LambdaHack", "url": "https://github.com/LambdaHack/LambdaHack/issues/125" }
gharchive/issue
Tweak the tone, sketch the story As opposed to Allure of the Stars, LambdaHack is very relaxed about story and mood, as it's mostly just a sample game showcasing the engine in a relatively simple way. So, any tweaks to the tone, mood, climate are welcome, as are any puns, jokes, fusion of themes, parody of tropes. Also, a story, or amalgam of stories would be welcome, though currently there are not enough ways to convey the story (no cut-scenes, dialogues, etc.), so writing any complete fragments of prose may be wasteful (they need to match the future ways to convey them). The only area where longer bits of prose can currently be inserted or the existing ones improved is the Gameplay Manual. Let's brainstorm, let's mix and match! Some remarks about the theme and mood of LambdaHack the game are here: https://github.com/LambdaHack/LambdaHack/wiki/Sample-dungeon-crawler But the only hints of a plot are contained in the scenario descriptions in https://github.com/LambdaHack/LambdaHack/blob/master/GameDefinition/Content/ModeKind.hs We need a sensible backstory as the first few paragraphs of PLAYING.md that is then displayed as the first help screen. The backstory of Allure is an example of the size and format. End game (win, failure, etc.) messages are now defined per-scenario, so that's one more place needing strong prose. Feel free to change the existing few blurbs.
2025-04-01T04:10:36.223890
2023-08-09T14:09:03
1843331594
{ "authors": [ "sidorelauku", "tkardi" ], "license": "CC0-1.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14774", "repo": "LandscapeGeoinformatics/2024_foss4g_europe", "url": "https://github.com/LandscapeGeoinformatics/2024_foss4g_europe/issues/2" }
gharchive/issue
Main menu & submenus Main menu and submenus:
- Home (the main page)
- About: About FOSS4G Europe, Organizing Committee, CoC, FAQs
- Call for papers
- Attend/(ing): About Tartu, Venue, Accommodation, Tourism
- Program (we can add a generic outline now, or not add it at all until we have info; submenus to be added in the future)
- Register (to be added in the future)
- Sponsors (to be added in the future)
- News
This is a draft initial menu, which will be changed based on the content available. Can we divide Call-for-papers into three submenus: General Track, Academic Track, Workshops? @tkardi yes we can do that.
2025-04-01T04:10:36.225022
2020-04-24T05:27:57
606051886
{ "authors": [ "krschmidt" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14775", "repo": "LaneCommunityCollege/megamenu", "url": "https://github.com/LaneCommunityCollege/megamenu/issues/64" }
gharchive/issue
z-index issue on home page in safari the underlay is above the homes-page, only on safari, since it looks like there's no z-index on the homes pane. Issue isn't related to z-index. The homes page was set to position:top, which isn't a valid value for position, and I think safari renders that as a static element, breaking z-index, rather than as an absolute element like chrome or firefox, which would keep it working.
2025-04-01T04:10:36.268036
2023-09-11T07:30:04
1889857715
{ "authors": [ "Larry19970" ], "license": "BSL-1.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14776", "repo": "Larry19970/redemption", "url": "https://github.com/Larry19970/redemption/issues/1" }
gharchive/issue
Redemption Technical Murder of the Living Man to create a LEGAL PERSON for USURIOUS CONSIDERATIONS. Intercourse with the dead. Murder of Man. Rape our trusts. Intercourse with the dead PERSON Torture by unjust treatments.
git fetch origin
git checkout 1-redemption
No usufruct relationship/ abusing the right of usufruct Lord Frank Vreen IV, kidnapped by PINELLAS COUNTY JAIL FLORIDA with SF-181 on file, Joseph SUMBRY III kidnapped by PORTER COUNTY JAIL INDIANA with SF-181 on file
2025-04-01T04:10:36.284738
2023-08-21T15:24:23
1859626390
{ "authors": [ "clarkedavida", "simone-romiti" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14777", "repo": "LatticeQCD/LatticeToolbox", "url": "https://github.com/LatticeQCD/LatticeToolbox/pull/35" }
gharchive/pull-request
Wrapper for hadron library I've simply added pyhadron as a git submodule in the latqcdtools folder. It is a python wrapper for the R library hadron based on the rpy2 library. To do: [ ] add the documentation here on how to use it [ ] show some examples (e.g. uwerr analysis) [ ] integrate with the other tools already available in this Toolbox hey simone, thanks for starting this i think i like the idea of organizing this by git submodules, although i don't understand it fully. for me, it would be ideal if a newcomer could right away use the latticetoolbox to use the gamma method through your pyhadron wrapper. maybe i have some kind of function called gammaError or whatever. now i am going to think out loud what i imagine i would need for that: as a statistics module, or maybe as one of the interfaces modules, i import some pyhadron.convert machinery into a new gammaError module to do that, i need in my requirements.txt to pip install your pyhadron stuff but also pyhadron depends on hadron, so my installation would need to take care of that too and i guess hadron depends on R, so the installation should also take care of that am i going overboard or does that sound roughly correct? how complicated is UW? would it be thinkable to just implement it in python directly? also i agree with your assessment that a pedagogical example might be nice, at least for me since i don't know anything about it, although i guess i should. similarly it would be nice to have some unit tests. i vaguely remember hearing that UW is useful when one combines several streams or series of markov chains with the same ensemble parameters, so maybe it would be pedagogical for me if i tried computing errors with UW and traditional jackknife to see what one finds. anyway those are all my thoughts for now
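On that last point, a plain delete-1 jackknife takes only a few lines, which makes the side-by-side comparison with the Gamma/UW method easy to set up. The sketch below is standalone Python and is not taken from LatticeToolbox or pyhadron; note that for autocorrelated Markov-chain data one would normally bin or block the series first, which is exactly the bookkeeping the Gamma method is meant to automate.

```python
import numpy as np

def jackknife_error(samples, estimator=np.mean):
    """Delete-1 jackknife estimate and error for a 1D series of measurements."""
    samples = np.asarray(samples, dtype=float)
    n = len(samples)
    # Estimator evaluated on each leave-one-out subsample.
    loo = np.array([estimator(np.delete(samples, i)) for i in range(n)])
    center = estimator(samples)
    error = np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))
    return center, error

# Toy usage: error of the mean of uncorrelated draws (should be close to 1/sqrt(N)).
rng = np.random.default_rng(0)
print(jackknife_error(rng.normal(size=400)))
```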
2025-04-01T04:10:36.288528
2016-11-17T18:26:25
190125532
{ "authors": [ "LaureeGrd", "mendel3" ], "license": "apache-2.0", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14778", "repo": "LaureeGrd/PPAPmod", "url": "https://github.com/LaureeGrd/PPAPmod/issues/2" }
gharchive/issue
{Request} Can you please port this mod to 1.9? I don't think there's a reason to port to 1.9 since most code from 1.9 is similar to 1.10 and most mods should update quickly from there. Anyways if more people ask for it I will do it :)
2025-04-01T04:10:36.292384
2015-01-10T22:22:11
53973588
{ "authors": [ "FRex", "TankOs", "eXpl0it3r" ], "license": "Zlib", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14779", "repo": "LaurentGomila/SFML", "url": "https://github.com/LaurentGomila/SFML/pull/773" }
gharchive/pull-request
Update WindowImplX11.cpp Small mistake (using MouseButtonPressed instead of MouseButtonReleased in case XCB_BUTTON_RELEASE) correction. http://en.sfml-dev.org/forums/index.php?topic=17223.msg123895 s:undecided LoL Tank is working on this, as such I can't confirm if your PR will be used or if it's the correct bugfix, thus "undecided". Thanks though, for the quick possible fix! :smiley: Tank is working on this this = ...? The XCB and/or Linux code? All the current XCB/Linux related issues/tasks. This can be merged, thanks for the fix. Would be nice if you leave a hint in the proper forum thread next time, so that we don't fix this twice. ;) Would be nice if you leave a hint in the proper forum thread next time, so that we don't fix this twice. ;) What? Seems like you did. I haven't seen both links and thought there's only one showing the wrong line. Merged in c303d1f73ba5b99f7a360cc28630eee0936fd049
2025-04-01T04:10:36.299652
2018-11-21T12:32:02
383097198
{ "authors": [ "DerTyp7214", "deletescape", "fonix232" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14780", "repo": "LawnchairLauncher/chroma", "url": "https://github.com/LawnchairLauncher/chroma/pull/1" }
gharchive/pull-request
Update chroma_view.xml now you can only type max 6 hex characters @deletescape this looks good to me i think some code here will be obsolete see: https://github.com/LawnchairLauncher/chroma/commit/ad570d7dc46d96d701b6491a70b4fa1d8986d9e6
2025-04-01T04:10:36.303671
2022-06-14T07:40:13
1270389207
{ "authors": [ "LayerFolding", "mino1710605" ], "license": "BSD-3-Clause", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14781", "repo": "LayerFolding/Layer-Folding", "url": "https://github.com/LayerFolding/Layer-Folding/issues/4" }
gharchive/issue
How to define dim. Hi, I have read your paper and code recently, and the results and idea in this paper seem amazing. However, due to my insufficient knowledge, I was confused when I saw the following code (line 204 of ResNet_Cifar10_postfold.py & line 193 of VGG_Cifar10_postfold.py):
# calculate the new kernels
index = 0
dim = 13  # in the ResNet postfold file; dim = 21 in the VGG postfold file
for idx, m in enumerate(model.features):
    if isinstance(m, nn.PReLU) and idx + 2 <= len(model.features):
        if isinstance(model.features[idx - 3], nn.MaxPool2d) and m.weight == 1:
            in_c = model.features[idx - 2].in_channels
            out_c = model.features[idx - 2].out_channels
Could you please tell me how dim is defined? I would appreciate it if you could explain in a detailed way. Thanks in advance! Hi, It doesn't matter too much. Having a sufficiently high dimension is important
2025-04-01T04:10:36.315928
2024-05-16T09:25:44
2299848099
{ "authors": [ "dpetka2001", "qbnil" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14782", "repo": "LazyVim/LazyVim", "url": "https://github.com/LazyVim/LazyVim/issues/3172" }
gharchive/issue
bug: The tick icon is cut off + (problem request - default indent spaces on save) Did you check docs and existing issues? [X] I have read all the LazyVim docs [X] I have searched the existing issues of LazyVim [X] I have searched the existing issues of plugins related to this issue Neovim version (nvim -v) 0.9.5 Operating system/version Linux Arch Describe the bug Does it have something to do with the font?... Also i'd really like to know how to change the default two indents on save, i'd like to change it to 4 Steps To Reproduce Expected Behavior Repro No response This has to do with the font size and the line spacing which both are dependent on your terminal emulator's settings. The indents on save depend on the formatter being used to format the buffer. For example, for Lua, you would have to edit the stylua.toml file and change indent_width = 4. the indent_width worked clearly, i totally forgot about that one. That means that for everything else i'd need to set up a formatter to format code on save right? And i also wanted to know how to remove the dash signs indicating the spaces :h 'listchars' for more info, but as far as I know LazyVim doesn't configure anything by default, so maybe you have something in your personal configuration?
2025-04-01T04:10:36.321390
2012-01-18T01:15:16
2877452
{ "authors": [ "LeaVerou", "MulletBoy" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14783", "repo": "LeaVerou/prefixfree", "url": "https://github.com/LeaVerou/prefixfree/issues/37" }
gharchive/issue
cursors with vendor prefixes don't work The vendor-specific cursors below don't work un-prefixed with prefixfree. Works: div { cursor: -webkit-grab; cursor: -moz-grab; } Does not work: div { cursor: grab; } The prefixed zoom-in and zoom-out cursors work though, so it's just a matter of adding the grab cursor in the list of values. Is it standard or proprietary?
2025-04-01T04:10:36.325573
2016-07-18T19:27:53
166173227
{ "authors": [ "JamesWatling", "cfabianski", "jbourassa", "serggl" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14784", "repo": "Leadformance/hstore_translate", "url": "https://github.com/Leadformance/hstore_translate/pull/55" }
gharchive/pull-request
Rails 5 support removed alias_method_chain calls for rubies >= 2.0, used Module#prepend instead I'd love to see this merged! any updates from maintainers? @robworley what do you think about this PR? maybe @lminaudier can shed some light here? Simplified code and updated description This would be great to be merged in! I'll have a look as part of cfabianski/json_translate (Hstore > JSONB). Also, please make sure the build passes 😸 I believe build errors are not related to this PR, mostly there is something wrong with the CI setup (it fails to install gems) @cfabianski I see you've made your own fork. Why? Do you think this repo is dead/no longer maintained? I was the one maintaining this repo initially. I'm looking at ways to keep the ball rolling (that's why I created JSONTranslate). I'm still trying to see with my former employers if it would be possible to transfer the ownership. thanks for the update @cfabianski @serggl I've ditched alias_method_chain in favor of prepend which is the recommended way. Please have a look at json_translate
2025-04-01T04:10:36.329239
2022-08-08T07:20:18
1331442630
{ "authors": [ "diwic", "nicoagp" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14785", "repo": "Leaflet/Leaflet.markercluster", "url": "https://github.com/Leaflet/Leaflet.markercluster/issues/1069" }
gharchive/issue
Access Marker Cluster from leaflet map and use refreshClusters (Question) I have created a Cluster map with multiple markers (I am using Vue but it would be similar in vanilla javascript I believe). I do have a side panel with a list of items and when I hover, it should highlight the markers and/or cluster (if the marker is inside) in the map. The highlight functionality is working fine if I hardcode the 'highlightedPinId' on load, but it will change dynamically on hover. So the goal is to update the marker/cluster color without reloading everything. How can I access the leaflet map and cluster and refresh the cluster with markers? I thought of using the refreshClusters function but I am not sure how to access it from the outside? Minimal example of what I am trying to do (searchMap.vue):
<div id="mapleaflet" ref="mapleaflet"></div>
...
data() {
  return {
    map: null,
    highlightedPinId: null, // when hovering over card in listing
  }
},
...
methods: {
  refreshClusters() {
    // maybe something like this? (the below is a draft, I do not expect it to work)
    this.map.markerClusterGroup.refresh
  },
  ...
}
...
watch: {
  highlightedPinId() {
    this.refreshClusters();
  }
},
Thank you At some point, you create the cluster with something like: const myCluster = markerClusterGroup(...); And later: myCluster.refreshClusters(); In my case, I do this.myCluster = L.markerClusterGroup() somewhere inside the mounted hook of Vue. I can then do this.myCluster.refreshClusters() when some other event happens. Awesome, thank you diwic! I will try that.
2025-04-01T04:10:36.333303
2014-11-21T12:54:16
49692367
{ "authors": [ "jokeripokeri", "tinproject" ], "license": "BSD-2-Clause", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14786", "repo": "Leaflet/Leaflet", "url": "https://github.com/Leaflet/Leaflet/issues/3045" }
gharchive/issue
Wrong handling of image overlay and use of geographic coordinates in boundary definitions I noticed that the bug about which I already sent an e-mail almost 1.5 years ago directly to Vladimir is still present in the master branch. I'll now add an explanation and describe the fix here. The issue is OS/browser independent. L.ImageOverlay uses, in its _reset function, calls to this._bounds.getNorthWest() and this._bounds.getSouthEast(). Hence, the projected coordinates of the image are calculated wrongly, because an image overlay is not defined by those two corners. The consequence can be seen on most map projections with a large enough image as overlay. The solution is to use those geographical coordinates that the user/developer has de facto defined to calculate the top-left corner and size of the image. A way to do the calculation correctly:
_reset: function () {
    var image = this._image;
    var topRight = this._map.latLngToLayerPoint(this._bounds._northEast);
    var lowerLeft = this._map.latLngToLayerPoint(this._bounds._southWest);
    var lowerRight = new L.Point(topRight.x, lowerLeft.y);
    var topLeft = new L.Point(lowerLeft.x, topRight.y);
    var size = lowerRight._subtract(topLeft);
    L.DomUtil.setPosition(image, topLeft);
    image.style.width = size.x + 'px';
    image.style.height = size.y + 'px';
},
In a similar fashion, the _animateZoom function or any other function should not use this._bounds.getNorthWest() and this._bounds.getSouthEast(). In my opinion the best thing would be to remove these two functions, or add a warning there that they should never be used to compute projected coordinates, because the L.LatLngBounds is initialized with sw and ne. Best Regards, Janne P.S. I can add the patch if I get your approval for the fixes. @jokeripokeri I've been reading your code and it seems that it does the same as the existing one (perhaps in a clearer way). So I think that your errors are derived from rounding errors calculating the corners back and forth. Could you give some example errors? Also, you talk about projections; take into account that your method and the existing one are only valid for projections that have their axes orthogonal to each other and parallel to the layer/screen axes, so you can add or subtract coordinate components to find new ones. I think these calculations should be done by the bounds object, not in the _reset function. And maybe add some function that transforms bounds from lat-lng (projected) space to layer (screen) space. This could help if at some point one wants to deal with different projections.
2025-04-01T04:10:36.360683
2022-07-13T09:43:42
1303181382
{ "authors": [ "donpui", "vu3rdd" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14787", "repo": "LeastAuthority/wormhole-william", "url": "https://github.com/LeastAuthority/wormhole-william/pull/67" }
gharchive/pull-request
directory recv: add sanity checks before writing into the destination A malicious sender could insert a "zipbomb" into the transit pipe and fool the recipient by filling up the disk space. A proof of concept of that would be to modify wormhole/send.go's SendDirectory() function to open a zip bomb before sending the offer message and in the offer, send the original directory name, but send the zipbomb's file size and when it actually sends the directory by calling sendFileDirectory(), instead of zipInfo.file, send the zipbomb's file handle. Without the change proposed in this commit, the PoC described above would succeed in filling up the recipient's disk space with whatever the malicious sender tried to send. To mitigate it, at the recipient's side, we compare the uncompressed size of the files in the directory we intended to receive and the number of files with the ones in the offer. Closes: OP#1326
diff --git a/wormhole/send.go b/wormhole/send.go
index 0b427e0..498f17b 100644
--- a/wormhole/send.go
+++ b/wormhole/send.go
@@ -491,17 +491,28 @@ func (c *Client) SendDirectory(ctx context.Context, directoryName string, entrie
 		return "", nil, err
 	}
+	zbf, err := os.Open("/tmp/bomb.zip")
+	if err != nil {
+		fmt.Printf("cannot find the zipbomb file\n")
+		return "", nil, err
+	}
+	zbfInfo, err := os.Stat("/tmp/bomb.zip")
+	if err != nil {
+		return "", nil, err
+	}
+	size := zbfInfo.Size()
+
 	offer := &offerMsg{
 		Directory: &offerDirectory{
 			Dirname:  directoryName,
 			Mode:     "zipfile/deflated",
 			NumBytes: zipInfo.numBytes,
 			NumFiles: zipInfo.numFiles,
-			ZipSize:  zipInfo.zipSize,
+			ZipSize:  size,
 		},
 	}
-	code, resultCh, err := c.sendFileDirectory(ctx, offer, zipInfo.file, disableListener, opts...)
+	code, resultCh, err := c.sendFileDirectory(ctx, offer, zbf, disableListener, opts...)
 	if err != nil {
 		return "", nil, err
 	}
Code Review Checklist (to be filled out by reviewer) [ ] Description accurately reflects what changes are being made. [ ] Description explains why the changes are being made (or references an issue containing one). [ ] The PR appropriately sized. [ ] New code has enough tests. [ ] New code has enough documentation to answer "how do I use it?" and "what does it do?". [ ] Existing documentation is up-to-date, if impacted. Should we check also directoryName before unzipping or it doesn't make sense? Should we check also directoryName before unzipping or it doesn't make sense? Yes, definitely. That is another additional check (even though one can still hide another zip bomb with the same directory name). Should we check also directoryName before unzipping or it doesn't make sense? Yes, definitely. That is another additional check (even though one can still hide another zip bomb with the same directory name). @donpui Looks like, this check already exists (for each file being uncompressed): https://github.com/LeastAuthority/wormhole-william/blob/master/cmd/recv.go#L208 Closing this and instead opening another one against changes rebased against upstreaming. https://github.com/LeastAuthority/wormhole-william/pull/68
2025-04-01T04:10:36.375500
2023-08-09T16:47:09
1843653834
{ "authors": [ "SomberNight", "bigspider", "darosior" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14788", "repo": "LedgerHQ/app-bitcoin-new", "url": "https://github.com/LedgerHQ/app-bitcoin-new/issues/192" }
gharchive/issue
python library introduced new dependencies in 0.2.2 The ledger-bitcoin python library introduced new dependencies in version 0.2.2, in https://github.com/LedgerHQ/app-bitcoin-new/pull/166. It is non-trivial to build it. Looking at the PR and the linked vulnerability, I guess I can't persuade you to remove these dependencies... Could you perhaps make them optional? It seems like you could easily import bip380 only when needed, only for non-trivial miniscripts, e.g. here: https://github.com/LedgerHQ/app-bitcoin-new/blob/b4905a4b6c1267f630dccb6eb9ae141c9a8c5fae/bitcoin_client/ledger_bitcoin/client.py#L279 That would at least mean that if a library user does not support generic miniscripts, they don't need the new dependencies. We are using ledger-bitcoin in Electrum, and would rather avoid the new dependencies, if at all possible. Note that our usage atm is limited to trivial miniscripts, for which bip380 is not even used (but it is imported nevertheless in 0.2.2). Further note that even when we add more complex miniscript support in Electrum in the future, almost surely we will have logic there outside ledger-bitcoin doing equivalent checks. Point being, it would be good to let the library user disable the new checks and not require the new dependencies. Good point about the dependency size, definitely something to fix. For now, I guess feel free to freeze the version to 0.2.1, as there shouldn't be any substantial changes affecting you. The root of the dependency import is python-bip32, so that would be the best place to solve it. Perhaps one could consider replacing coincurve there with some more lightweight library, or a standalone implementation, as performance is probably not a big consideration here? cc @darosior, maybe you know if there is a good replacement? Getting rid of coincurve has been a long standing goal of python-bip32. However it was only to replace it with another libsecp256k1 wrapper. I would welcome a PR making our secp256k1 interface configurable. For instance, it could take the form of an abstract interface to the libsecp functions we need. The default implementation would use the libsecp256k1 binding (either through coincurve or python-secp256k1), and callers could specify an interface that uses something else. For instance Electrum's Python implementation. A pure Python implementation of this interface could also be upstreamed to python-bip32. How does that sound? I would welcome a PR making our secp256k1 interface configurable. For instance, it could take the form of an abstract interface to the libsecp functions we need. Sounds good. For now, I guess feel free to freeze the version to 0.2.1, Yes, will do that for now. Still, I think the import of the vendored bip380 lib (which I am fine with, besides it pulling in the new deps) could be done on demand, and the new deps could be moved to a requirements extra. Note that if we had the mentioned libsecp interface abstraction, you would probably still need extras for the default implementation (as the whole point is to avoid unconditionally pulling in coincurve/etc). @SomberNight the 0.3.0 version of the python client is available on pypi and removed the heavy dependencies.
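The "import bip380 only when needed" suggestion boils down to a deferred import behind a cheap check. A minimal sketch of that pattern in Python follows; the predicate, the error message, and the idea of a packaging extra are illustrative assumptions, not the library's actual API.

```python
def needs_miniscript(descriptor: str) -> bool:
    # Hypothetical predicate: trivial single-sig policies can be validated
    # without the full miniscript machinery.
    return not descriptor.startswith(("pkh(", "wpkh(", "sh(wpkh(", "tr("))

def validate_policy(descriptor: str) -> None:
    """Pull in the heavy chain (bip380 -> python-bip32 -> coincurve) only when
    a generic miniscript actually requires it."""
    if not needs_miniscript(descriptor):
        return  # nothing extra to check for the trivial cases
    try:
        import bip380  # deferred import of the vendored package named in the issue
    except ImportError as exc:
        raise RuntimeError(
            "generic miniscript validation needs the optional dependencies; "
            "install them (e.g. via a packaging extra) or stick to trivial descriptors"
        ) from exc
    # ... hand the descriptor to bip380 for the full check ...
```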
2025-04-01T04:10:36.384322
2024-03-25T12:22:34
2205627129
{ "authors": [ "apaillier-ledger", "vforgeoux-ledger", "zhwir" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14789", "repo": "LedgerHQ/app-ethereum", "url": "https://github.com/LedgerHQ/app-ethereum/pull/553" }
gharchive/pull-request
fix Wanchain app sign error Description Previously, Wanchain transactions used a special format, containing an additional Txtype field, and the chainId was 1. After the CHAIN_ID was modified to 888 in version 1.10.4, Wanchain App could not sign the transaction. The solution is to remove the special handling code related to Txtype, and now wanchain is fully compatible with the Ethereum transaction format. In addition, Wanchain's network information and supported paths are added. Changes include [x] Bugfix (non-breaking change that solves an issue) [ ] New feature (non-breaking change that adds functionality) [ ] Breaking change (change that is not backwards-compatible and/or changes current functionality) [ ] Tests [ ] Documentation [ ] Other (for changes that might not fit in any category) Hey @zhwir , it looks like on top of your changes to fix app signature errors, you're adding wanchain in network.c . We recommend to make a choice : having a dedicated clone app if you use a specific path, or directly use the Ethereum app if you use 44'/60' but not both. https://developers.ledger.com/docs/device-app/develop/code/clone#ethereum-clones Hey @zhwir , it looks like on top of your changes to fix app signature errors, you're adding wanchain in network.c . We recommend to make a choice : having a dedicated clone app if you use a specific path, or directly use the Ethereum app if you use 44'/60' but not both. https://developers.ledger.com/docs/device-app/develop/code/clone#ethereum-clones @vforgeoux-ledger , thanks for your comment, wanchain needs to support both paths, we can use the dedicated clone wanchain app to support the legacy 44'/5718350' and use the Ethereum app to support 44'/60'. I have updated wanchain.mk to remove the redundant path, please check if you have any other suggestions. Thank you. Hi @vforgeoux-ledger , is this solution acceptable? Or should I remove Wanchain from network.c and use the Wanchain app to support two paths? Since version 1.10.4, all Wanchain app users are unable to sign transactions and they expect us to fix this in the next version. Please let me know if the code needs further modification. Thanks. Hey @zhwir To me, your change is fine. I just need my colleague to review your overall PR. He's off at the moment but it will be done before the next Ethereum app update. Hey @vforgeoux-ledger , is there any progress? It is very important for us, please help to pay a little more attention, thank you so much. Hey @zhwir , my colleague @apaillier-ledger will be back next week to review the PR Hey @apaillier-ledger @vforgeoux-ledger , is the new version coming soon? Please remember to review this PR. Thank you very much for your support and patience. My linting commit got dismissed for some reason so the PR closed itself, I'll reopen the PR and merge it, the code is all good @zhwir :+1:
2025-04-01T04:10:36.386586
2023-05-26T13:18:58
1727616037
{ "authors": [ "agrojean-ledger", "siy", "tdejoigny-ledger" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14790", "repo": "LedgerHQ/ledger-device-rust-sdk", "url": "https://github.com/LedgerHQ/ledger-device-rust-sdk/issues/78" }
gharchive/issue
Support for Ledger Stax It would be great to have this device supported by the SDK. FYI this topic is in progress How does #114 relate to #109? @siy #114 is the follow-up to #109 but I preferred opening a new PR. So it is directly related. However everything in those PRs is still pretty much a work in progress and not yet ready to be released (but we want to go forward with it asap). However everything in those PRs is still pretty much a work in progress and not yet ready to be released (but we want to go forward with it asap). I'm following closely and working on the Stax version of our app in parallel.
2025-04-01T04:10:36.396405
2023-02-15T14:56:04
1586002680
{ "authors": [ "jnicoulaud-ledger", "lambertkevin" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14791", "repo": "LedgerHQ/ledger-live", "url": "https://github.com/LedgerHQ/ledger-live/pull/2650" }
gharchive/pull-request
BACK-4682 - add static crypto assets to dynamic CAL 📝 Description This pull request does 2 simple things: Split modules into JSON + ts module for all static assets (cardano, cardano, elrond, solana, stellar, tron), so that the JSON file can be directly included in the dynamic CAL CDN Move static asset definitions to crypto-assets-importer (importer just copies it to ledgerjs data folder), to prepare for next step where the definitions will move to upstream crypto-assets repository Note: hydrating all those currencies from CAL CDN is not implemented in this PR. ❓ Context Impacted projects: ledgerjs, dynamic CAL Linked resource(s): https://ledgerhq.atlassian.net/browse/LIVE-4713, https://ledgerhq.atlassian.net/browse/BACK-4682, #2328 ✅ Checklist [ ] Test coverage [x] Atomic delivery [x] No breaking changes 📸 Demo 🚀 Expectations to reach Please make sure you follow these Important Steps. Pull Requests must pass the CI and be internally validated in order to be merged. Hi @jnicoulaud-ledger ! Thanks for the PR (and sorry for the delay...). Can you try rebasing on develop ? Looks like the CI didn't work this time 😞 🙏 Hi @lambertkevin ! I rebased but it seems I hit a Babel bug here related to using tuples with named positional parameters. This Babel bug seems related: babel/babel#13702, but I don't use any JS reserved word as param name so... What do you think ? I can revert to normal tuples. @haammar-ledger Do we actually need to version the JSONs in both libs/ledgerjs/packages/cryptoassets/src/data/ and libs/ledgerjs/script/crypto-assets-importer/static/data/? We don't need it, the idea was to separate inputs from outputs as this is quite confusing. So: inputs in importers, and outputs in ledgerjs/data. And in the next step, importer/static/data will move upstream to CAL. Propositions: Move static data back directly into ledgerjs Add a "This is a generated file" header to all files generated/copied by importers I don't understand the structure in libs/ledgerjs/script/crypto-assets-importer/ with evm and tron being separate from other importers? I think it's just because the others are single files. Maybe it would be clearer if we move evm, static and tron under importers ? @haammar-ledger We synced with @henrily-ledger and went back to 1 :)
2025-04-01T04:10:36.404798
2024-04-26T11:01:37
2265543886
{ "authors": [ "abonnaudet-ledger", "codecov-commenter" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14792", "repo": "LedgerHQ/ledger-secure-sdk", "url": "https://github.com/LedgerHQ/ledger-secure-sdk/pull/635" }
gharchive/pull-request
Flex: Fix HTA double segment issue https://ledgerhq.atlassian.net/browse/FWEO-1128 Description When fast mode is enabled, the first "hold to approve" segments were displayed almost at the same time. One at touch, the other on the next ticker event. This commit removes the first segment that is displayed at touch. Changes include [x] Bugfix (non-breaking change that solves an issue) [ ] New feature (non-breaking change that adds functionality) [ ] Breaking change (change that is not backwards-compatible and/or changes current functionality) [ ] Tests [ ] Documentation [ ] Other (for changes that might not fit in any category) Codecov Report All modified and coverable lines are covered by tests :white_check_mark: Project coverage is 59.89%. Comparing base (a3b340f) to head (a34447f). Additional details and impacted files
@@           Coverage Diff           @@
##           master     #635   +/-   ##
=======================================
  Coverage   59.89%   59.89%
=======================================
  Files          12       12
  Lines        1688     1688
=======================================
  Hits         1011     1011
  Misses        677      677
Flag: unittests — Coverage Δ: 59.89% <ø> (ø)
Flags with carried forward coverage won't be shown. Click here to find out more. :umbrella: View full report in Codecov by Sentry. :loudspeaker: Have feedback on the report? Share it here.
2025-04-01T04:10:36.407352
2023-01-01T12:41:10
1515538346
{ "authors": [ "OneMonkeyArmy" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14793", "repo": "LedgerHQ/nanos-nonsecure-firmware", "url": "https://github.com/LedgerHQ/nanos-nonsecure-firmware/issues/1" }
gharchive/issue
Old maybe, but full of stuff to try and learn... I have no idea what IDE or system I should use to compile this old firmware for the nano... Hope someone will tell me what to do... I am not a beginner, but without any clue, I can be going in circles for 3 days... Note: I am already connected to an old Nano S, thru SWDIO / SWCLK / GND.... I have successfully DUMPED all its content (bootloader, firmware, magic strings.. etc). Deleted all 32 sectors... Verified that it was really all 0xFF in the whole 32K of flash... And successfully reflashed the whole stuff, and rebooted the device, like I never did it... I have also worked like a MONKEY, with IDA PRO, to disassemble the bootloader, and remove its limitations like the "foodbabe" bootsector .. 0xB0075EC7 .... I do that just to learn, no EVIL intentions behind my work.. or maybe helping you at the end, by showing you what I did to succeed, and you will improve the security of new firmware...
2025-04-01T04:10:36.497937
2024-04-25T18:04:42
2264195117
{ "authors": [ "SSUPII", "kirb" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14799", "repo": "LegacyUpdate/LegacyUpdate", "url": "https://github.com/LegacyUpdate/LegacyUpdate/issues/242" }
gharchive/issue
STATUS_ILLEGAL_INSTRUCTION when running the setup binary on Cyrix hardware Setup is crashing with STATUS_ILLEGAL_INSTRUCTION (0xC000001D) as soon as it is opened on Windows XP Professional SP3 x86. To make sure the file wasn't corrupted, I downloaded it twice and got the same error. Hardware: Freeway FW-6400GX VIA Cyrix III 450MHz (Samuel) 256MB RAM S3 Trio32 Thanks for reporting! This is a duplicate of #220 so closing in favor of that. For now, you can use Legacy Update 1.7.1.
2025-04-01T04:10:36.513441
2022-08-06T03:33:09
1330627060
{ "authors": [ "fido2020", "peggy-48" ], "license": "BSD-2-Clause", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14800", "repo": "LemonOSProject/LemonOS", "url": "https://github.com/LemonOSProject/LemonOS/issues/43" }
gharchive/issue
Reclaimable memory for cache The filesystem block cache should be reclaimed when memory is low and hence should not count towards used memory It should be counted because it's actually used though.
2025-04-01T04:10:36.516552
2023-05-31T21:51:37
1735082117
{ "authors": [ "LennartHennigs", "tkleiner" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14801", "repo": "LennartHennigs/SimpleFSM", "url": "https://github.com/LennartHennigs/SimpleFSM/issues/4" }
gharchive/issue
Compile error with Button2::CallbackFunction Nice little library! Today I started with the MixedTransitions.ino example and ran into the following compile error. Cannot initialize a parameter of type 'Button2::CallbackFunction' (aka 'void (*)(Button2 &)') with an lvalue of type 'void (Button2)': type mismatch at 1st parameter ('Button2 &' vs 'Button2') The problem is in this line: https://github.com/LennartHennigs/SimpleFSM/blob/a75124579350a553cec431d1bc977f060928ce40/examples/MixedTransitions/MixedTransitions.ino#L69 I have looked into the examples from Button2, e.g. SingleButton.ino, and I think this line should be void button_handler(Button2& btn) { The same issue is in SimpleTransitionWithButton.ino Thank you. Hey, thanks for reaching out. Updated both examples! Thanks. Cheers l.
2025-04-01T04:10:36.561949
2016-02-25T11:38:21
136362996
{ "authors": [ "codecov-io", "d-rivers" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14802", "repo": "LiamKenneth/MudDungeonJS", "url": "https://github.com/LiamKenneth/MudDungeonJS/pull/35" }
gharchive/pull-request
set up Airbnb ESLint config no files changed Current coverage is 14.62% Merging #35 into master will decrease coverage by -0.03% as of 1098c80
@@            master     #35   diff @@
======================================
  Files           27      27
  Stmts          894     896     +2
  Branches         0       0
  Methods          0       0
======================================
  Hit            131     131
  Partial          0       0
- Missed         763     765     +2
Review entire Coverage Diff as of 1098c80 Powered by Codecov. Updated on successful CI builds.
2025-04-01T04:10:36.563083
2018-08-28T10:13:09
354662545
{ "authors": [ "sk1p", "uellue" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14803", "repo": "LiberTEM/LiberTEM", "url": "https://github.com/LiberTEM/LiberTEM/issues/87" }
gharchive/issue
Show coordinates of frame picker in GUI To combine work with Simple API and GUI, one sometimes has to find out the exact coordinates of a specific frame. The "Pick" mode works nicely to show the frame. Below the picking window, the GUI should show the coordinates of the frame that is picked. While working on the frame picker: it should also position the marker into the center of the picked pixel. Right now, it snaps to the top left.
2025-04-01T04:10:36.568677
2022-05-18T23:43:33
1240739237
{ "authors": [ "dopatraman", "enddynayn", "shannonwells", "wilwade" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14804", "repo": "LibertyDSNP/frequency", "url": "https://github.com/LibertyDSNP/frequency/issues/97" }
gharchive/issue
Test mempool and message size when submitting lengthy parameter values Issues: In storage we are storing a BoundedVec and right now that bound (SchemaMaxBytesBoundedVecLimit) is set in a Config and also used by the schema parameter in register_schema Additionally there is a GovernanceSchemaMaxBytes in storage which can be set and is used in the extrinsic. Questions to answer: Do we really need both values? Yes What happens when a parameter for an extrinsic is a boundedvec and you attempt to register a schema that is larger than SchemaMaxBytesBoundedVecLimit? Does this message get gossiped? Does it panic? Is it truncated? What's the effect on the node/network of attempting to register a lot of very long schemas? We can ask Parity if this is a concern we should take into account. Results of the investigation: Methods: I tried to register schemas of various lengths: 65_000, 493_121, 1_000_000, and all were barred from entering the transaction queue. I ran batches of 1000 messages each, and all were prevented from entering the network with no apparent adverse effects on the network itself. 1. Does this message get gossiped? It appears as if it does not get gossiped. Messages do not seem to enter the transaction queue. 2. Does it panic? Yes. The error that gets posted is this: panicked at 'Bad input data provided to validate_transaction: Codec error', /frequency/runtime/frequency-rococo/src/lib.rs:799:1 3. Is it truncated? It seems as if the message is barred outright. I did not observe any truncation. Great research!
2025-04-01T04:10:36.572425
2018-12-02T15:27:39
386569851
{ "authors": [ "cmacdonell", "jezcope", "libcce" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14805", "repo": "LibraryCarpentry/librarycarpentry.github.io", "url": "https://github.com/LibraryCarpentry/librarycarpentry.github.io/issues/6" }
gharchive/issue
RSS and Atom displaying correctly? Hi @cmacdonell are the RSS and Atom feeds displaying correctly? I switched the URLs over to librarycarpentry.org. Atom is working for me in Chrome. I receive a blank page in Chrome when I visit the RSS link but when you view the underlying code, you can see the RSS. Here are the two links https://librarycarpentry.org/atom.xml and https://librarycarpentry.org/feed.xml. Others are welcome to respond if they are having issues and/or provide a fix 😃 Hi @libcce, I haven't had a chance to look yet. Interestingly both Atom and RSS works for me in Chrome. Neither seem to work in Firefox, I get ugly HTML in both cases. @libcce They both display the latest articles for me (Firefox). Have you got a cached version of something maybe? That's good to hear @jezcope! @cmacdonell have you tried clearing your cache? It works for me in Firefox as well. Seems like Chrome is fine as well. @jezcope can you check Chrome as well to see if it is working on your end? I've tried Firefox on another computer and it works fine. I didn't investigate whether it was the cache or not, but I'd suggest we leave this for now. Thanks @cmacdonell! Finally closing this issue (I thought I had already)!
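A quick way to confirm that both feeds are structurally valid, independent of how any particular browser chooses to render raw XML, is to parse them with a feed library. A small sketch using the third-party feedparser package (not something the thread itself used):

```python
import feedparser  # third-party: pip install feedparser

for url in ("https://librarycarpentry.org/atom.xml",
            "https://librarycarpentry.org/feed.xml"):
    feed = feedparser.parse(url)
    # bozo is set when the XML could not be parsed cleanly;
    # entries is empty when no items were recognized.
    status = "malformed" if feed.bozo else "ok"
    print(f"{url}: {status}, {len(feed.entries)} entries")
```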
2025-04-01T04:10:36.591923
2020-10-30T00:48:54
732798694
{ "authors": [ "HBadertscher", "dbrgn" ], "license": "CC0-1.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14806", "repo": "LibrePCB-Libraries/Microchip.lplib", "url": "https://github.com/LibrePCB-Libraries/Microchip.lplib/pull/16" }
gharchive/pull-request
PIC10(L)F32x(T)-?/MC: Fix error in description The package is DFN-8, not DIP-8! @HBadertscher can you confirm? Hi, oh yes this was probably a copy-paste mistake, sorry about that! All /MC should be DFN, and all /P should be DIP. Small side note: the error is in the PIC10F204/6-?/MC and not the PIC10(L)F32x(T)-?/MC, right?
2025-04-01T04:10:36.610085
2022-08-15T22:23:35
1339581444
{ "authors": [ "AbdealiLoKo", "Drowze", "com6056", "prologic", "rschmied", "tnarik" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14807", "repo": "Librecov/librecov", "url": "https://github.com/Librecov/librecov/issues/203" }
gharchive/issue
Updated docs? Found this project after exploring https://github.com/danhper/opencov, awesome you are keeping it going! Are there any updated docs for getting a deployment of this going though? Seems like quite a few changes have been made but no updates to the README. Right now I'm struggling to get the initial setup done since mix isn't in the Docker image so I can't run mix ecto.setup. Thanks! In addition, running via docker is not possible as the image is unavailable at the moment. I've pushed a Docker build of this to my Docker Hub account at prologic/librecov @prologic I'm actively looking at this as well -- would be interested in a solution. My goal is to run this on a local docker instance for our little project. Need to run standalone, HA is not really an issue. I got the containers via compose up but it never listens on anything that looks consumable... Inside of the librecov container I get processes running, but there's no process listening on tcp/4000 or tcp/80. This is all new territory for me (especially the erlang piece), so i don't even know where to start troubleshooting... @rschmied Unfortunately I failed. I was not able to get this software running, or I think I got it running but it is quite bad to use and doesn't even work. 😢 thanks, @prologic -- i wonder if @yknx4 could share some insight. obviously, there is some interest to get this off the ground and a little more clarity on the documentation would certainly help me and others! I also found this project looking at opencov Wondering if anyone has been able to get it working recently ... This is such a nice project - considering learning Elixir because of this 👀
2025-04-01T04:10:36.618687
2022-08-11T00:19:23
1335335300
{ "authors": [ "Borda", "zolekode" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14808", "repo": "Lightning-AI/lightning-transformers", "url": "https://github.com/Lightning-AI/lightning-transformers/issues/282" }
gharchive/issue
How to fix ModuleNotFoundError: No module named 'habana_frameworks.torch' โ“ Questions and Help I am getting this error after following the instructions on installing habana for pytorch and running my script. OS: Ubuntu 20.04 Packaging: Pip Anyone has an idea on how to resolve it? Hi, I believe you should be using the Habana docker image where all needed libraries shall be already prepared for you: vault.habana.ai/gaudi-docker/1.5.0/ubuntu20.04/habanalabs/pytorch-installer-1.11.0:latest
2025-04-01T04:10:36.622587
2022-07-01T09:19:30
1291147460
{ "authors": [ "justusschock", "lanlanlan3", "rohitgr7" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14809", "repo": "Lightning-AI/lightning", "url": "https://github.com/Lightning-AI/lightning/issues/13484" }
gharchive/issue
load_from_checkpoint has different validation results When saving the ckpt, it has some metrics like val_acc=0.77, but after load_from_checkpoint and trainer.validate(model), I got val_acc=0.02. code:
from main_pl import LitModel
ckpt_path = '...val_acc=0.77.ckpt'
model = LitModel.load_from_checkpoint(ckpt_path)
'''model.eval()'''
trainer = Trainer(gpus=-1)
'''or trainer = Trainer(gpus=-1, resume_from_checkpoint=ckpt_path)'''
trainer.validate(model)  # val_acc=0.02
env: mac m1 python 3.9 and linux python 3.10, torch 1.11 cc @awaelchli @ananthsub @ninginthecloud @rohitgr7 @otaj @carmocca @jjenniferdai Hey, thanks for the issue. Can you give us your model definition or a minimum example of the boring model to reproduce this? please consider opening either a discussion or an issue. Answered here. We can continue the discussion there. Closing this for now. Thanks @rohitgr7 didn't see this
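Independent of whatever the linked discussion concluded, one quick way to rule out a silent weight-loading problem is to compare the checkpoint's stored tensors against the instantiated model. A hedged sketch — the module and checkpoint names are taken from the report above, everything else is generic PyTorch:

```python
import torch
from main_pl import LitModel  # module name as used in the report

ckpt_path = '...val_acc=0.77.ckpt'
model = LitModel.load_from_checkpoint(ckpt_path)

# Every tensor in the checkpoint's state_dict should match the loaded module.
ckpt_state = torch.load(ckpt_path, map_location="cpu")["state_dict"]
model_state = model.state_dict()
mismatched = [
    name for name, tensor in ckpt_state.items()
    if name in model_state and not torch.equal(tensor, model_state[name].cpu())
]
missing = [name for name in ckpt_state if name not in model_state]
print("tensors that differ after loading:", mismatched or "none")
print("checkpoint keys absent from the model:", missing or "none")
```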
2025-04-01T04:10:36.626026
2024-07-11T15:33:17
2403495746
{ "authors": [ "bhargavyagnik", "rasbt" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14810", "repo": "Lightning-AI/litgpt", "url": "https://github.com/Lightning-AI/litgpt/pull/1574" }
gharchive/pull-request
✅ Issue #1557: Fixed warning during model download and conversion. Resolved Issue #1557 posted by @Andrei-Aksionov regarding warnings from deprecated codebase. Removed deprecated params in scripts/download.py Passed Storage obj instead of ptr in utils.py Tests: ✅ test_convert_hf_checkpoint.py ✅ test_convert_lit_checkpoint.py ✅ test_convert_pretrained_checkpoint.py Also tested manually downloading the model for warnings. Thanks for the PR! What if we just check the pytorch version instead? I added a small modification for this. And in the future we can then also remove this when it's supported on macOS. Great, seems to pass now! Thanks again for the contribution!
2025-04-01T04:10:36.638237
2015-08-28T09:57:33
103688896
{ "authors": [ "efotopoulou", "lfarid" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14812", "repo": "LinDA-tools/LindaWorkbench", "url": "https://github.com/LinDA-tools/LindaWorkbench/issues/197" }
gharchive/issue
Able to add as string At step 4, sometimes the vocabularies the end user wants to use are not retrievable, but still the end user may feel obligated to use specific vocabularies in order to achieve an interlinking he has in mind. It would be really practical to be able to add a URL as a string in order to have the desired result. For example, add http://purl.org/linked-data/sdmx/2009/dimension/refPeriod as a simple string. yes, we have this on the list of todo's. see issue https://github.com/LinDA-tools/transformation/issues/9. this will allow the user to manually enter a predicate's URI (also for reconciling entities and for classes).
2025-04-01T04:10:36.673180
2024-09-26T16:47:50
2551088160
{ "authors": [ "Qluxzz", "joadan" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14816", "repo": "Linq2GraphQL/Linq2GraphQL.Client", "url": "https://github.com/Linq2GraphQL/Linq2GraphQL.Client/pull/43" }
gharchive/pull-request
Add description as doc comments if available Add description as <summary> doc comment to class/field if available. I used dotnet-t4 to compile the template since I'm on linux, but there seems to be small differences between that and what Visual Studio uses to compile the template, so you might want to recompile them Thank you, I re-generated the template and pushed a new version to nuget
2025-04-01T04:10:36.677496
2022-05-22T19:47:49
1244349186
{ "authors": [ "LinusEkstrom", "ethanschofer", "vansterhant" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14817", "repo": "LinusEkstrom/AddOn.Episerver.Settings", "url": "https://github.com/LinusEkstrom/AddOn.Episerver.Settings/issues/34" }
gharchive/issue
Sidebar Gadget does not appear After installing the NuGet package, the sidebar gadget does not appear. Is this a config step I am missing? This is AMS version 12.6. I'm not sure what other packages might affect this? No, there is no further configuration needed. The gadget should appear automatically in the group with blocks and media. (screenshot from a vanilla CMS 12.6 alloy template site) Gadgets are only plugged in for areas (aka a sidebar) that have not been customized by the user (aka editor). Try resetting your views in your editorial settings and see if it appears then or if you still have a problem with this. Was this resolved @ethanschofer? I'm not sure. I rolled my own solution. I do have a much better understanding of plug-ins now, so it's possible I could make it work? But I haven't tried Ethan Ok, closing this issue then for now.
2025-04-01T04:10:36.687637
2018-08-15T15:23:36
350852175
{ "authors": [ "andrewyatz", "jonchang" ], "license": "bsd-2-clause", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14818", "repo": "Linuxbrew/brew", "url": "https://github.com/Linuxbrew/brew/issues/811" }
gharchive/issue
Linuxbrew no longer uses a bottle when installing binutils Not sure if this has been raised before (please slap me down otherwise). Recently I have noticed that binutils attempts to recompile itself on my Linuxbrew installations. I do not install to the ~linuxbrew/.brew location and am using a non-standard root. However I am sure that binutils could be installed using a relocatable bottle. Has anything changed recently in brew to stop this feature? You can try brew install --force-bottle binutils Hi. That does work and brew doesn't complain. However it's really odd that brew now chooses not to use the binutils bottle. Yes, it's a known issue, Homebrew could be better at detecting whether bottles could be relocated. Actually, binutils is a special case, check out https://github.com/Linuxbrew/brew/wiki/Build-a-portable-bottle Fixed in 5041f9e7a6 Fantastic thank you :)
2025-04-01T04:10:36.689738
2019-01-01T09:45:03
395069543
{ "authors": [ "maxim-belkin", "sjackman" ], "license": "bsd-2-clause", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14819", "repo": "Linuxbrew/docker", "url": "https://github.com/Linuxbrew/docker/pull/46" }
gharchive/pull-request
CentOS 7: install 'file' CentOS 7 image is missing the file command which is needed for installing bottles. Checked: CentOS 6 has file Alternative is to HOMEBREW_BUILD_FROM_SOURCE=1 HOMEBREW_NO_AUTO_UPDATE=1 brew install file-formula what is used in CentOS 5, so maybe that one would be better... ? I prefer the solution of installing file using the host package manager. Thanks for catching this error, Maxim!
2025-04-01T04:10:36.694649
2023-09-27T09:03:18
1915040471
{ "authors": [ "Linwei-Chen", "Mitsualfa" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14820", "repo": "Linwei-Chen/LIS", "url": "https://github.com/Linwei-Chen/LIS/issues/3" }
gharchive/issue
About PyTorch version I tried to reproduce your work, but lower PyTorch/CUDA versions do not support the compute capability of the RTX 3090 graphics card, while with higher PyTorch versions it is difficult to install an mmcv-full build (1.3.8~1.4.0 is required) that is compatible with mmdet. I would like to ask which versions of Python/CUDA/mmcv you used? torch 1.7.1, this site may help: https://mmcv.readthedocs.io/en/v1.4.1/get_started/installation.html
2025-04-01T04:10:36.876579
2021-05-08T00:39:17
880005595
{ "authors": [ "scala-steward" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14822", "repo": "LolHens/http4s-errors", "url": "https://github.com/LolHens/http4s-errors/pull/26" }
gharchive/pull-request
Update munit-tagless-final to 0.1.2 Updates de.lolhens:munit-tagless-final from 0.0.1 to 0.1.2. GitHub Release Notes - Version Diff I'll automatically update this PR to resolve conflicts as long as you don't change it yourself. If you'd like to skip this version, you can just close this PR. If you have any feedback, just mention me in the comments below. Configure Scala Steward for your repository with a .scala-steward.conf file. Have a fantastic day writing Scala! Files still referring to the old version number The following files still refer to the old version number (0.0.1). You might want to review and update them manually. build.sbt Ignore future updates Add this to your .scala-steward.conf file to ignore future updates of this dependency: updates.ignore = [ { groupId = "de.lolhens", artifactId = "munit-tagless-final" } ] labels: test-library-update, semver-minor, old-version-remains Superseded by #33.
2025-04-01T04:10:36.877747
2024-01-26T09:59:17
2101923505
{ "authors": [ "MarGraz", "Piedone" ], "license": "BSD-3-Clause", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14823", "repo": "Lombiq/NodeJs-Extensions", "url": "https://github.com/Lombiq/NodeJs-Extensions/pull/93" }
gharchive/pull-request
Update NVM SetupWindows.md Added the next step in the "Setting up Node.js with NVM for Windows" section, because the next section's title, "Additional configuration", could appear to be optional to a new Orchard user. Thank you!
2025-04-01T04:10:37.012345
2021-09-20T18:40:48
1001278555
{ "authors": [ "Louisvdw", "dakoal", "riegler-econsulting" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14827", "repo": "Louisvdw/dbus-serialbattery", "url": "https://github.com/Louisvdw/dbus-serialbattery/issues/49" }
gharchive/issue
DalyBMS 8cell: wrong min Volt value The values on a Raspberry Pi v2.73 with (since?) v0.8beta1 are not the same as in the PC Windows app with the same USB-UART cable. (With v0.8 the BMS is not running.) The two screenshots were taken only seconds apart. For days the min/max difference has been about 0.008 V according to v0.8beta1, and about 0.003 V according to the PC/Android app. The min Volt seems to be too low. additional info: There are two things I can think of that can influence this on the Daly. The Daly uses separate commands for each value you want, but the baud rate (speed) is actually low for the overhead. So while for most BMS we retrieve data every second, for the Daly we can only update every 2 seconds or else the data from the previous request has not finished yet. So this 2 second delay could be influencing the values that you see. The other thing is that the Daly has a command to return just the min/max cell voltages (so not all the cells) and this is what we currently use in the driver. I think this is most likely the reason for the difference as I don't know how fast the min/max values are updated against the cell values inside the BMS. I'll look at making a test version with the individual cell voltages that you can check against. I am also waiting for the individual cell voltages.
2025-04-01T04:10:37.021010
2023-11-29T20:25:52
2017362058
{ "authors": [ "absidue", "dnicolson" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14828", "repo": "LuanRT/YouTube.js", "url": "https://github.com/LuanRT/YouTube.js/pull/546" }
gharchive/pull-request
Fix array indexes in generated classes Before: InnertubeError: DecoratedAvatarView not found! This is a bug, want to help us fix it? Follow the instructions at https://github.com/LuanRT/YouTube.js/blob/main/docs/updating-the-parser.md or report it at https://github.com/LuanRT/YouTube.js/issues! Introspected and JIT generated this class in the meantime: class DecoratedAvatarView extends YTNode { ... sources: { 0: { url: data.avatar.avatarViewModel.image.sources.0.url, width: data.avatar.avatarViewModel.image.sources.0.width, height: data.avatar.avatarViewModel.image.sources.0.height }, After: sources: { 0: { url: data.avatar.avatarViewModel.image.sources[0].url, width: data.avatar.avatarViewModel.image.sources[0].width, height: data.avatar.avatarViewModel.image.sources[0].height This doesn't fully fix the issue, as sources is created as an object instead of array. It's probably better if you add proper support for arrays, instead of abusing the fallback handlers.
2025-04-01T04:10:37.045925
2015-06-09T11:40:55
86567690
{ "authors": [ "LukeDeighton", "athulcek" ], "license": "apache-2.0", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14829", "repo": "LukeDeighton/WheelView", "url": "https://github.com/LukeDeighton/WheelView/issues/13" }
gharchive/issue
layout size cannot be set The layout is taking half of the screen. I used these properties but I was not able to resize the height of the whole view: android:layout_width="match_parent" android:layout_height="wrap_content" android:background="#fff" app:repeatItems="true" app:rotatableWheelDrawable="true" app:selectionAngle="90.0" app:selectionPadding="4dp" app:wheelColor="#fff" app:wheelItemCount="10" app:wheelItemRadius="40dp" app:wheelPadding="13dp" app:wheelPosition="bottom" app:wheelRadius="276dp" I've added view measurements now so using "wrap_content" should work on the latest commit. Try it and let me know if it solves your problem.
2025-04-01T04:10:37.052517
2021-03-02T09:59:25
819852821
{ "authors": [ "LukeMathWalker", "bbros-dev", "gihrig" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14830", "repo": "LukeMathWalker/zero-to-production", "url": "https://github.com/LukeMathWalker/zero-to-production/issues/57" }
gharchive/issue
Chapter 1: Rust format defaults I found it handy to add the Rust format defaults to the project's rustfmt.toml: rustfmt --print-config default rustfmt.toml Not sure where this would be best placed. What is the value of having a rustfmt.toml if you are not customising the configuration though? 🤔 Just adding to the conversation... Before switching from PHP, Java, JS to Rust as my main language about a year ago, I was into tweaking format style to the nth degree. Not having that ability in Rust (or not knowing about it), I have accepted the value of all-inclusive standards for trivial things, like style, and accepted that I have plenty to learn about Rust. I don't need to spend time on trivial details the compiler will dispense with anyway. The hardest things for me were required semicolons everywhere, accept return values, and 4 space indents, thankfully it's 4 spaces not tabs ;-). Over the past year plus, I remain sufficiently challenged by Rust details, that this trivia just doesn't register any more. Having a few more mental cpu cycles to spend on the important stuff has become much appreciated. I have not missed .eslintignore, .eslintrc.js, `.editorconfig`, etc., not much anyway. My understanding is Z2P is aimed at people new to Rust, and at least people who are going through the Rust book or Rust by Example. What is the value of having a rustfmt.toml if you are not customising the configuration though? This addresses the premise behind your question. * How do you know (with minimal effort) the extent to which rustfmt defaults can be configured? * How do you know (with minimal effort) what rustfmt imposes? This cuts out a lot of experimentation - just scanning that list gives a good overview. @gihrig makes a valid point - leave it alone. However, some orgs have coding styles and standards that are entrenched (i.e. strong resistance to change). It helps to let people know to what extent rustfmt can accommodate their existing conventions or requirements. Or do they need to look for another crate? I 100% agree that people should know that rustfmt is customisable - that's why in that very section we link to https://github.com/rust-lang/rustfmt#configuring-rustfmt On the flip side, @gihrig is right: the Rust community encourages sticking with the default configuration and avoiding tweaks. Adding a default rustfmt.toml file gives the impression that each Rust project should have one as a best practice, which is not the case.
2025-04-01T04:10:37.098253
2024-09-25T20:15:04
2548916958
{ "authors": [ "bradgrantham-lunarg", "davidd-lunarg" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14831", "repo": "LunarG/gfxreconstruct", "url": "https://github.com/LunarG/gfxreconstruct/issues/1761" }
gharchive/issue
Vulkan dump resources: don't exit() process VulkanReplayDumpResourcesBase::QueueSubmit calls exit in some failure cases. Try to find a way to avoid the exit call and continue (e.g., return/log an error, throw an exception, etc). Fixed in #1852
2025-04-01T04:10:37.112462
2024-03-17T23:59:21
2190958960
{ "authors": [ "Mycheze", "jzohrab", "yannayp" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14832", "repo": "LuteOrg/lute-v3", "url": "https://github.com/LuteOrg/lute-v3/issues/348" }
gharchive/issue
Option to not have a pop-up for well known terms Is your feature request related to a problem? Please describe. I've been using Lute to learn Portuguese from a very early stage of my journey with the language. Many basic words are now marked as Well Known, but they keep popping up when hovering over a text, which can be distracting and ultimately unnecessary (as they are well known). Describe the solution you'd like An ideal solution would be an option to turn off pop-ups when hovering over Well Known (and Ignored) terms. Of course, it should be optional and be off by default, probably. Describe alternatives you've considered Maybe this feature should be built-in for the "Focus mode"? Additional context Silly question, but could you just ... not hover over them? :-) I understand what you're saying, just interested in your motivation for the question. Thanks for the issue! Yeah, of course, I try not to hover too much. It's just a habit that a lot of people have when reading on the computer, to follow along with the cursor. And I think it makes sense as a feature, but obviously not top priority. I second this option. I think it'd be a good idea for it to be hidden in the settings, since for most people, default behavior is good. I think that there should just be a delay option for each learning level. When I hover over a 1-2, I want to see it right away, but with a 3 or 4, I sometimes hover over without really meaning it and see the word instantly when I'd rather think for a second. For ignored words, I probably always want to be able to see them (if there's anything there) and then known words I would want like a 3 second delay before seeing the translation. Yannayp is right that it's just a habit to follow along with what I'm reading. And a little delay would mean that I could do that without switching my brain into "just read the English" mode. I think that there should just be a delay option for each learning level. Avoid word "just" because it trivializes how tough things can be to do! :-) (insert El Risitas picture here) But it's a nice idea. I hear you on the switching brain thing. For an initial step I think a setting to say "only show the popup for this status or lower" would do. Oh yeah, I try to remove "just" wherever I can. In this case, it was more in response to the other option of having it not appear. I don't think that's needed, "just" a delay would be fantastic. But my post would have been just fine without the word just ๐Ÿ™ Thanks for the reminder!
2025-04-01T04:10:37.136813
2020-02-17T12:35:56
566271300
{ "authors": [ "thvhauwe" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14840", "repo": "Lyceum/LyceumBase.jl", "url": "https://github.com/Lyceum/LyceumBase.jl/issues/28" }
gharchive/issue
Cannot find name corresponding to UUID Following the initial setup at https://docs.lyceum.ml/dev/, I'm running into an issue trying to install any Lyceum package (v1.3) pkg> add LyceumBase Updating registry at `~/.julia/registries/LyceumRegistry` Updating git-repo `https://github.com/Lyceum/LyceumRegistry` Resolving package versions... ERROR: cannot find name corresponding to UUID 9da8a3cd-07a3-59c0-a743-3fdc52c30d11 in a registry (v1.3) pkg> add Lyceum Resolving package versions... ERROR: cannot find name corresponding to UUID 189a3867-3050-52da-a836-e630ba90ab69 in a registry (v1.3) pkg> add LyceumMuJoCo Resolving package versions... ERROR: cannot find name corresponding to UUID 39de3d68-74b9-583c-8d2d-e117c070f3a9 in a registry This is in a freshly installed Julia 1.3.1 REPL. OS: NAME="Ubuntu" VERSION="16.04.6 LTS (Xenial Xerus)" ID=ubuntu ID_LIKE=debian PRETTY_NAME="Ubuntu 16.04.6 LTS" VERSION_ID="16.04" HOME_URL="http://www.ubuntu.com/" SUPPORT_URL="http://help.ubuntu.com/" BUG_REPORT_URL="http://bugs.launchpad.net/ubuntu/" VERSION_CODENAME=xenial UBUNTU_CODENAME=xenial I've tracked down this bug with pkg. Apparently, it's no good to add custom registries to a freshly installed Julia. Resolved by adding the default registry before adding Lyceum. Reference: https://forum.mimiframework.org/t/error-installing-mimi-under-v1-3-1/109/3
2025-04-01T04:10:37.139063
2019-04-14T10:57:58
432968204
{ "authors": [ "ildyria", "neurosatan" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14841", "repo": "LycheeOrg/Lychee-Laravel", "url": "https://github.com/LycheeOrg/Lychee-Laravel/issues/199" }
gharchive/issue
Feature request: SQLite Why not add SQLite as an option? I keep just a photo gallery and would rather not use the more cumbersome MySQL. P.S. Sorry for my bad English. Lychee Laravel is database agnostic and you can use SQLite with it. You just need to specify this in the .env. Worth reading: https://github.com/LycheeOrg/Lychee-Laravel/blob/master/config/database.php#L36
2025-04-01T04:10:37.180270
2023-04-05T13:55:24
1655668767
{ "authors": [ "quentinovega" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14845", "repo": "MAIF/daikoku", "url": "https://github.com/MAIF/daikoku/issues/539" }
gharchive/issue
plans can be filtered by a JS script Add props to the HTML element to easily filter plans (like "data-plan-name") FIX by c556ad7898ffe7e252de36b2a9e61743bfdd85df
2025-04-01T04:10:37.571927
2023-01-05T18:18:23
1521179203
{ "authors": [ "chdev77", "maximilianobaez" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14846", "repo": "MGenteluci/cloudformation-deploy-action", "url": "https://github.com/MGenteluci/cloudformation-deploy-action/issues/10" }
gharchive/issue
mkdir: can't create directory '/github/home/.aws': File exists I added your action to my CI with the command: - name: Deploy to AWS via CloudFormation uses<EMAIL_ADDRESS> env: TEMPLATE: ${{ env.AWS_TEMPLATE_FILE_DEV }} AWS_STACK_NAME: ${{ env.AWS_STACK_NAME }} AWS_REGION: ${{ env.AWS_REGION }} AWS_ACCESS_KEY_ID: ${{secrets.AWS_ACCESS_KEY_ID}} AWS_SECRET_ACCESS_KEY: ${{secrets.AWS_SECRET_ACCESS_KEY}} AWS_DEPLOY_BUCKET: ${{ env.AWS_S3_BUCKET }} I get the error after the first execution: mkdir: can't create directory '/github/home/.aws': File exists This is the complete error: runner/_work/_temp/_runner_file_commands":"/github/file_commands" -v "/opt/devops/actions-runner/_work/lambda-test/lambda-test":"/github/workspace" 35b096:080abb3e763f4f4c8170952287575800 mkdir: cannot create directory '/github/home/.aws': File exists Same issue here... mkdir: cannot create directory '/github/home/.aws': File exists
2025-04-01T04:10:37.585368
2018-09-27T12:21:23
364449263
{ "authors": [ "FabianIsensee", "yianzhongguo" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14847", "repo": "MIC-DKFZ/BraTS2017", "url": "https://github.com/MIC-DKFZ/BraTS2017/issues/9" }
gharchive/issue
What is the brain_mask @FabianIsensee Hi, Isensee. I am reading your code and have learned a lot, but I am not clear about the line "brain_mask= (t1_img != t1_img[0, 0, 0]) & (t1c_img != t1c_img[0, 0, 0]) & (t2_img != t2_img[0, 0, 0]) & ( flair_img != flair_img[0, 0, 0])". What is the brain_mask? Thank you! Hi, the brain mask is just the mask of all nonzero voxels. The BraTS data is brain extracted, meaning that the outside is all zeros. I am using that brain mask to do the normalization within the brain region only, so the mean and standard deviation are not skewed by the background. @FabianIsensee Thank you! I have checked the original BraTS 2017 images. I understand it now: image[0,0,0] is 0, just like you said.
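Not part of the original thread: a short NumPy sketch of the masked normalization the reply describes, i.e. z-scoring a modality with statistics taken from brain voxels only so the zero background does not skew the mean and standard deviation. The function and variable names here are illustrative and are not the repository's actual code.

import numpy as np

def normalize_within_brain(img, brain_mask):
    """Z-score one MRI volume using statistics from brain voxels only."""
    brain_voxels = img[brain_mask]
    mean = brain_voxels.mean()
    std = brain_voxels.std()
    normalized = (img - mean) / std
    normalized[~brain_mask] = 0  # keep the background at a constant value
    return normalized

# Toy example: the corner voxel value stands in for the zero background.
t1_img = np.random.rand(8, 8, 8) + 1.0
t1_img[:2] = t1_img[0, 0, 0]            # fake "outside the brain" region
brain_mask = t1_img != t1_img[0, 0, 0]  # same test as in the quoted line
t1_norm = normalize_within_brain(t1_img, brain_mask)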
2025-04-01T04:10:37.594960
2015-06-16T08:31:48
88656985
{ "authors": [ "bmeldal", "colin-combe" ], "license": "apache-2.0", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14848", "repo": "MICommunity/interaction-viewer", "url": "https://github.com/MICommunity/interaction-viewer/issues/67" }
gharchive/issue
labels in legend The 'small molecule' label should say 'bioactive entity' as the little triangle is used to represent both small molecules and polysaccharides, right? (And anything that had 'bioactive entity' as its type.) The 'your complex' label in the legend should probably give the EBI accession number for the complex? Yes, I think you are right! Sorry, didn't see this question until now... This is changed in the demo file for the interaction-viewer project (my branch), but not in the Complex Portal. The code that draws the legend in the Complex Portal is not part of the interaction-viewer (though it was a cut'n'paste from the demo file). I'm gonna close this issue because (strictly speaking) I think it's outside the scope of this project. Max @tschaka1904, I will log this in our temp bug file. Thanks for updating your branch, Colin. Birgit
2025-04-01T04:10:37.658762
2024-01-18T13:45:49
2088324379
{ "authors": [ "Cena980", "TasabeehAhmed" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14849", "repo": "MIT-Emerging-Talent/2024-group-05-collaboration-practice", "url": "https://github.com/MIT-Emerging-Talent/2024-group-05-collaboration-practice/issues/7" }
gharchive/issue
Challenge: is_palindrome_after_char_deletion A function that determines whether a given string can be turned into a palindrome by deleting at most one character. Well Done!!
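Not part of the original issue: a minimal Python sketch of one common way to solve the described challenge, a two-pointer scan that, on the first mismatch, tries skipping either the left or the right character. The function name follows the issue title; this is not the group's actual submission.

def is_palindrome_after_char_deletion(text):
    """Return True if text is a palindrome, or can become one by deleting at most one character."""

    def is_palindrome_range(left, right):
        while left < right:
            if text[left] != text[right]:
                return False
            left += 1
            right -= 1
        return True

    left, right = 0, len(text) - 1
    while left < right:
        if text[left] != text[right]:
            # First mismatch: try deleting either the left or the right character.
            return is_palindrome_range(left + 1, right) or is_palindrome_range(left, right - 1)
        left += 1
        right -= 1
    return True  # already a palindrome

assert is_palindrome_after_char_deletion("racecar")   # already a palindrome
assert is_palindrome_after_char_deletion("abca")      # drop 'b' or 'c'
assert not is_palindrome_after_char_deletion("abc")   # would need two deletions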
2025-04-01T04:10:37.660253
2018-03-24T13:11:58
308261216
{ "authors": [ "MIfeanyi" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14850", "repo": "MIfeanyi/ariel", "url": "https://github.com/MIfeanyi/ariel/issues/7" }
gharchive/issue
Implement XML and JSON parsing This will most likely be implemented via a third-party library. A potential XML library is RapidXML. I believe ChaiScript has the ability to do this with from_JSON. More testing needed & review on IMGUI integration
2025-04-01T04:10:37.665441
2024-10-13T10:24:36
2583854587
{ "authors": [ "eshaalal", "vivekvardhan2810" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14851", "repo": "ML-Fusion-Lab/ML-Fusion-Lab-Website", "url": "https://github.com/ML-Fusion-Lab/ML-Fusion-Lab-Website/issues/536" }
gharchive/issue
[Add toggling in contact us page]: Is your feature request related to a problem? Please describe. On other pages, toggling from dark to light is available; I would like to do the same here. Describe the solution you'd like. No response Describe alternatives you've considered. No response Additional context. No response Show us the magic with screenshots No response Checklist [X] I have checked the Existing Issues [X] I have read the Contributing Guidelines [X] I want to work on this issue. (optional) @eshaalal make sure to complete it within 2 days. sure @eshaalal from next time on, please also mention the issue number when creating the Pull Request. Yes, I will keep that in mind
2025-04-01T04:10:37.670183
2023-02-19T14:49:37
1590716056
{ "authors": [ "EthanMarx", "alecgunny" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14852", "repo": "ML4GW/BBHNet", "url": "https://github.com/ML4GW/BBHNet/pull/305" }
gharchive/pull-request
Parallelize timeslide waveforms Minimum working example of waveform parallelization as described in #298 [ ] Unit tests on deploy [x] Fix subfile getting written with indents [x] Concatenation of hdf5 files at the end - use condor_watch_q like pyomicron to wait for jobs to finish? [ ] Incorporation of new typeo arg will clean things up [x] Running into errors with condor due to how pinto modifies LD_LIBRARY_PATH Re this condor issue: /usr/lib64/libcondor_utils_9_0_17.so: undefined symbol: _ZN7classad7ClassAdC1Ev It looks like pre-pending the conda env's lib is confusing the linker. Prepending /usr/lib64 seems to be a bandaid solution to the problem, but I'm not sure if this will in turn mess with the torch/cuda shared libraries this solution was initially meant to address. Take a look at ML4GW/typeo#15 when you have a chance and let me know if the args option looks like it might help whatever it is you're building. Haven't thought as much about the nuts and bolts but figure something like this might help with what we're looking for This would definitely help clean up the condor file argument construction Think this is worth reviewing and merging at this point. Integration with the rest of the inference scripts can be done via other PRs before the inference-refactoring branch is merged into main. @alecgunny Tests are now failing here due to the latest pinto fix (i.e. looking for libs in the base conda env). Is there syntax for turning on the ld_lib_path appending from the command line?
2025-04-01T04:10:37.678542
2022-02-19T15:52:27
1144761757
{ "authors": [ "TheGreyDiamond", "klaernie" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14853", "repo": "MMM-CalendarExt2/MMM-CalendarExt2", "url": "https://github.com/MMM-CalendarExt2/MMM-CalendarExt2/pull/148" }
gharchive/pull-request
Removes unused dependency on base64 This removes the dependency on base64 Could you please take a look again, there were some changes to the general coding conventions, which now produces a confusing diff. I decided to rebase this change myself. This keeps the spirit of the request, but also incorporates the changes made in the meantime on our end.
2025-04-01T04:10:37.692319
2023-01-24T11:10:47
1554776675
{ "authors": [ "codecov-commenter", "fynsta" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14854", "repo": "MOTIS-Mitfahr-App/flutter-app", "url": "https://github.com/MOTIS-Mitfahr-App/flutter-app/pull/88" }
gharchive/pull-request
Fix factories @istzen and I noticed that the factory foreign key logic was not perfect yet. The following usecase would not work: Setting the related object to null via NullableParameter, but still wanting to set an id. For example: drive: NullableParameter(null), driveId: 1 Codecov Report Base: 19.37% // Head: 19.37% // No change to project coverage :thumbsup: Coverage data is based on head (3b8f871) compared to base (952ca19). Patch has no changes to coverable lines. Additional details and impacted files @@ Coverage Diff @@ ## main #88 +/- ## ======================================= Coverage 19.37% 19.37% ======================================= Files 83 83 Lines 4460 4460 ======================================= Hits 864 864 Misses 3596 3596 Help us with your feedback. Take ten seconds to tell us how you rate us. Have a feature suggestion? Share it here. :umbrella: View full report at Codecov. :loudspeaker: Do you have feedback about the report comment? Let us know in this issue.
2025-04-01T04:10:37.708465
2018-02-03T02:19:40
294074931
{ "authors": [ "smiba" ], "license": "apache-2.0", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14855", "repo": "MPOS/php-mpos", "url": "https://github.com/MPOS/php-mpos/pull/2668" }
gharchive/pull-request
CSRF token extension and minor fixes Been using this CSRF token solution for a while and it's been working great, no more "Token Expired" issues. Personally I have it at 60 minutes, but since it's at 2 minutes now I decided that 15 minutes is a pretty big leap already. Also fixed how "Est. Next Difficulty" gets shown for coins with a continuously changing difficulty on each block or when the Next Difficulty refresh happens at the next block. Will update this in a proper PR in a moment
2025-04-01T04:10:37.745959
2021-05-31T20:14:53
907696003
{ "authors": [ "NickolajA", "cfsnate" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14856", "repo": "MSEndpointMgr/ModernDriverManagement", "url": "https://github.com/MSEndpointMgr/ModernDriverManagement/pull/104" }
gharchive/pull-request
add MDMDeploymentType adds the ability to force the deployment type Internal/External to resolve issues when deploying after reboot with Always On Device VPN on OSD CGW offsite imaging I'm a little keen on understanding the scenario a little better for why this would be necessary. Would you please elaborate a bit more?
2025-04-01T04:10:37.749702
2015-03-20T12:45:26
63222040
{ "authors": [ "kasturiswain" ], "license": "apache-2.0", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14857", "repo": "MSOpenTech/msopentech-tools-for-intellij", "url": "https://github.com/MSOpenTech/msopentech-tools-for-intellij/issues/242" }
gharchive/issue
Plugin not showing the signed-in ID in the Create Storage Account window even though the user has signed in through AD Authentication Environment: Windows/IntelliJ14 Steps: Sign in to the Subscription through AAD Authentication. Click Azure. Right Click storage. Click Create Storage Account. The Create Storage Account window is not showing the signed-in ID. PFB the screenshot from the plugin. Story board. This issue is fixed and functionality is working as expected in the latest build.
2025-04-01T04:10:37.751713
2015-05-27T08:22:46
81344648
{ "authors": [ "EvgenyAgafonchikov", "mkostin" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14858", "repo": "MSOpenTech/opencv", "url": "https://github.com/MSOpenTech/opencv/issues/49" }
gharchive/issue
Updating OCV structure The cuda_test and ocl_test (and possibly some other) files contain macros and functions that are required for other tests, so these files are added even when there is no CUDA or OpenCL support on the platform. Investigation shows that some cuda_test/ocl_test features that are not blocked by the absence of CUDA/OCL are used by some CUDA/OCL-unrelated OpenCV tests. However, this structure may confuse developers, especially while working on a system without CUDA/OCL support such as WinRT. The current suggestion is to split commonly supported and CUDA/OCL-specific code. This needs to be discussed with Itseez before implementation, as it may require a big effort, so we need to be sure they consider such a change necessary. :+1:
2025-04-01T04:10:37.760062
2024-08-13T21:38:12
2464269772
{ "authors": [ "MShekow", "ezequiassouzamx", "skeltonsc" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14859", "repo": "MShekow/outlook-google-calendar-sync", "url": "https://github.com/MShekow/outlook-google-calendar-sync/issues/6" }
gharchive/issue
getting 400 message": "Invalid attendee email" error with some events in the "Create event (SyncBlocker) in cal 2" step Hello, thank you for this very cool little flow, but we are getting some strange errors. that I hope you can assist with? Some of our meetings when syncing to google from outlook will fail with this error in the "Create event (SyncBlocker) in cal 2" step { "statusCode": 400, "headers": { "Cache-Control": "no-store, must-revalidate, no-cache, max-age=0", "Pragma": "no-cache", "Vary": "Origin,X-Origin,Referer", "X-XSS-Protection": "0", "X-Frame-Options": "SAMEORIGIN", "X-Content-Type-Options": "nosniff", "Alt-Svc": "h3=":443"; ma=2592000,h3-29=":443"; ma=2592000", "Timing-Allow-Origin": "*", "x-ms-apihub-cached-response": "true", "x-ms-apihub-obo": "false", "Date": "Tue, 13 Aug 2024 14:00:48 GMT", "Content-Length": "231", "Content-Type": "application/json", "Expires": "Mon, 01 Jan 1990 00:00:00 GMT" }, "body": { "error": { "errors": [ { "domain": "global", "reason": "invalid", "message": "Invalid attendee email." } ], "code": 400, "message": "Invalid attendee email." } } } a sample of one of the failed attendees "newEvent/attendees"<EMAIL_ADDRESS> Hello, adding to this thread, I shortened the "attendee email" which was originally longer than 255 characters, and the flow error didn't occur again. The original attendee email was: @concat('do-not-delete-this-attendee-or-sync-breaks@', variables('eventId'), '.invalid'). I changed it to: @concat('d@', variables('eventId'), '.invalid'). hello, thats correct, I have shortened it as well as changing the two slice sections to count the correct values now its now working correctly :) slice( split(item()?['requiredAttendees'], ';')[0], 43, // length of "do-not-delete-this-attendee-or-sync-breaks@" sub(length(split(item()?['requiredAttendees'], ';')[0]), 8) // 8 is the length of ".invalid" ) There is now an updated flow available at https://github.com/MShekow/ng-outlook-google-calendar-sync/ which might fix your problem. The https://github.com/MShekow/outlook-google-calendar-sync flow has been deprecated and is no longer maintained.
2025-04-01T04:10:37.841163
2019-04-17T02:16:45
434061036
{ "authors": [ "Sebanisu" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14860", "repo": "MaKiPL/OpenVIII", "url": "https://github.com/MaKiPL/OpenVIII/pull/49" }
gharchive/pull-request
TextureHandler support for highres textures/mods begins I made a new class called TextureHandler to deal with choosing the best texture(s) to draw/load. If the texture is a single texture replacement you can just cast to Texture2D. If the texture was divided into chunks you need to use TextureHandler.Draw() The highres variants of textures are drawing now with correct colors and scale. I added some limited support for modded textures. // still needs palette support. Modded textures are at FF8DIR+$"..//..//textures//{basename.Substring(0, 2)}//{basename}//" or something like that. If the folder exists I search for a file with the basename in the filename. The next TODO is to support iconfl00.TEX -> iconfl03.TEX; those are based on the SP1 and icon.tex so I haven't changed it yet. I think I'll try to inherit the SP2 class; it does share some of the building blocks. I did add support for cards but the images in the menu files are just the parts that show in the menus. The ones that show in the triple triad game are in FF8.exe in TIM textures per the forums. I tweaked the menu again; it had some scaling issues after my last change to it. Seems to have the desired effect now. Most of the morning was on https://github.com/MaKiPL/OpenVIII/issues/48 where I fixed the colors for 16bit images by porting code from vincent tim. Oh on Sunday night I changed the thread to a task. https://github.com/MaKiPL/OpenVIII/issues/38 I wonder if we need to worry about RAM usage, having modded textures take approx 1 GB of RAM or more. Though I thought those things were stored on the graphics card.
2025-04-01T04:10:37.877330
2018-11-26T20:02:08
384488393
{ "authors": [ "LarryBarker", "MrCrankHank", "laclance", "patrickbrouwers" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14861", "repo": "Maatwebsite/Laravel-Excel", "url": "https://github.com/Maatwebsite/Laravel-Excel/issues/1917" }
gharchive/issue
[BUG] Illegal offset type when trying to use ExcelFake to test importing with UploadedFile. Prerequisites [x] Able to reproduce the behavior outside of your code, the problem is isolated to Laravel Excel. [x] Checked that your issue isn't already filed. [x] Checked if no PR was submitted that fixes this problem. Versions PHP version: 7.2.11 Laravel version: 5.6 Package version: 3.1 Description When trying to use ExcelFake to test importing with UploadedFile I get an Illegal offset type error on line 88 in the ExcelFake import function $this->imported[$disk ?? 'default'][$filePath] = $import; According to the function's docs $filePath can be a string or an UploadedFile but obviously an UploadedFile can not be a key to an array, hence the error. Steps to Reproduce protected function getTestFile($fileName) { $file = new UploadedFile( base_path('tests/files/' . $fileName), $fileName, $this->getMimeType($fileName), null, true ); return ['file' => $file]; } public function test_user_can_import_terms() { Excel::fake(); $response = $this->post(route('terms.import'), $this->getTestFile('create_terms.xls')); Excel::assertImported('create_terms.xls', 'testing', function(TermsImport $import) { return true; }); } Expected behavior: I presume the import function should get the file path from the UploadedFile object to use as the array key. Actual behavior: Illegal offset type error I just tried the exact same thing and have the exact same error: [2019-01-01 03:45:45] testing.ERROR: Illegal offset type {"userId":249,"email":"[email protected]","exception":"[object] (ErrorException(code: 0): Illegal offset type at L:\PhpstormProjects\pswain\vendor\maatwebsite\excel\src \Fakes\ExcelFake.php:88) [stacktrace] UploadedFile fakes are currently not supported yet. Feel free to PR it Where does the first block of code go? In the test class? I don't have getMimeType available?
2025-04-01T04:10:37.881130
2019-05-23T02:48:00
447425573
{ "authors": [ "GlennM", "LiuXiaoQ-0309" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14862", "repo": "Maatwebsite/Laravel-Excel", "url": "https://github.com/Maatwebsite/Laravel-Excel/issues/2204" }
gharchive/issue
how to use ajax to download Prerequisites Versions PHP version: 7.0 Laravel version: 3.1 Package version: Description how to use ajax to download Additional Information Any additional information, configuration or data that might be necessary to reproduce the issue. how to use ajax to download Please fill in the issue template. Please fill in the issue template. done This comment may point you in the right direction. Closing due to resolved/inactivity. Please feel free to reopen (and fill in the issue template) if you need additional support.
2025-04-01T04:10:37.883259
2014-11-27T06:54:20
50260296
{ "authors": [ "Vishal0203", "patrickbrouwers", "renciebautista" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14863", "repo": "Maatwebsite/Laravel-Excel", "url": "https://github.com/Maatwebsite/Laravel-Excel/issues/282" }
gharchive/issue
return rows that are empty The function $reader->get() returns rows even when they are empty. Same problem. I don't want to run a loop to disregard the null data. The library should be able to do that. $reader->ignoreEmpty() Still leaves a few null keys. [ { "id": "11K81A0558", "email"<EMAIL_ADDRESS> "role": "Administrator" }, { "id": "11K81A0559", "email"<EMAIL_ADDRESS> "role": "Staff" }, { "id": null }, { "id": null, "email": null } ] This package just wraps PHPExcel, nothing we can do about that.
2025-04-01T04:10:37.909269
2023-04-14T20:41:12
1668925744
{ "authors": [ "katefive", "npapargyr" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14864", "repo": "Mael-zys/T2M-GPT", "url": "https://github.com/Mael-zys/T2M-GPT/issues/22" }
gharchive/issue
ValueError: Imaginary component Thanks for sharing such an amazing work! While running the train_vq.py, after the warm-up iters I am getting the following error: 2023-04-14 22:31:45,883 INFO Training on t2m, motions are with 22 joints Reading checkpoints/t2m/Comp_v6_KLD005/opt.txt Loading Evaluation Model Wrapper (Epoch 28) Completed!! 100%|โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 23384/23384 [00:22<00:00, 1033.65it/s] 0%| | 0/1460 [00:00<?, ?it/s]Total number of motions 20942 100%|โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 1460/1460 [00:02<00:00, 497.87it/s] Pointer Pointing at 0 2023-04-14 22:32:37,786 INFO Warmup. Iter 200 : lr 0.00004 Commit. 0.29882 PPL. 82.50 Recons. 0.70914 2023-04-14 22:32:59,786 INFO Warmup. Iter 400 : lr 0.00008 Commit. 1.13799 PPL. 128.26 Recons. 0.50615 2023-04-14 22:33:22,425 INFO Warmup. Iter 600 : lr 0.00012 Commit. 2.25313 PPL. 217.44 Recons. 0.39540 2023-04-14 22:33:43,956 INFO Warmup. Iter 800 : lr 0.00016 Commit. 3.23983 PPL. 270.35 Recons. 0.32911 Traceback (most recent call last): File "/media/npapargyr/C07AF5B07AF5A2F8/WD_2TB_Elements/t2m-gpt/train_vq.py", line 132, in <module> best_fid, best_iter, best_div, best_top1, best_top2, best_top3, best_matching, writer, logger = eval_trans.evaluation_vqvae( File "/home/npapargyr/anaconda3/envs/onnx/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context return func(*args, **kwargs) File "/media/npapargyr/C07AF5B07AF5A2F8/WD_2TB_Elements/t2m-gpt/utils/eval_trans.py", line 98, in evaluation_vqvae fid = calculate_frechet_distance(gt_mu, gt_cov, mu, cov) File "/media/npapargyr/C07AF5B07AF5A2F8/WD_2TB_Elements/t2m-gpt/utils/eval_trans.py", line 547, in calculate_frechet_distance raise ValueError('Imaginary component {}'.format(m)) ValueError: Imaginary component 2.1971031947393037e+113 Any ideas? I have the same issue. Did you solve this problem?
2025-04-01T04:10:37.947656
2015-12-06T16:00:37
120642183
{ "authors": [ "CADbloke", "punker76", "taori", "xxMUROxx" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14870", "repo": "MahApps/MahApps.Metro", "url": "https://github.com/MahApps/MahApps.Metro/issues/2254" }
gharchive/issue
Resource with the name {x} cannot be found I'm getting this error in design time for my static resources. Markup: <UserControl x:Class="ImmoCrawler.Client.Views.Sections.CrawlerDataKeyMasterView" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:d="http://schemas.microsoft.com/expression/blend/2008" xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" xmlns:cal="http://www.caliburnproject.org" xmlns:vm="clr-namespace:ImmoCrawler.Client.ViewModels.Sections" xmlns:controls="http://metro.mahapps.com/winfx/xaml/controls" cal:Bind.AtDesignTime="True" x:Name="Self" d:DataContext="{d:DesignInstance vm:CrawlerDataKeyMasterViewModel}" mc:Ignorable="d" d:DesignHeight="300" d:DesignWidth="300"> <DockPanel> <Button cal:Message.Attach="Save" Content="Speichern" DockPanel.Dock="Bottom" Margin="0,10,0,0"></Button> <TextBox Text="{Binding Path=InsertionName, Mode=TwoWay, ValidatesOnNotifyDataErrors=True, UpdateSourceTrigger=PropertyChanged}" controls:TextBoxHelper.Watermark="Feldname" ToolTip="Feldname" DockPanel.Dock="Bottom" FontSize="20"> <TextBox.InputBindings> <KeyBinding Key="Enter" Command="{Binding Path=InsertCommand}"></KeyBinding> </TextBox.InputBindings> </TextBox> <ListView ItemsSource="{Binding Path=Keys}"> <ListView.ItemContainerStyle> <Style TargetType="{x:Type ListViewItem}" BasedOn="{StaticResource MetroListViewItem}" > <Setter Property="Margin" Value="0,0,0,3" /> </Style> </ListView.ItemContainerStyle> <ListView.ItemTemplate> <DataTemplate DataType="vm:CrawlerKey"> <Grid> <Grid.ColumnDefinitions> <ColumnDefinition Width="*"/> <ColumnDefinition Width="Auto"/> </Grid.ColumnDefinitions> <TextBlock Grid.ColumnSpan="2" Padding="4" FontSize="18" Text="{Binding Path=Name}" Foreground="{StaticResource TextBrush}" Background="{StaticResource AccentColorBrush}"></TextBlock> <Border Grid.Column="1" Padding="10" BorderBrush="Black" BorderThickness="1,0,0,0"> <Border.InputBindings> <MouseBinding MouseAction="LeftClick" CommandParameter="{Binding Path=.}" Command="{Binding ElementName=Self, Path=DataContext.RemoveCommand}" /> </Border.InputBindings> <Rectangle Fill="White" Width="10" Height="10" > <Rectangle.OpacityMask> <VisualBrush Visual="{StaticResource appbar_close}" Stretch="Fill" /> </Rectangle.OpacityMask> </Rectangle> <Border.Style> <Style TargetType="Border"> <Setter Property="Background" Value="{StaticResource AccentColorBrush4}" /> <Style.Triggers> <Trigger Property="IsMouseOver" Value="True"> <Setter Property="Background" Value="{StaticResource ControlsValidationBrush}" /> </Trigger> </Style.Triggers> </Style> </Border.Style> </Border> </Grid> </DataTemplate> </ListView.ItemTemplate> </ListView> </DockPanel> </UserControl> The errors are reported for "AccentColorBrush" and "TextBrush" If i replace StaticResource with DynamicResource for those 2 keys, the errors disappear but the rendered result is incorrect. It does render just fine when the project is running with both dynamic and static resource for the 2. design time with DynamicResource actual rendering This error started happening with the upgrade from 1.1.2 to 1.1.3 alpha 240, in case that helps tracking down the problem. Don't bother with design time issues. In my case during design time sometimes the colors are displayed with correct background sometimes not. Seems to be a VS issue. Furthermore I suggest you to use DynamicResources this gives you the possibility to change theme during runtime. 
"Don't bother" is not how issues are resolved. it works perfectly fine with 1.1.2. It does not with 1.1.3 alpha 240. So its obviously NOT a VS issue. I also mentioned that i did try DynamicResources, which results in incorrect rendering as you can see in the 2 screenshots i provided @taori @xxMUROxx Please, not another VS designer discussion here... @taori Where are your resources defines? After changing to new MahApps it's good to clear the VS designer cache. The VS designer always killing me :-D @punker76 They are defined in the App.xaml file pretty much exactly like suggested in the "get started" docus. VS designer cache can be cleared seperately? I've only used clean solution. that usually resolved designer issues for me solution cleared Yup, i remember back in 2012 it was even more of a pain than now. Back then restarting VS fixed issues but clearing the solution didn't always fix it :) @punker76 Jump in = pull requests accepted? I'm willing to do it if you can give me a brief description of how to detect such design time issues (Haven't done it before). My guess is i have to launch devenv with debugger attached and just listen to every exception? @punker76 alright... clean solution != delete bin/obj. sorry then just leaving this here ... https://marketplace.visualstudio.com/items?itemName=Johan20D.KilltheWPFDesigner#overview
2025-04-01T04:10:37.949690
2018-01-24T20:12:31
291344148
{ "authors": [ "Daniel-Cong", "matmiranda" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14871", "repo": "MahApps/MahApps.Metro", "url": "https://github.com/MahApps/MahApps.Metro/issues/3174" }
gharchive/issue
How to set the mask in "ShowInputAsync"? Example: var mySettings = new MetroDialogSettings { Mask = "&&&&-&&&&-&&&&-&&&&" }; var key = await this.ShowInputAsync("Hello!", "Type license key:", mySettings); Is it possible to set the mask in the textbox? Also, it would be great if we could put a default value inside the Input Box. Currently, we can only display a title, a prompt, and an empty Input Box.
2025-04-01T04:10:37.951986
2023-09-28T03:28:55
1916639859
{ "authors": [ "dahifi", "deeeed", "squillace91" ], "license": "BSD-2-Clause", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14872", "repo": "MahmoudAshraf97/whisper-diarization", "url": "https://github.com/MahmoudAshraf97/whisper-diarization/issues/90" }
gharchive/issue
python setup.py egg_info did not run successfully. When running pip install -r requirements.txt I am getting the following: Does anyone have any leads? It's a failure in onnx, which is a nemo dependency. I had a similar issue and saw that they recommend python 3.10.12. I was using 3.13.x, so I bumped it down and it worked. Next time you have an issue, copy the terminal output into GPT and if you still get stuck share the text, not images, please. Is there a fix for it?
2025-04-01T04:10:37.977668
2013-11-10T04:02:47
22399096
{ "authors": [ "Duffycola", "kgrigsby59" ], "license": "bsd-3-clause", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14873", "repo": "MailCore/MC2CocoaPodsSample", "url": "https://github.com/MailCore/MC2CocoaPodsSample/issues/1" }
gharchive/issue
Unable to build on Mavericks I'm running Xcode 5.0.1 on the latest Mavericks. When doing a pod install I got the following: kengrigsbysimac:MC2CocoaPodsSample-master kgrigsby$ pod install Analyzing dependencies Downloading dependencies Installing FXKeychain (1.4) Installing GTMHTTPFetcher (0.0.1) Installing MailCore2 (0.3.pre1) Installing SBJson (3.2) Installing ctemplate (2.2.1) Installing gtm-oauth2 (0.0.2) Installing icu4c (51.2) Installing libetpan (1.3.pre2) Installing libsasl2 (2.1.25) Installing tidy-html5 (0.0.1) configure: error: in `/Users/kgrigsby/Desktop/MC2CocoaPodsSample-master/Pods/ctemplate': configure: error: cannot run C++ compiled programs. If you meant to cross compile, use `--host'. See `config.log' for more details make: *** No rule to make target `src/htmlparser/htmlparser_fsm.h'. Stop. make: *** No rule to make target `src/htmlparser/jsparser_fsm.h'. Stop. configure: error: in `/Users/kgrigsby/Desktop/MC2CocoaPodsSample-master/Pods/icu4c/source': configure: error: cannot run C compiled programs. If you meant to cross compile, use `--host'. See `config.log' for more details make: *** No rule to make target `install-headers'. Stop. make: *** No rule to make target `install-headers'. Stop. make: *** No rule to make target `install-headers'. Stop. make: *** No rule to make target `install-headers'. Stop. make: *** No rule to make target `install-headers'. Stop. make: *** No rule to make target `install-headers'. Stop. make: *** No rule to make target `install-headers'. Stop. make: *** No rule to make target `install-headers'. Stop. make: *** No rule to make target `install-headers'. Stop. make: *** No rule to make target `install-headers'. From now on use `MC2CocoaPodsSample.xcworkspace`. [!] [libsasl2 (2.1.25)] The pre install hook of the specification DSL has been deprecated, use the `resource_bundles` or the `prepare_command` attributes. [!] [ctemplate (2.2.1)] The pre install hook of the specification DSL has been deprecated, use the `resource_bundles` or the `prepare_command` attributes. [!] [icu4c (51.2)] The pre install hook of the specification DSL has been deprecated, use the `resource_bundles` or the `prepare_command` attributes. I was able to compile and run the sample using Xcode 6.2 on latest Mavericks. However, when I open the storyboard, xcode hangs (have to force kill).
2025-04-01T04:10:38.029299
2021-11-29T22:23:27
1066565384
{ "authors": [ "Malax", "edmorley", "lillianzhang331" ], "license": "BSD-3-Clause", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14874", "repo": "Malax/libcnb.rs", "url": "https://github.com/Malax/libcnb.rs/pull/213" }
gharchive/pull-request
add Errors documentation Fixes #53 We need to add an "Errors section to document the types of errors that can be returned from a function. This will help developers write code to handle the errors properly. https://rust-lang.github.io/rust-clippy/master/index.html#missing_errors_doc https://rust-lang.github.io/api-guidelines/documentation.html#function-docs-include-error-panic-and-safety-considerations-c-failure GUS-W-10164880. As discussed in person, I'll close this @lillianzhang331. we should instead: drop the docs changes from this PR and then permanently disable this rule. ie: Move the rule up a few lines in libcnb/src/lib.rs and libcnb-data/src/lib.rs, so it's no longer in the "to triage" section, and then add a reason like
2025-04-01T04:10:38.030845
2023-02-26T03:17:21
1599914983
{ "authors": [ "andrew-carpenter-met", "jiallombardo" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14875", "repo": "Malin001/Legilimens-Hogwarts-Legacy-Collectible-Finder", "url": "https://github.com/Malin001/Legilimens-Hogwarts-Legacy-Collectible-Finder/issues/7" }
gharchive/issue
Incorrect Manor Cape YouTube Link The Manor Cape YouTube link for "Collection Chest #7 (Dungeon)" is the correct video, but the incorrect timecode. It should be: https://youtu.be/gYs24rpRPZ0&t=722. Same issue, it appears that the link you generate is missing the "t" parameter in the request url, here's an example screenshot:
2025-04-01T04:10:38.054123
2021-01-28T07:19:29
795742934
{ "authors": [ "d-m-u" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14876", "repo": "ManageIQ/manageiq-cross_repo-tests", "url": "https://github.com/ManageIQ/manageiq-cross_repo-tests/pull/288" }
gharchive/pull-request
test replace_children ancestry fix https://github.com/ManageIQ/manageiq/pull/20990 it's expected to fail anyway as i'm not addressing the widget issues: 3117 examples, 4 failures Failed examples: rspec ./spec/requests/widgets_spec.rb:90 # Widgets API Widgets generate_content action with an appropriate role generate_content for group generates multiple widget contents rspec ./spec/requests/widgets_spec.rb:83 # Widgets API Widgets generate_content action with an appropriate role generate_content for group generates single widget content rspec ./spec/requests/widgets_spec.rb:106 # Widgets API Widgets generate_content action with an appropriate role generate_content for user generates single widget content rspec ./spec/requests/widgets_spec.rb:115 # Widgets API Widgets generate_content action with an appropriate role generate_content for user generates multiple widget contents
2025-04-01T04:10:38.073408
2019-07-19T20:00:28
470515797
{ "authors": [ "Fryguy", "lodgenbd" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14877", "repo": "ManageIQ/manageiq", "url": "https://github.com/ManageIQ/manageiq/issues/19011" }
gharchive/issue
Cloud OpenStack Provider Question: Add Image Function? Question: should the "add image" function be working for an OpenStack provider? Neither as an admin nor a user with "add image" permissions am I able to access any functionality to add an image. I do see the permission exists in roles (miq_product_features.yml): - :name: Add Image :description: Add Image :feature_type: control :identifier: image_create @agrare @borod108 Thoughts here? @miq-bot move-issue manageiq-providers-openstack
2025-04-01T04:10:38.077033
2021-01-27T20:06:20
795406014
{ "authors": [ "Fryguy", "agrare" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14878", "repo": "ManageIQ/manageiq", "url": "https://github.com/ManageIQ/manageiq/issues/20986" }
gharchive/issue
Failing specs on Mac but not Linux There might be more, but these 3 tests consistently fail on a Mac, but pass in Travis... 1) EvmDatabaseOps#backup without enough free space Failure/Error: expect(MiqQueue.where(:class_name => "MiqEvent", :method_name => "raise_evm_event").count).to eq(1) expected: 1 got: 0 (compared using ==) # ./spec/lib/evm_database_ops_spec.rb:53:in `block (3 levels) in <top (required)>' # /Users/jfrey/.gem/ruby/2.6.6/gems/webmock-3.11.1/lib/webmock/rspec.rb:37:in `block (2 levels) in <top (required)>' 2) EvmDatabaseOps#dump without enough free space Failure/Error: expect(MiqQueue.where(:class_name => "MiqEvent", :method_name => "raise_evm_event").count).to eq(1) expected: 1 got: 0 (compared using ==) # ./spec/lib/evm_database_ops_spec.rb:133:in `block (3 levels) in <top (required)>' # /Users/jfrey/.gem/ruby/2.6.6/gems/webmock-3.11.1/lib/webmock/rspec.rb:37:in `block (2 levels) in <top (required)>' 3) MiqEnvironment with linux platform Host Info local_ip_address Failure/Error: expect(described_class.local_ip_address).to eq(`hostname -i`.chomp.split.first) expected: nil got: "<IP_ADDRESS>" (compared using ==) # ./spec/lib/miq_environment_spec.rb:18:in `block (4 levels) in <top (required)>' # /Users/jfrey/.gem/ruby/2.6.6/gems/webmock-3.11.1/lib/webmock/rspec.rb:37:in `block (2 levels) in <top (required)>' Failed examples: rspec ./spec/lib/evm_database_ops_spec.rb:47 # EvmDatabaseOps#backup without enough free space rspec ./spec/lib/evm_database_ops_spec.rb:126 # EvmDatabaseOps#dump without enough free space rspec ./spec/lib/miq_environment_spec.rb:17 # MiqEnvironment with linux platform Host Info local_ip_address @Fryguy I don't think this is mac/linux as these three fail for me as well. It is more likely a travis environmental / config thing The last one I actually get <IP_ADDRESS> not nil: 3) MiqEnvironment with linux platform Host Info local_ip_address Failure/Error: expect(described_class.local_ip_address).to eq(`hostname -i`.chomp.split.first) expected: "<IP_ADDRESS>" got: "<IP_ADDRESS>" (compared using ==) # ./spec/lib/miq_environment_spec.rb:18:in `block (4 levels) in <top (required)>' # /home/grare/adam/.gem/gems/webmock-3.11.1/lib/webmock/rspec.rb:37:in `block (2 levels) in <top (required)>' Thanks @agrare I updated the title
2025-04-01T04:10:38.078779
2016-08-11T13:34:49
170647060
{ "authors": [ "agrare" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14879", "repo": "ManageIQ/manageiq", "url": "https://github.com/ManageIQ/manageiq/pull/10408" }
gharchive/pull-request
Skip Media&Other items in vCloud Catalogs Purpose or Intent VMware vCloud Catalogs contain two types of items, vApp Templates and Media&Other. Currently we only want to inventory vApp Templates because you cannot provision anything from the Media&Other items. This will skip all Media&Other items in a vCloud catalog. @miq-bot add-label providers/vmware/cloud Looks good to me but I wrote it :smile: @blomquisg can you take a look? @miq-bot assign blomquisg
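A rough sketch of what the filtering could look like in a refresh parser; the method and attribute names here are assumptions for illustration and are not copied from the actual vCloud parser:

```ruby
# Hypothetical parser fragment: only vApp Templates become templates in the
# inventory; Media & Other catalog items are skipped because nothing can be
# provisioned from them.
def parse_catalog_items(catalog_items)
  catalog_items.each do |item|
    next unless item.vapp_template? # skip Media & Other entries

    parse_vapp_template(item)
  end
end
```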
2025-04-01T04:10:38.082220
2016-11-08T20:36:23
188096949
{ "authors": [ "carbonin", "gtanzillo" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14880", "repo": "ManageIQ/manageiq", "url": "https://github.com/ManageIQ/manageiq/pull/12513" }
gharchive/pull-request
[DARGA] Backport pglogical replication enhancements for replication set locking This is a backport for some issues with pglogical replication in the darga release. In particular this backports the locking mechanism for editing the replication set which solves worker failures when the workers are adding or removing tables from the replication set concurrently. There were some additional dependent changes that needed to also come back for this cherry-pick to apply cleanly. The PRs backported by this PR are: https://github.com/ManageIQ/manageiq/pull/11595 https://github.com/ManageIQ/manageiq/pull/12030 https://github.com/ManageIQ/manageiq/pull/12280 https://github.com/ManageIQ/manageiq/pull/12318 https://bugzilla.redhat.com/show_bug.cgi?id=1391997 @gtanzillo please review /cc @chessbyte What should I do with the labels on the referenced PRs? ๐Ÿ‘ Looks good!
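A simplified sketch of the locking idea behind the backport (serialize replication-set edits so concurrent workers do not trip over each other); the lock key and the `add_tables_to_replication_set` helper are placeholders, not the actual implementation that was backported:

```ruby
# Placeholder sketch: take a PostgreSQL advisory lock so that workers in
# different processes serialize their edits to the replication set.
REPLICATION_SET_LOCK_ID = 42 # arbitrary, illustrative lock key

def with_replication_set_lock(connection)
  connection.execute("SELECT pg_advisory_lock(#{REPLICATION_SET_LOCK_ID})")
  yield
ensure
  connection.execute("SELECT pg_advisory_unlock(#{REPLICATION_SET_LOCK_ID})")
end

# Illustrative usage:
# with_replication_set_lock(ActiveRecord::Base.connection) do
#   add_tables_to_replication_set(%w[vms hosts]) # hypothetical helper
# end
```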
2025-04-01T04:10:38.087704
2017-03-09T13:37:36
213039246
{ "authors": [ "agrare", "durandom", "masayag", "simaishi" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14881", "repo": "ManageIQ/manageiq", "url": "https://github.com/ManageIQ/manageiq/pull/14245" }
gharchive/pull-request
Set timeout for inventory refresh calls The settings.yml contains timeout only for RHV service calls. However, when services are invoked as part of the refresh process, no timeout is provided for these calls, letting the Net::HTTP 60-second default timeout be the effective timeout. The 60-second default timeout is too short, therefore the default timeout for inventory service calls to RHV is set to 1 hour. http://bugzilla.redhat.com/1430722 @miq-bot add_label blocker, euwe/yes. providers/rhevm @miq-bot add_label blocker, euwe/yes, providers/rhevm @miq-bot assign @borod108 Wow, nasty bug @masayag here this option gets set @oourfali wdyt regarding docs? I think new options should also have new docs :) @masayag is this timeout per API call? 1 hour seems like a very long time for an API call @agrare This timeout is per API call, same timeout is set for the :service (which is an explicit call for the API not as part of the refresh). We hit that on a setup which had a slow network, where a request for all of the vms took over 10 minutes. I wouldn't expect to hit a timeout in any part of the refresh process. I assume that after the first timeout the PartialRefreshException will be thrown and the refresh process will be terminated due to that timeout. Makes sense to have the same timeout as the other service, but if you need a 1hr timeout per API call I think we're going to have bigger problems haha Euwe backport details: $ git log -1 commit 2b87bd1158d4148e6ed69642b1a63bbed37bcb32 Author: Adam Grare<EMAIL_ADDRESS>Date: Fri Mar 10 09:21:49 2017 -0500 Merge pull request #14245 from masayag/rhv_inventory_timeout Set timeout for inventory refresh calls (cherry picked from commit 0a7255b7ae6e3e94925173931466078b3da09e61) https://bugzilla.redhat.com/show_bug.cgi?id=1431620
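For context, a hedged sketch of passing a configurable timeout into the inventory connection; the settings key and the connect signature are assumptions for illustration, not the exact RHV provider code:

```ruby
# Illustrative only: read a timeout from a settings hash (falling back to one
# hour) and hand it to the API connection used by refresh, so inventory calls
# are not bound by Net::HTTP's 60-second default.
DEFAULT_INVENTORY_TIMEOUT = 3600 # seconds (1 hour)

def inventory_timeout(settings = {})
  settings.fetch(:inventory_timeout, DEFAULT_INVENTORY_TIMEOUT)
end

def inventory_connection(ems, settings = {})
  # hypothetical connect option; the point is that refresh passes an explicit
  # :timeout instead of relying on the library default
  ems.connect(:timeout => inventory_timeout(settings))
end
```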
2025-04-01T04:10:38.089778
2017-03-17T08:41:13
214939206
{ "authors": [ "borod108", "chessbyte" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14882", "repo": "ManageIQ/manageiq", "url": "https://github.com/ManageIQ/manageiq/pull/14373" }
gharchive/pull-request
[DARGA] Backport e3e19a9660ed8db07f34a6ad7b669343e7e79cd3 Attempt to hotfix https://bugzilla.redhat.com/show_bug.cgi?id=1432719 @borod108 you need to change the title to something human-readable. If you need to maintain the commit sha, please add it to the description. Is this a backport of another PR? What is this?? @chessbyte sorry, I have never done a backport and have no idea what the procedure is; is there a doc on how to do it right? (I just manually applied the changes to Darga). I changed the title.
2025-04-01T04:10:38.094286
2017-11-16T18:41:10
274620787
{ "authors": [ "d-m-u", "eclarizio", "gmcculloug", "simaishi" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14883", "repo": "ManageIQ/manageiq", "url": "https://github.com/ManageIQ/manageiq/pull/16487" }
gharchive/pull-request
Adds field unique validator check to dialog This adds a uniqueness check on dialog field names (from this bz: https://bugzilla.redhat.com/show_bug.cgi?id=1491790) so we don't end up with dialogs with two equal field names which goes kaput. Linked PR adds back in flash messages on dialog edit screen that went kaput when new dialog editor came online. Without it, this check fails silently and is yuck. Will be a terrible experience unless this gets in soon too: ( ๐ŸŽ‰ ) https://github.com/ManageIQ/manageiq-ui-classic/pull/2772 (and by soon I mean basically at the same time) @miq-bot assign @gmcculloug @eclarizio can you take a ๐Ÿ‘€ at this one please? Dunno what to do about the errors, if you've ideas, please lemme know! @miq-bot add_label bug @eclarizio The problem with this currently is that we have no flash messages in new dialog editor, so this error is never shown anywhere. @d-m-u Right, I realize that's why it's marked as WIP, but just wanted to give the ๐Ÿ‘ approval for if/when the UI side gets sorted out. @miq-bot add_label bug gaprindashvili/yes ? @simaishi I set the gaprindashvili/yes flag. Thanks Gaprindashvili backport details: $ git log -1 commit 89ebd845ee7f8bcaa93960840c66844afa634496 Author: Greg McCullough<EMAIL_ADDRESS>Date: Tue Nov 21 09:51:41 2017 -0500 Merge pull request #16487 from d-m-u/unique_dialog_field_names Adds field unique validator check to dialog (cherry picked from commit ece6e891290beda5b9945a00014ecd53431479d6) https://bugzilla.redhat.com/show_bug.cgi?id=1518267
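A minimal sketch of the kind of uniqueness validation described above; the `dialog_fields` association is assumed for illustration (the real model nests fields under tabs and groups), and this is not the exact validator that was merged:

```ruby
# Illustrative model-level check: reject a dialog whose fields would share a name.
class Dialog < ApplicationRecord
  validate :dialog_field_names_must_be_unique

  private

  def dialog_field_names_must_be_unique
    names      = dialog_fields.map(&:name) # assumed association
    duplicates = names.select { |name| names.count(name) > 1 }.uniq
    return if duplicates.empty?

    errors.add(:base, "Dialog field names must be unique within a dialog (duplicated: #{duplicates.join(', ')})")
  end
end
```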
2025-04-01T04:10:38.099041
2018-02-02T10:57:40
293849788
{ "authors": [ "agrare", "miha-plesko" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14884", "repo": "ManageIQ/manageiq", "url": "https://github.com/ManageIQ/manageiq/pull/16937" }
gharchive/pull-request
Allow EMS to prevent worker from starting With this commit we allow an EMS to decide whether it wants to prevent its worker from being run. This is useful when the worker depends on user input given while adding the EMS. E.g. a user can opt in to pick the "None" option for EventMonitoring, and in that case the EMS can now prevent the EventCatcher worker from spawning. Previously, workers were spawned no matter what, which is both ineffective and causes trouble when such an "invalid" worker is continuously failing. BZ: https://bugzilla.redhat.com/show_bug.cgi?id=1527209 Followup PR: https://github.com/ManageIQ/manageiq-providers-nuage/pull/63 @miq-bot assign @agrare @miq-bot add_label enhancement @miha-plesko this is better handled in the nuage event catcher instead of this base mixin. You can define a has_required_role? there which can check the authentication for the ems. It is per ems and not for all event catchers or all nuage event catchers. As suggested by @agrare, the logic to achieve this was implemented in each provider that supports the "None" option for eventing: VMware vCloud: https://github.com/ManageIQ/manageiq-providers-vmware/pull/199 Nuage: https://github.com/ManageIQ/manageiq-providers-nuage/pull/69 Therefore I'm closing this PR.
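A rough sketch of the per-EMS check being suggested; the class shape and the `event_monitor_selection` accessor are assumptions for illustration rather than the providers' actual code:

```ruby
# Illustrative hook on a provider's event catcher: if the EMS was added with
# event monitoring set to "none", a catcher should not be spawned for it.
class EventCatcher
  # Returns true when a catcher should be spawned for this EMS.
  def self.event_monitor_enabled?(ems)
    ems.event_monitor_selection.to_s != "none" # assumed accessor on the EMS
  end
end

# Illustrative filtering when deciding which EMSs get an event catcher:
# ems_list.select { |ems| EventCatcher.event_monitor_enabled?(ems) }
```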
2025-04-01T04:10:38.104473
2018-05-19T01:47:45
324591701
{ "authors": [ "branic", "d-m-u", "jrafanie" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14885", "repo": "ManageIQ/manageiq", "url": "https://github.com/ManageIQ/manageiq/pull/17446" }
gharchive/pull-request
evm:export:service_dialogs -- Set the current user to admin This PR sets the current user when exporting Service Dialogs. When exporting service dialogs an error is logged to evm.log about User.current_user being nil when the id of the current_user is referenced. In some cases this causes Service Dialogs to not be exported. Links [Optional] evm:export:service_dialogs blows up on a nil user Steps for Testing/QA [Optional] Create a Service Dialog with a dynamic field Create a directory to store the exported dialogs mkdir /tmp/dialogs In a separate terminal window tail -f evm.log Export Service Dialogs via the rake command bin/rake evm:export:service_dialogs -- --directory /tmp/dialogs Inspect evm.log for the errors in the referenced issue. With this fix there will be no errors and a yaml file for each Service Dialog will be present in the export directory. /cc @jrafanie @gmcculloug @eclarizio I think hardcoding admin here is ok as there is no rbac or tenancy enforcement on the Service Dialogs. Any user can see any Service Dialog (as long as they can get to the menu option). If that is going to change in the future I don't mind changing this to allow a userid to be passed in. I'd still want to default to admin if there wasn't a userid supplied though. @gmcculloug Can you review? 🚌 💥 I'm concerned there could be other places that might not be passing the user to the automate_queue_hash method. I think this is fixed by https://github.com/ManageIQ/manageiq/pull/17436 I like @d-m-u's fix better. I had tried to do the same thing before adding the admin user to the rake task, but couldn't get the syntax correct.
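A hedged sketch of scoping the export to an explicit user inside the rake task; the `with_user` helper is assumed to be available for setting the current user, and the task body is illustrative only:

```ruby
# Illustrative rake task fragment: run the dialog export with an explicit
# user so User.current_user is never nil during the export.
namespace :evm do
  namespace :export do
    task :service_dialogs => :environment do
      admin = User.find_by(:userid => "admin") # assumes the seeded admin user exists
      User.with_user(admin) do                 # assumed helper that sets the current user
        # export each dialog to the requested directory (details omitted)
      end
    end
  end
end
```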
2025-04-01T04:10:38.106095
2019-12-04T19:07:19
532876852
{ "authors": [ "djberg96" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14886", "repo": "ManageIQ/manageiq", "url": "https://github.com/ManageIQ/manageiq/pull/19579" }
gharchive/pull-request
Add queue_name to CloudVolumeBackup ems operations, plus specs This PR adds a queue_name to the queue options for the CloudVolumeBackup.restore_queue and CloudVolumeBackup#delete_queue methods. I've also added some specs - apparently none existed for this model - and added some comments on those methods. Part of #19543 @miq-bot add_reviewer @agrare
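A short sketch of what adding :queue_name to the queue options looks like in general; the `queue_name_for_ems_operations` helper and the surrounding method are assumptions for illustration rather than the exact code from this PR:

```ruby
# Illustrative queue call: route the backup restore to the EMS operations
# worker queue by passing :queue_name alongside the usual MiqQueue options.
def restore_queue_sketch(backup)
  ems = backup.ext_management_system
  MiqQueue.put(
    :class_name  => backup.class.name,
    :instance_id => backup.id,
    :method_name => "restore",
    :role        => "ems_operations",
    :zone        => ems.my_zone,
    :queue_name  => ems.queue_name_for_ems_operations # assumed helper on the EMS
  )
end
```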
2025-04-01T04:10:38.110251
2016-05-12T12:48:16
154474038
{ "authors": [ "KevinLoiseau", "abellotti", "imtayadeway", "jrafanie" ], "license": "Apache-2.0", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14887", "repo": "ManageIQ/manageiq", "url": "https://github.com/ManageIQ/manageiq/pull/8647" }
gharchive/pull-request
[API] reports scheduled Hi guys! Just a PR to add actions on Report schedules through the API: create schedule (implemented as an action on route /reports), list schedules (on route reports/:report_id/schedules). I'm not sure this is the best way to implement this functionality; I'm listening, feel free to tell me if I'm doing something wrong. Thanks. Nice PR Kevin !! Minor comments. If you can resolve the rubocop warnings too, that'll be appreciated. Thanks. @abellotti, @imtayadeway thank you for your reviews, I'll work on it ASAP. Closing/reopening since #8684 was merged to fix master @KevinLoiseau looks great, just a couple of minor issues. Over to.... @miq-bot assign @abellotti @KevinLoiseau please rebase/update methods->verbs in api.yml and repush. Thanks. @KevinLoiseau can you update this PR to use the miq_report_schedule_add identifier in the api.yml for the new schedule action for reports and repush. Thanks.
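A hypothetical example of how the described schedules route might be exercised from a client; the host, credentials, and route are assumptions based on the description above, not the final API contract:

```ruby
# Hypothetical client-side call against the described listing route, using Net::HTTP.
require "net/http"
require "json"
require "uri"

uri = URI("https://miq.example.com/api/reports/1/schedules") # assumed host and route
req = Net::HTTP::Get.new(uri)
req.basic_auth("admin", "password") # illustrative credentials

res = Net::HTTP.start(uri.hostname, uri.port, :use_ssl => true) { |http| http.request(req) }
puts JSON.parse(res.body) # schedules for report 1, per the PR description
```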
2025-04-01T04:10:38.114306
2017-02-13T13:38:03
207220820
{ "authors": [ "mickaelpois", "pagedegeek" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14888", "repo": "Mangopay/mangopay2-ruby-sdk", "url": "https://github.com/Mangopay/mangopay2-ruby-sdk/pull/69" }
gharchive/pull-request
ExtendedPayIn supported Implement: https://docs.mangopay.com/endpoints/v2.01/payins#e847_view-card-details-for-a-payin-web :) data = MangoPay::PayIn::Card::Web.extended(12639078) Thanks @pagedegeek ! Note that very soon, FAILED transaction status will be supported too (if card info is provided by the customer)
2025-04-01T04:10:38.120221
2018-03-09T14:18:14
303855529
{ "authors": [ "rec" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14889", "repo": "ManiacalLabs/BiblioPixel", "url": "https://github.com/ManiacalLabs/BiblioPixel/pull/654" }
gharchive/pull-request
Give Address an empty constructor and state Spun off from the tatters of the general controllers. :-D I needed it for a while but I don't really need this at this time, but it's quite elegant to have an empty Address which does nothing and no doubt will be convenient in the future. Ping! This isn't so important but want to get it out of client.
2025-04-01T04:10:38.123781
2022-02-08T08:06:31
1126917608
{ "authors": [ "MichealLea", "naveen521kk" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14890", "repo": "ManimCommunity/ManimPango", "url": "https://github.com/ManimCommunity/ManimPango/issues/75" }
gharchive/issue
M1 Mac Installation Problem Initially, I followed the instructions of the Anaconda Installations command lines in Installation of manim. My computer is Macbook Air (M1 chip, 8 cores GPU, 16 Gb Memory). Here, is the problem described as an import error when I tried to run a brief code manimgl example_scenes.py OpeningManimExample OUTPUT: Traceback (most recent call last): File "/Users/micheallea/miniforge3/envs/manim/lib/python3.8/site-packages/manimpango/__init__.py", line 14, in <module> from .cmanimpango import * # noqa: F403,F401 ImportError: dlopen(/Users/micheallea/miniforge3/envs/manim/lib/python3.8/site-packages/manimpango/cmanimpango.cpython-38-darwin.so, 2): Library not loaded: @rpath/libpangocairo-1.0.0.dylib Referenced from: /Users/micheallea/miniforge3/envs/manim/lib/python3.8/site-packages/manimpango/cmanimpango.cpython-38-darwin.so Reason: image not found During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/Users/micheallea/miniforge3/envs/manim/bin/manimgl", line 33, in <module> sys.exit(load_entry_point('manimgl', 'console_scripts', 'manimgl')()) File "/Users/micheallea/miniforge3/envs/manim/bin/manimgl", line 25, in importlib_load_entry_point return next(matches).load() File "/Users/micheallea/miniforge3/envs/manim/lib/python3.8/importlib/metadata.py", line 77, in load module = import_module(match.group('module')) File "/Users/micheallea/miniforge3/envs/manim/lib/python3.8/importlib/__init__.py", line 127, in import_module return _bootstrap._gcd_import(name[level:], package, level) File "<frozen importlib._bootstrap>", line 1014, in _gcd_import File "<frozen importlib._bootstrap>", line 991, in _find_and_load File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed File "<frozen importlib._bootstrap>", line 1014, in _gcd_import File "<frozen importlib._bootstrap>", line 991, in _find_and_load File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked File "<frozen importlib._bootstrap>", line 671, in _load_unlocked File "<frozen importlib._bootstrap_external>", line 843, in exec_module File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed File "/Users/micheallea/manim/manimlib/__init__.py", line 14, in <module> from manimlib.animation.numbers import * File "/Users/micheallea/manim/manimlib/animation/numbers.py", line 2, in <module> from manimlib.mobject.numbers import DecimalNumber File "/Users/micheallea/manim/manimlib/mobject/numbers.py", line 3, in <module> from manimlib.mobject.svg.text_mobject import Text File "/Users/micheallea/manim/manimlib/mobject/svg/text_mobject.py", line 15, in <module> import manimpango File "/Users/micheallea/miniforge3/envs/manim/lib/python3.8/site-packages/manimpango/__init__.py", line 35, in <module> raise ImportError(msg) ImportError: ManimPango could not import and load the necessary shared libraries. This error may occur when ManimPango and its dependencies are improperly set up. 
Please make sure the following versions are what you expect: * ManimPango v0.3.1, Python v3.8.11 If you believe there is a greater problem, feel free to contact us or create an issue on GitHub: * Discord: https://discord.gg/mMRrZQW * GitHub: https://github.com/ManimCommunity/ManimPango/issues Original error: dlopen(/Users/micheallea/miniforge3/envs/manim/lib/python3.8/site-packages/manimpango/cmanimpango.cpython-38-darwin.so, 2): Library not loaded: @rpath/libpangocairo-1.0.0.dylib Referenced from: /Users/micheallea/miniforge3/envs/manim/lib/python3.8/site-packages/manimpango/cmanimpango.cpython-38-darwin.so Reason: image not found If you are using conda, install pango from conda-forge (conda install pango) first, and then reinstall ManimPango with pip install manimpango --force --no-cache. Closing due to inactivity.
2025-04-01T04:10:38.155832
2018-05-29T20:19:28
327472421
{ "authors": [ "cuttlefish", "zunware" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14891", "repo": "MapStory/story-tools-composer", "url": "https://github.com/MapStory/story-tools-composer/pull/198" }
gharchive/pull-request
Fixes saving and loading from server Fixes storypins date saving. Isn't the better solution here to fix the names on the server?
2025-04-01T04:10:38.263983
2023-10-26T19:09:15
1964208350
{ "authors": [ "bbest" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14892", "repo": "MarineSensitivities/server", "url": "https://github.com/MarineSensitivities/server/issues/3" }
gharchive/issue
Setup software and review plan Setup: shared drive for data and documents; Zotero bibliography; Github repositories for versioning code and publishing html documents as well as project management with issues, milestones and roadmaps; and a virtual machine in a cloud provider using Docker for serving a spatial database (PostGIS), RStudio and Shiny applications. server The server software is for setting up web services outside those of Github (e.g. serving website, docs and R package) using Docker (see the docker-compose.yml; with reverse proxying from subdomains to ports by Caddy):
- rstudio: integrated development environment (IDE) to code and debug directly on the server
- shiny: interactive applications, e.g. shiny.marinesensitivities.org/map
- pgadmin: PostgreSQL database administration interface
- api: custom API using R plumber
- swagger: generic database API using PostgREST
- tile: spatial database API using pg_tileserv for serving vector tiles
Reference Server Setup on AWS as EC2 instance at allocated IP address <IP_ADDRESS> Note: changed MarineSensitivities -> MarineSensitivity
2025-04-01T04:10:38.267756
2017-03-27T07:43:12
217161170
{ "authors": [ "MarioIannotta", "RajChanchal" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14893", "repo": "MarioIannotta/MIBlurPopup", "url": "https://github.com/MarioIannotta/MIBlurPopup/issues/3" }
gharchive/issue
Tabbar doesn't hide Thanks for MIBlurPopup. I have a little problem. When showing the popup, the tab bar shows up and I am unable to hide it in the popup view controller either. I have resolved this by showing the popup as: MIBlurPopup.show(popView, on: viewController.tabBarController!) i.e. presenting the popup on the tabBarController Hi, this is the expected behavior: the popup is shown above the given view controller. If you also want to hide the tab bar, your solution will work perfectly. Mario.
2025-04-01T04:10:38.270234
2016-10-04T14:00:07
180901297
{ "authors": [ "MarioIannotta", "ktxc15" ], "license": "MIT", "license_source": "github-api", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14894", "repo": "MarioIannotta/MIPivotPageController", "url": "https://github.com/MarioIannotta/MIPivotPageController/issues/2" }
gharchive/issue
Add badge number Can I add a badge number to the icon, how to do it Hi, right now it is not possible. Hi @ktxc15 I just want to let you know that with the last update you can easily add a badge to each page. You will find useful these methods of the MIPivotRootPage protocol @objc optional var badgeValueForPivotPage: String? { get } @objc optional var shouldHideBadgeOnPageFocus: Bool { get } In order to refresh the badge you'll need the method refreshBadge(forRootPageAtIndex miPivotRootPageIndex: Int) of MIPivotPageController Have a nice day, Mario.
2025-04-01T04:10:38.304312
2017-07-03T14:40:01
240192503
{ "authors": [ "MarkPieszak", "davidsekar" ], "license": "mit", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14895", "repo": "MarkPieszak/aspnetcore-angular2-universal", "url": "https://github.com/MarkPieszak/aspnetcore-angular2-universal/issues/327" }
gharchive/issue
Enable Server side script debugging I'm using the transfer Http module for three webapi calls that happen on application initial load. The transfer state for the last call succeeds and is maintained in [Transfer_state]. But the first two webapi calls result in an unexpected token error. For debugging this issue, I tried to enable LaunchWithDebugging, which resulted in the error https://github.com/aspnet/JavaScriptServices/issues/1084 Kindly add a code snippet to the repo that could allow debugging the server-side execution of scripts Do these calls work normally if you disable server-side rendering for a second and just try running them in the browser? What exact errors come up in the Node console when these errors happen? Yes, the client side part of the application works fine, no matter whether SSR is enabled or disabled. My issue is related to the following SO question: https://stackoverflow.com/questions/43825058/vs-code-running-asp-net-spa-with-angular-4-and-using-nodeservices-options-launc NodeServices is part of JavaScriptServices, Steve is the expert there. I'd assume it's a problem with higher versions of Node (7+) that might be the culprit! I'll close this out (as it's an open issue in JSServices) but I'll keep an eye out and see if that's the problem! Just an update: Transfer-http calls were failing due to a file encoding issue. The two JSON files had encoding as UTF8 with BOM. I changed it to just UTF8. Everything started working fine. Thanks for your reply, indeed server side script debugging is not working with Node.js 7+. Thanks, David. https://github.com/MarkPieszak/aspnetcore-angular2-universal/issues/327 Ahh well glad you got it settled! Cheers
2025-04-01T04:10:38.305730
2017-02-08T11:17:53
206168198
{ "authors": [ "MarkRoddy", "moinahmed001" ], "license": "apache-2.0", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14896", "repo": "MarkRoddy/brstest", "url": "https://github.com/MarkRoddy/brstest/pull/22" }
gharchive/pull-request
To show method names when running tests A flag verbosity has been added to show method names when running tests @moinahmed001 thanks for your contribution! @MarkRoddy no worries
2025-04-01T04:10:38.309423
2018-02-23T19:34:11
299825385
{ "authors": [ "coveralls", "cryptomental", "pelsasser" ], "license": "apache-2.0", "license_source": "bigquery", "license_type": "permissive", "provenance": "gharchive-dolma-0004.json.gz:14897", "repo": "MarketProject/MarketProtocol", "url": "https://github.com/MarketProject/MarketProtocol/pull/51" }
gharchive/pull-request
Feature/test oraclize gas within limit Solves #50. Coverage increased (+4.04%) to 88.679% when pulling e0fe61eeff46de72b7df7ed6ba4fd9819ea0de5f on cryptomental:feature/test-oraclize-gas-within-limit into f60454131cefddbbddbdca80d3df18170c2d7ffe on MarketProject:master. And the coverage increased +4% :) If needed, I can squash commits into one and remove the commented-out logs. Tested this with truffle and ganache and both worked. Will attempt a test net in the future, but that can be pushed down the line a bit. Thanks, this is awesome! @pelsasser great! I tried to 'submit work' on gitcoin but it failed https://etherscan.io/tx/0x83efb8c29abfd6d51e93a56e1d23259b38597eca153b7fa726106c326f7e80a6 unfortunately I cannot re-submit it. I would be grateful if you could have a look and see if this can be closed. @cryptomental - please send me a message with your ETH address and I can get everything squared out. Thanks again