Official but neglected master project: https://github.com/python-ivi/python-ivi
Main contributor's fork: https://github.com/alexforencich/python-ivi
All 108 forks on GitHub: https://github.com/alexforencich/python-ivi/network/members
The project is a Python module, and so far it seems to be the most complete open, multi-platform implementation of the IVI standard for controlling measurement instruments. I'm mostly interested in the forks that add Rigol instruments, because Rigol is the only brand of SCPI instrument I have so far.
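For reference, driving an instrument with it looks roughly like this; this is my recollection of the upstream README rather than anything authoritative, and the Rigol driver class below is hypothetical since only some forks ship Rigol drivers at all:

import ivi

# the rigol sub-package and the exact class name are hypothetical; only some forks have them
scope = ivi.rigol.rigolDS1104Z("TCPIP0::192.168.1.3::INSTR")    # LXI/VXI-11 resource string
print(scope.identity.instrument_model)                          # e.g. 'DS1104Z'
waveform = scope.channels[0].measurement.fetch_waveform()       # list of (time, voltage) points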
Is it reasonable to try to merge all the 108 forks (many don't have the Rigol instruments I need)?
Would merging all of that be a one-hour task, a one-month task, or essentially impossible?
What to do to get a reasonably updated version?
Yeah. That's something I pointed out a while ago regarding how people use github in general and how inefficient it is when projects get split up across tens or hundreds of forks. Nobody will ever care or even have the time to consolidate any of that. It's just not doable for maintainers. For mere "users", you have to pick one of the forks that's closest to what you need and stick to it.
Even if you managed to properly merge 108 branches without introducing horrible and intractable bugs, which could take weeks, that would be a one-time affair. And then what? Would you keep doing it on a regular basis, watching modifications across 108 forks (each of which may have several active branches)?
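If you really wanted to survey them before deciding, the mechanical part is at least automatable. A rough sketch, assuming you have already cloned the upstream repo and pasted the fork clone URLs (e.g. from the network/members page) into a forks.txt file, one per line; the file name and layout are my own invention:

import subprocess

with open("forks.txt") as f:
    fork_urls = [line.strip() for line in f if line.strip()]

for url in fork_urls:
    name = url.rstrip("/").split("/")[-2]                 # GitHub user name as the remote name
    subprocess.run(["git", "remote", "add", name, url])   # complains harmlessly if it already exists
    subprocess.run(["git", "fetch", name], check=True)    # pull in that fork's branches

After that, git branch -r lists every fork's branches and git log master..someuser/master shows what each fork adds on top of upstream, but actually merging any of it is still entirely manual.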
I have no secret sauce for that. I tend to only use open-source projects that have active official maintainers and stick to the official project. Too much of a mess otherwise. For the rest, I write my own stuff.
I don't think GitHub lets us take a bird's-eye view of all branches and all forks; that would certainly help answer your question.
It has something: it's under the 'Insights' tab of each repo.
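You can also pull a similar overview yourself from the public GitHub REST API. A minimal sketch using only the standard library; the /forks endpoint and the full_name/pushed_at fields are the documented API, but with 108 forks you would still have to follow the pagination links to get them all:

import json
import urllib.request

url = "https://api.github.com/repos/alexforencich/python-ivi/forks?per_page=100&sort=newest"
with urllib.request.urlopen(url) as resp:
    forks = json.load(resp)

for fork in forks:
    # pushed_at is a quick way to see which forks are still alive
    print(fork["full_name"], fork["pushed_at"])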
But here's an interesting factoid about GitHub: all commits, on every single fork, are actually in the original repo. The forks are "virtual" repos, so to speak; they are not distinct repos as is often assumed.
Are you saying a GitHub fork is not like a Git clone, in creating a copy of the original repository that references the original, like what the documentation says? Do you have a reference for how this works at the Git level? This seems like an awful lot of effort for them to go through beyond the basic Git functionality just to save some duplication. It would also imply all forks disappear when the original repository is deleted.
But here's an interesting factoid about GitHub: all commits, on every single fork, are actually in the original repo. The forks are "virtual" repos, so to speak; they are not distinct repos as is often assumed.
! This commit does not belong to any branch on this repository, and may belong to a fork outside of the repository.
...
It would also imply all forks disappear when the original repository is deleted.
...
This seems like an awful lot of effort for them to go through beyond the basic Git functionality just to save some duplication.
If I were them I'd just implement some object-level deduplication agnostic of any repository boundaries.
and this is the latest commit on the jkc-sw repo:
https://github.com/jkc-sw/python-ivi/commit/5f8356ebd39041673b5546d77bc202295550fbe1
Now replace the "jkc-sw" in the url with "python-ivi":
https://github.com/python-ivi/python-ivi/commit/5f8356ebd39041673b5546d77bc202295550fbe1
and GitHub will show you the same commit, but also report:
! This commit does not belong to any branch on this repository, and may belong to a fork outside of the repository.
This suggests that the jkc-sw and python-ivi repos are sharing the same physical git repo.
% git clone https://github.com/python-ivi/python-ivi.git
Cloning into 'python-ivi'...
remote: Enumerating objects: 4901, done.
remote: Total 4901 (delta 0), reused 0 (delta 0), pack-reused 4901
Receiving objects: 100% (4901/4901), 1.25 MiB | 3.84 MiB/s, done.
Resolving deltas: 100% (3985/3985), done.
% cd python-ivi
% git show 5f8356ebd39041673b5546d77bc202295550fbe1
fatal: bad object 5f8356ebd39041673b5546d77bc202295550fbe1
To sum up: write your own code and you'll have wasted a lot less time. ;D
That's always what it looks like at the beginning.
Surely writing it from scratch will be faster and lead to a better result. Sometimes that's true, but it's just as common to discover, after a few weeks of work, that reinventing the wheel is a lot more complicated than you thought, and that the thing you built has ended up as complicated as the thing you were trying to avoid.
PS C:\Temp> git clone https://github.com/python-ivi/python-ivi.git
Cloning into 'python-ivi'...
remote: Enumerating objects: 4901, done.
remote: Total 4901 (delta 0), reused 0 (delta 0), pack-reused 4901
Receiving objects: 100% (4901/4901), 1.25 MiB | 1.65 MiB/s, done.
Resolving deltas: 100% (3985/3985), done.
PS C:\Temp> cd .\python-ivi\
PS C:\Temp\python-ivi> git show 5f8356
fatal: ambiguous argument '5f8356': unknown revision or path not in the working tree.
Use '--' to separate paths from revisions, like this:
'git <command> [<revision>...] -- [<file>...]'
PS C:\Temp\python-ivi> git fetch https://github.com/jkc-sw/python-ivi.git
remote: Enumerating objects: 20, done.
remote: Counting objects: 100% (12/12), done.
remote: Total 20 (delta 12), reused 12 (delta 12), pack-reused 8
Unpacking objects: 100% (20/20), 4.52 KiB | 53.00 KiB/s, done.
From https://github.com/jkc-sw/python-ivi
* branch HEAD -> FETCH_HEAD
PS C:\Temp\python-ivi> git show 5f8356
commit 5f8356ebd39041673b5546d77bc202295550fbe1
Author: jkctaiwan <guanru0919@yahoo.com>
Date: Fri Mar 29 14:39:16 2019 -0700
explicit on verb and header
diff --git a/ivi/tektronix/tektronixMDO4104C.py b/ivi/tektronix/tektronixMDO4104C.py
index 3ba6fbd..dcec4e3 100644
--- a/ivi/tektronix/tektronixMDO4104C.py
+++ b/ivi/tektronix/tektronixMDO4104C.py
@@ -70,10 +70,10 @@ class tektronixMDO4104C(tektronixMDO4000, tektronixMDOAFG):
self._write(":data:width 2")
self._write(":data:start 1")
self._write(":data:stop 1e10")
- # eanble verbosity
+ self._write(":VERBose ON")^M
self._write(":HEADer ON")
...rest of commit...
PS C:\Temp\python-ivi>
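Once a fork has been fetched like that, you can also check whether it actually adds any Rigol drivers before spending more time on it. A small sketch; it assumes the drivers would live under ivi/rigol/ in the tree (upstream keeps the Tektronix ones under ivi/tektronix/, so that layout is a guess) and that FETCH_HEAD still points at the fork you just fetched:

import subprocess

# list any Rigol driver files present in the fetched fork's head commit
out = subprocess.run(
    ["git", "ls-tree", "-r", "--name-only", "FETCH_HEAD", "--", "ivi/rigol"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()
print("\n".join(out) if out else "no ivi/rigol drivers in this fork")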
You can also talk to the scope's raw SCPI socket (port 5555 on Rigol instruments) directly, for example with netcat:
aaa@bbb:~$ nc 192.168.1.3 5555
# then from inside nc, type
*IDN?
# and if all OK, your oscilloscope will answer with something like
RIGOL TECHNOLOGIES,DS1104Z,DS1ZA123456789,00.04.05.SP2
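The same check from Python, using only the standard library; the IP address, the port, and the reply string are just the ones from the nc example above:

import socket

with socket.create_connection(("192.168.1.3", 5555), timeout=5) as s:
    s.sendall(b"*IDN?\n")                 # SCPI commands are newline-terminated
    print(s.recv(4096).decode().strip())  # e.g. RIGOL TECHNOLOGIES,DS1104Z,...,00.04.05.SP2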