Running reviews on Open Exchange
For several weeks now I have been writing reviews on OEX.
I'd like to explain the criteria I apply to arrive at my ratings.
Of course, every reviewer is an independent person with their own criteria and their own opinion.
And that's good and important! As Winston Churchill once said:
"If 2 people always have the same opinion, then 1 of them is superfluous"
- There are packages that I skip and don't review:
  - Pure product announcements
  - Packages whose requirements exceed my resources and capacities (e.g. an external SMS service, distributed HW, cloud-based installations, ...)
  - Packages in private repositories that can't be downloaded
  - Packages where I simply don't understand what they are meant to achieve
- I check the Description on OEX and/or the README.md on GitHub
  - Is there an explanation of the goal that I can understand?
  - Is there a description of how to install the package?
  - Is there a description of how to test/verify the package?
  All these points can be covered explicitly or by a link to an article on Developer Community. Only an empty description is not accepted.
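A minimal README.md that would satisfy these three checks might look like this sketch (the package name and the commands shown are placeholders, not taken from any real package):

```markdown
# my-package

## What it does
One or two sentences explaining the goal of the package.

## Installation
Run `docker-compose up -d` in the repo root
(or link to a Developer Community article with the details).

## How to test it
The steps or commands that verify the package works as intended.
```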
- Is a Docker container prepared?
For a single class, cleanup after testing might be easy and acceptable. But if a more complex setup is required (e.g. Interoperability, Node.js, ...), only a container or multiple containers can guarantee that all components are uninstalled and your testing environment stays clean.
  - "docker-compose build" is the first check and often the first disappointment
  - "docker-compose up -d" is the next test
  - "docker-compose logs" often reveals startup issues
  If I can identify the problem and find an acceptable workaround, I fork the repo, fix it, and open a Pull Request.
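The three checks can be sketched as a small shell script. It assumes the package repo ships a docker-compose.yml in its root, and it is guarded so it safely skips where Docker Compose or the compose file is missing:

```shell
#!/bin/sh
# First-contact container check for an OEX package under review.
# Run it from the root of the cloned repository.
if command -v docker-compose >/dev/null 2>&1 && [ -f docker-compose.yml ]; then
    docker-compose build    # first check - and often the first disappointment
    docker-compose up -d    # next test: do the containers actually start?
    docker-compose logs     # often reveals startup issues
else
    echo "SKIP: docker-compose or docker-compose.yml not found here"
fi
CHECK_DONE=1                # marker: the check sequence ran to the end
```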
- If there is neither a Dockerfile nor a docker-compose file visible
I use an appropriate template, install it, and start it. Then I try importing the package. This often also works for packages that are not explicitly dedicated to IRIS. As mentioned earlier, cleanup is simple: remove the Docker image, remove the cloned repo. BINGO!
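A sketch of that fallback workflow. The image tag `intersystemsdc/iris-community` and the directory name `cloned-repo` are my assumptions, and the Docker part is guarded behind RUN_DOCKER=1 so the sketch is safe to run as-is:

```shell
#!/bin/sh
# Fallback review workflow when a package ships no Dockerfile:
# start a generic IRIS template container, import the package,
# then remove every trace of the test installation.
if [ "${RUN_DOCKER:-0}" = "1" ] && command -v docker >/dev/null 2>&1; then
    docker run -d --name iris-review intersystemsdc/iris-community
    # ... import the package classes into the running instance here ...
    docker rm -f iris-review                  # remove the container
    docker rmi intersystemsdc/iris-community  # remove the image
else
    echo "SKIP: set RUN_DOCKER=1 (and have Docker installed) to run this"
fi
rm -rf cloned-repo   # remove the cloned repo - BINGO! The environment is clean.
```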
- Now I try to exercise the examples or demos
The existence of a UnitTest clearly raises the rating.
  - Are demo/test data sets available, with a guide on how to use them?
  - Is there perhaps also a video that shows the expected behavior?
  A hint for videos: please speak slowly, simply, and as precisely as possible, and make clear breaks. Most consumers are non-native speakers and may struggle with complicated linguistic constructs. The focus is on technology, not on language excellence.
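When a package ships unit tests, I try to run them inside its container. A hedged sketch: the service name `iris`, the USER namespace, and the test package `MyPkg.UnitTests` are assumptions that vary per package; `%UnitTest.Manager` is the standard IRIS test runner:

```shell
#!/bin/sh
# Helper to launch a package's unit tests inside its IRIS container.
run_unit_tests() {
    docker-compose exec iris \
        iris session iris -U USER \
        '##class(%UnitTest.Manager).RunTest("MyPkg.UnitTests")'
}
# Call run_unit_tests manually once "docker-compose up -d" has succeeded.
```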
- Take the packages from ISC Training as examples of a professional setup
I have also tried packages for the Health* products, even though speaking "medical" is not really my world. But I was surprised how many features around the base technology I learned and refreshed: Interoperability and its operation, management, and troubleshooting.