Re: Running Calcite integration tests in docker
Regarding Geode, initially I tried to ingest the test data via
Geode's REST/JSON endpoint but bumped into this bug:
https://issues.apache.org/jira/browse/GEODE-3971 (still unresolved).
As a consequence I had to write a custom ingestion tool using the Geode
Java API. But since I had to compile and run this tool anyway, it makes
little sense to deploy and maintain the Geode cluster via scripts (or a
Docker image). Instead it is simpler to embed the entire Geode cluster
into the same (standalone) application. Because this is a Spring Boot
app, if we deploy the pre-built jar to a public Maven repo we can easily
create a Docker image that runs it in one line (e.g. java -jar ./....)
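The one-line Docker idea could be sketched as a minimal Dockerfile; the base image and jar name below are assumptions for illustration, not the actual published artifact:

```dockerfile
# Sketch only: the jar name is a placeholder for the pre-built Spring Boot app.
FROM openjdk:8-jre-alpine
COPY calcite-test-dataset-geode.jar /app.jar
# Default Geode locator and cache-server ports
EXPOSE 10334 40404
CMD ["java", "-jar", "/app.jar"]
```

With something like that, the image is built once and the whole embedded cluster starts with a single docker run command.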
Any ideas where we can host this project and where to release it?
On 23 April 2018 at 14:12, Francis Chuang <francischuang@xxxxxxxxxx> wrote:
> Thanks, Michael!
> I noticed that I forgot the link to my fork in my original message. Here
> is my fork if someone wants to hack on it a bit more:
> On 23/04/2018 9:58 PM, Michael Mior wrote:
>> Thanks for raising this, Francis. I was hoping to find more time to spend
>> on this but unfortunately that hasn't happened.
>> 1. That's a question for Christian Tzolov. I'm not too familiar with
>> Geode.
>> 2. You are correct that the VM contains several different database servers
>> with various ports exposed. I'm not sure what the situation is with
>> HSQLDB and H2.
>> 3. Maven is definitely not strictly necessary although some of the
>> dependencies currently pull in datasets that are used for some of the DBs
>> before building the VM.
>> 4. I don't really have a strong preference either way. I'm sure someone
>> else can speak to why this was separated in the first place.
>> Michael Mior
>> 2018-04-23 7:11 GMT-04:00 Francis Chuang <francischuang@xxxxxxxxxx>:
>> There is currently an issue open for this in the calcite-test-dataset
>>> repository; however, I would like to hear more from the wider
>>> community regarding this.
>>> I have created a `switch-to-docker` branch on my fork and committed a
>>> docker-compose.yml under the docker folder, but ran into a few roadblocks
>>> and didn't have any more time to investigate.
>>> I am currently investigating using docker-compose to orchestrate and set
>>> up the containers.
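>>> As a rough sketch of what I have in mind (the image tags and port
>>> mappings here are assumptions, to be checked against what the VM
>>> currently provisions):
>>>
>>> ```yaml
>>> version: "3"
>>> services:
>>>   mysql:
>>>     image: mysql:5.7
>>>     environment:
>>>       MYSQL_ROOT_PASSWORD: root
>>>     ports:
>>>       - "3306:3306"
>>>   postgres:
>>>     image: postgres:9.6
>>>     environment:
>>>       POSTGRES_PASSWORD: postgres
>>>     ports:
>>>       - "5432:5432"
>>> ```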
>>> 1. I am not very familiar with Apache Geode. I was able to start the
>>> server and locator using the official docker image, but there does not
>>> appear to be any way to import data. In the current repository, there's
>>> Java code in `geode-standalone-cluster`. Why do/did we need to write Java
>>> code to stand up a Geode cluster? Does anyone know if there are any
>>> standalone tools (preferably something with built binaries) that we can
>>> use to directly ingest the JSON data?
>>> 2. From my reading of the integration test instructions, the
>>> calcite-test-dataset spins up a VM with databases preloaded with the data
>>> the main Calcite repository runs tests against. HSQLDB and H2 do not
>>> expose any open ports in the VM that's spun up. How does Calcite run
>>> tests against HSQLDB and H2?
>>> 3. What is the role of Maven in the calcite-test-dataset repository? I
>>> see a lot of POMs in various subfolders such as mysql, postgresql, etc.
>>> However, I am not sure what these do. If Maven is used to spin up the VM,
>>> perhaps we could remove the dependency on it and just run a `docker-compose
>>> up` to start the network of containers.
>>> 4. Is there any interest in bringing the contents of calcite-test-dataset
>>> directly into the Calcite repo? The repo zips up to 1.5MB, so it might not
>>> bring too much bloat to the Calcite repo.
>>>  https://github.com/vlsi/calcite-test-dataset/issues/8
>>>  https://calcite.apache.org/docs/howto.html#running-integration-tests
Christian Tzolov <http://www.linkedin.com/in/tzolov> | Principal Software
Engineer | Spring.io <https://spring.io/> | Pivotal <http://pivotal.io/> |