
nodejs client can't connect to two nodes with different private ip addresses in different dcs

Hi Guys -

I recently ran into a problem (for the second time) where my nodejs app refuses to connect to one node in my C* cluster. In both cases, the node that was not receiving any client connections had the same private IP as another node in the cluster, but in a different datacenter. That prompted me to poke around the client code a bit, and I think I found the problem:

Since `endpoint` is the node's `rpc_address`, if I'm reading this right, the client will silently ignore any other node that happens to have the same private IP.
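To make the suspected failure mode concrete, here is a minimal sketch of what a host map keyed only on `rpc_address` would do; the names (`hosts`, `addHost`, the IPs and DC names) are hypothetical illustrations, not the driver's actual code:

```javascript
// Hypothetical sketch: a host registry keyed solely on rpc_address.
const hosts = new Map();

function addHost(rpcAddress, datacenter) {
  // Keyed on the private IP alone: a second node with the same
  // rpc_address in another DC is silently dropped.
  if (hosts.has(rpcAddress)) return;
  hosts.set(rpcAddress, { rpcAddress, datacenter });
}

addHost('10.0.0.5', 'dc1');
addHost('10.0.0.5', 'dc2'); // same private IP, different DC: ignored

console.log(hosts.size); // 1 — the dc2 node never receives connections
```

If this matches the driver's behavior, the second node is invisible to the client even though it is a healthy, distinct member of the cluster.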

The first time I had this problem, I simply removed the node from the cluster and added a new one, with a different private ip. Now that I suspect I have found the problem, I'm wondering if there is a simpler solution.

I realize this is specific to the nodejs client, but I thought I'd see if anyone else here has run into this. It would be great if I could get the nodejs client to ignore nodes in the remote datacenters. I've already tried adding this to the client config, but it doesn't resolve the problem:
pooling: {
  coreConnectionsPerHost: {
    [distance.local]: 2,
    [distance.remote]: 0
  }
}

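For what it's worth, a sketch of a full client config that combines the pooling settings above with a DC-aware load balancing policy, assuming the DataStax `cassandra-driver` package; the contact point and the DC name `'dc1'` are placeholders, and the exact option names can vary between driver versions, so check against your installed version:

```javascript
const cassandra = require('cassandra-driver');
const { DCAwareRoundRobinPolicy } = cassandra.policies.loadBalancing;

const client = new cassandra.Client({
  contactPoints: ['10.0.0.1'],   // placeholder: a node in the local DC
  localDataCenter: 'dc1',        // placeholder: your local DC name
  policies: {
    // Treat every host outside the local DC as remote so the pooling
    // settings below stop the client from opening connections to it.
    loadBalancing: new DCAwareRoundRobinPolicy('dc1')
  },
  pooling: {
    coreConnectionsPerHost: {
      [cassandra.types.distance.local]: 2,
      [cassandra.types.distance.remote]: 0
    }
  }
});
```

Note this only keeps the client away from remote DCs; it would not fix the underlying issue of two nodes sharing an `rpc_address`.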
Any suggestions?

- Mike