OpenSplice DDS Forum


About burnout

Profile Information

  • Gender
    Not Telling
  • Company
    Thales R&T
  1. burnout

    Shutdown hook

    Hi all,

    I am using OpenSplice Community Edition 6.4.14 and I am encountering two problems.

    The first is related to volatile data: I have a DataWriter that always writes volatile instances on a topic marked as persistent, and the Durability QoS of the DataReader is also volatile. The problem is that as soon as the class in charge of creating and running the DataReader is stopped and restarted, some historical data (which was marked as volatile) is received, whereas it should not be. I also tried different combinations of the Durability QoS, but the result did not change. Any hint?

    The second problem concerns the use of a shutdown hook: in a few words, this does not work with OpenSplice as soon as a domain connection is established (right after creating a participant). The ^C event is not caught, so I cannot release any resources on exit. Has anyone ever encountered such a problem?

    If something is not clear, do not hesitate to ask.

    Cheers, G.
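For the second problem, the standard JVM mechanism is `Runtime.addShutdownHook`; a minimal sketch is below. `CleanShutdown` and `releaseDdsResources` are names invented here for illustration: the cleanup body is a placeholder for whatever participant/entity deletion your application needs. If OpenSplice installs its own signal handlers once a participant is created, SIGINT may be intercepted before the JVM sees it, which would match the behaviour described above.

```java
public class CleanShutdown {

    // Placeholder: in a real application this would delete the
    // DomainParticipant and any other DDS entities still alive.
    static void releaseDdsResources() {
        System.out.println("releasing DDS entities");
    }

    // Register a JVM shutdown hook that runs the cleanup on normal
    // exit or on an uncaught SIGINT/SIGTERM (when the JVM receives it).
    public static Thread installHook() {
        Thread hook = new Thread(CleanShutdown::releaseDdsResources, "dds-cleanup");
        Runtime.getRuntime().addShutdownHook(hook);
        return hook;
    }
}
```

If the hook never fires after participant creation, that points at signal handling being taken over outside the JVM rather than at the hook registration itself.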
  2. Hi everyone, I recently started to use the Vortex OpenSplice version. We are developing a small application in order to play around with it, but I am encountering a weird error that I have never seen before. It is related to the durability service: persistent data is not actually written to disk, only the corresponding *_meta.xml. Also, every time I restart my application without having previously deleted the content of the StoreDirectory, I get the following errors:

     ========================================================================================
     Report      : ERROR
     Date        : Thu Nov 19 17:56:47 W. Europe Standard Time 2015
     Description : Unable to resolve persistent data version.
     Node        : 9727H12
     Process     : java.exe <9224>
     Thread      : durability 8684
     Internals   : V6.5.0p1/72b8738/b16d292/persistentStoreReadTopicXML/d_storeXML.c/3871/0/1447952207.578610300
     ========================================================================================
     Report      : ERROR
     Date        : Thu Nov 19 17:56:47 W. Europe Standard Time 2015
     Description : Unable to insert persistent data from disk for group 'topic.DDSBBItem'. Reason: '6'. Removing data for this group...
     Node        : 9727H12
     Process     : java.exe <9224>
     Thread      : durability 8684
     Internals   : V6.5.0p1/72b8738/b16d292/DurabilityService/d_storeXML.c/4907/0/1447952207.578610300
     ========================================================================================

     Here "topic" is actually the name of the partition I am using, while DDSBBItem is the topic name.

     Thanks!
     Burnout
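For reference, persistent storage is configured in the OpenSplice deployment XML file. A minimal sketch of the relevant section, based on the deployment guide's element names (the directory path below is a placeholder, not a recommendation), looks roughly like this:

```xml
<DurabilityService name="durability">
  <Persistent>
    <!-- Placeholder path: must exist and be writable by the service -->
    <StoreDirectory>/tmp/ospl-store</StoreDirectory>
    <!-- XML store mode writes one file per group plus *_meta.xml -->
    <StoreMode>XML</StoreMode>
  </Persistent>
</DurabilityService>
```

A mismatch between the store's recorded version and the running product version, or a store directory left over from a different OpenSplice release, is one plausible source of the "Unable to resolve persistent data version" report above.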
  3. burnout

    DataReader settings for first read operation

    Hi, thanks for the answer. What I want is this: when a new DataReader is created with a dedicated listener attached (one that just executes a read operation in its on_data_available method), and some samples were previously written, the DataReader should not read those samples automatically; I want to implement a dedicated function to fetch them instead. I don't know whether this is achievable through the topic's QoS policies or in some other way.

    "do you mean you want to read the historical data with a different mechanism to that of the 'live' volatile data?"

    Yes, that is exactly what I need: a function that asks for the samples (persistent or transient, it doesn't matter) written prior to the creation of the DataReader. But this is not hard to implement; I can just do a read operation once. About the instance, what do you mean? If I understood your point, I don't have a precise indication of which instance is historical and which is not... or is it possible to have that? It would solve my problem.

    I hope I have been clearer this time; if not, do not hesitate to ask for further explanation. Cheers, G.
  4. Hi all! In my application I require that a newly created DataReader with a listener attached (which, for the moment, only implements the on_data_available method) does not automatically read the samples that were previously written by a DataWriter, because I have to implement a dedicated procedure for doing that. What is the best way to accomplish this? Thanks in advance for every answer I receive. Regards, G.
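One DDS-agnostic pattern for the requirement above is to gate the listener: drop every callback until the application has drained the historical samples itself, then explicitly switch to "live" mode. The sketch below uses invented names (`GatedListener`, `onData`, `startLive` are not part of any OpenSplice API) and a `String` payload to stand in for the real sample type.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;

public class GatedListener {
    private final AtomicBoolean live = new AtomicBoolean(false);
    private final List<String> delivered = new ArrayList<>();

    // Stand-in for the body of on_data_available: samples arriving
    // before startLive() is called are deliberately ignored, so the
    // dedicated historical-read procedure can handle them instead.
    public void onData(String sample) {
        if (live.get()) {
            delivered.add(sample);
        }
    }

    // Called once the application has finished its own initial read.
    public void startLive() {
        live.set(true);
    }

    public List<String> delivered() {
        return delivered;
    }
}
```

The design choice here is to keep the reader's QoS untouched and move the "skip what was written before me" decision into application logic, which works the same regardless of the durability kind involved.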
  5. Hi all, I am building an OpenSplice application and I have created the IDL topic I need. In it, I have to use some sequences, which the pre-processor converts into arrays. By the way, at the moment I can't figure out the best solution for adding and removing elements from such a sequence. Is there an already implemented solution in IDL, or shall I create these helpers separately in my code? Thank you all in advance. Best regards.
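Since the language mapping represents an IDL sequence as a plain array on the Java side, adding or removing an element means allocating a new array of the right length. A minimal sketch of two helpers, assuming an `int[]` as the mapped sequence type (`SeqUtil`, `append`, and `removeAt` are names invented here, not part of any generated code):

```java
import java.util.Arrays;

public class SeqUtil {

    // Append one element, returning a new array one slot longer.
    public static int[] append(int[] seq, int value) {
        int[] out = Arrays.copyOf(seq, seq.length + 1);
        out[seq.length] = value;
        return out;
    }

    // Remove the element at `index`, returning a new, shorter array.
    public static int[] removeAt(int[] seq, int index) {
        int[] out = new int[seq.length - 1];
        System.arraycopy(seq, 0, out, 0, index);
        System.arraycopy(seq, index + 1, out, index, seq.length - index - 1);
        return out;
    }
}
```

An alternative is to keep the working data in an `ArrayList` inside the application and only convert to the mapped array when filling the sample for a write, which avoids repeated reallocation.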