
Re: [netcdf-java] Fwd: How to use the Java Api

You are going to have to look at our S3 code in detail
and use it as a "template" to try to get similar code
for using HDFS.  I am sorry that Unidata will probably
not be able to provide any detailed assistance, as we do
not have the resources.
=Dennis Heimbigner
 Unidata
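
Until such HDFS support exists, one interim workaround is to copy the file
out of HDFS with the standard Hadoop FileSystem API and open the local copy
with netCDF-Java. A minimal, untested sketch follows; the namenode host,
hdfs:// path, and local path are placeholders:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import ucar.nc2.dataset.NetcdfDataset;

    public class HdfsCopyThenOpen {
        public static void main(String[] args) throws Exception {
            // Picks up core-site.xml / hdfs-site.xml from the classpath.
            Configuration conf = new Configuration();
            Path remote = new Path("hdfs://namenode:8020/data/example.nc"); // placeholder HDFS location
            Path local = new Path("/tmp/example.nc");                       // placeholder local copy

            // Pull the bytes out of HDFS onto the local filesystem.
            FileSystem fs = FileSystem.get(remote.toUri(), conf);
            fs.copyToLocalFile(remote, local);

            // Open the local copy through the usual netCDF-Java entry point.
            NetcdfDataset ncd = NetcdfDataset.openDataset(local.toString());
            try {
                System.out.println(ncd.getVariables());
            } finally {
                ncd.close();
            }
        }
    }

This gives up streaming access, of course; real HDFS support would likely
mean plugging a new reader into netCDF-Java's I/O layer along the lines of
the S3 code mentioned above.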

On 8/3/2016 6:17 AM, shashi kumar wrote:
> Dear Dennis,
> 
> I am interested in doing this enhancement, but after going through the
> below URLs:
> 
> NetCDF-Java supports HTTP:
> 
> http://www.unidata.ucar.edu/software/thredds/current/netcdf-java/reference/httpservices.html
> http://www.unidata.ucar.edu/software/thredds/current/netcdf-java/reference/
> 
> I understand that there is HTTP client communication happening in the
> netCDF JAR.
> 
> Could you please share your thoughts.
> 
> Regards,
> Sashi
> 
> 
> On Thu, Jul 7, 2016 at 11:11 PM, Dennis Heimbigner <dmh@xxxxxxxx> wrote:
> 
>> After some research, it appears that HDFS uses
>> its own TCP stack and does not use Apache HttpClient.
>> So, it cannot be directly used with netcdf-java.
>> I think the proper approach is to use our Amazon S3 code
>> as a template to build HDFS access. If you are interested
>> in doing this, we would welcome the effort.
>> =Dennis Heimbigner
>>  Unidata
>>
>> On 7/5/2016 10:51 AM, Sean Arms wrote:
>>> Greetings Shashi!
>>>
>>> netCDF-Java uses a wrapper around the Apache HTTP client library,
>>> called httpservices.
>>> While httpservices knows about the Apache HTTP client, it does not know
>>> about the Hadoop HTTP client, which is the cause of the error you are
>>> seeing.
>>>
>>> Dennis - do you know what it would take to support HTTP access to Hadoop
>>> (HDFS) via httpservices?
>>>
>>> Sean
>>>
>>>
>>> On Sat, Jul 2, 2016 at 1:37 AM, shashi kumar <shashi.fengshui@xxxxxxxxx>
>>> wrote:
>>>
>>>> Dear Sean,
>>>>
>>>> I am getting the below error when I use the netcdf-all JAR in my Hadoop
>>>> project for analysing the .nc file. Kindly help me in identifying the
>>>> issue.
>>>>
>>>> ucar.httpservices.HTTPException: ucar.httpservices.HTTPException: org.apache.http.conn.UnsupportedSchemeException: hdfs protocol is not supported
>>>>     at ucar.httpservices.HTTPMethod.execute(HTTPMethod.java:335)
>>>>     at ucar.nc2.dataset.NetcdfDataset.checkIfDods(NetcdfDataset.java:864)
>>>>     at ucar.nc2.dataset.NetcdfDataset.disambiguateHttp(NetcdfDataset.java:820)
>>>>     at ucar.nc2.dataset.NetcdfDataset.openOrAcquireFile(NetcdfDataset.java:706)
>>>>     at ucar.nc2.dataset.NetcdfDataset.openDataset(NetcdfDataset.java:427)
>>>>     at ucar.nc2.dataset.NetcdfDataset.acquireDataset(NetcdfDataset.java:528)
>>>>     at ucar.nc2.dt.grid.GridDataset.open(GridDataset.java:117)
>>>>     at ucar.nc2.dt.grid.GridDataset.open(GridDataset.java:103)
>>>>     at tvl.bd.climate.recordreader.MyRecordReader.initialize(MyRecordReader.java:46)
>>>>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:521)
>>>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
>>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
>>>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>>>>     at org.apache.hadoop.mapred.Child.main(Child.java:249)
>>>> Caused by: ucar.httpservices.HTTPException: org.apache.http.conn.UnsupportedSchemeException: hdfs protocol is not supported
>>>>     at ucar.httpservices.HTTPSession.execute(HTTPSession.java:1136)
>>>>     at ucar.httpservices.HTTPMethod.execute(HTTPMethod.java:326)
>>>>     ... 16 more
>>>> Caused by: org.apache.http.conn.UnsupportedSchemeException: hdfs protocol is not supported
>>>>     at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:108)
>>>>     at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)
>>>>     at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
>>>>     at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
>>>>     at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
>>>>     at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
>>>>     at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
>>>>     at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:184)
>>>>     at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:71)
>>>>     at ucar.httpservices.HTTPSession.execute(HTTPSession.java:1134)
>>>>
>>>>
>>>> Regards,
>>>>
>>>> Sashi
>>>>
>>>>
>>>
>>
> 
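
For reference, a minimal sketch (with a hypothetical hdfs:// path) of the call
pattern the stack trace above shows: the HDFS location is handed straight to
GridDataset.open, netCDF-Java treats the unrecognized scheme as a remote
HTTP/DODS URL, and Apache HttpClient then rejects it.

    import ucar.nc2.dt.grid.GridDataset;

    public class UnsupportedSchemeRepro {
        public static void main(String[] args) throws Exception {
            String location = "hdfs://namenode:8020/data/example.nc"; // placeholder HDFS path
            // In the environment above, this call is what fails with
            // org.apache.http.conn.UnsupportedSchemeException: hdfs protocol is not supported
            GridDataset gds = GridDataset.open(location);
            gds.close();
        }
    }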


