Dataset Viewer
Auto-converted to Parquet
id
sequence
project
string
origin_file
sequence
test_list
sequence
prob_info
list
type
sequence
node
sequence
language
string
toolfunc_count
int64
func_count
int64
pytest_info
dict
[ "cloudnetpy.cloudnetpy.utils.cumsumr", "cloudnetpy.cloudnetpy.categorize.atmos_utils.calc_adiabatic_lwc" ]
cloudnetpy
[ "cloudnetpy/utils.py", "cloudnetpy/categorize/atmos_utils.py" ]
[ "tests/unit/test_atmos_utils.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 532, "func_end_lineno": 549, "func_code": "def cumsumr(array: np.ndarray, axis: int = 0) -> np.ndarray:\n \"\"\"Finds cumulative sum that resets on 0.\n\n Args:\n array: Input array.\n axis: Axis where the sum is calculated. Default is 0.\n\n Returns:\n Cumulative sum, restarted at 0.\n\n Examples:\n >>> x = np.array([0, 0, 1, 1, 0, 0, 0, 1, 1, 1])\n >>> cumsumr(x)\n [0, 0, 1, 2, 0, 0, 0, 1, 2, 3]\n\n \"\"\"\n cums = array.cumsum(axis=axis)\n return cums - np.maximum.accumulate(cums * (array == 0), axis=axis)" }, { "class_start_lineno": 1, "class_end_lineno": 357, "func_start_lineno": 302, "func_end_lineno": 318, "func_code": "def calc_adiabatic_lwc(lwc_dz: np.ndarray, height: np.ndarray) -> np.ndarray:\n \"\"\"Calculates adiabatic liquid water content (kg m-3).\n\n Args:\n lwc_dz: Liquid water content change rate (kg m-3 m-1) calculated at the\n base of each cloud and filled to that cloud.\n height: Height vector (m).\n\n Returns:\n Liquid water content (kg m-3).\n\n \"\"\"\n is_cloud = lwc_dz != 0\n cloud_indices = utils.cumsumr(is_cloud, axis=1)\n dz = utils.path_lengths_from_ground(height) * np.ones_like(lwc_dz)\n dz[cloud_indices < 1] = 0\n return utils.cumsumr(dz, axis=1) * lwc_dz" } ]
[ "function_empty" ]
[ "cloudnetpy.utils.cumsumr", "cloudnetpy.categorize.atmos_utils.calc_adiabatic_lwc" ]
Python
2
2
{ "total_num": 5, "base_passed_num": 4 }
[ "cloudnetpy.cloudnetpy.utils.binvec", "cloudnetpy.cloudnetpy.utils.rebin_1d", "cloudnetpy.cloudnetpy.utils.rebin_2d", "cloudnetpy.cloudnetpy.cloudnetarray.CloudnetArray::rebin_data" ]
cloudnetpy
[ "cloudnetpy/utils.py", "cloudnetpy/utils.py", "cloudnetpy/utils.py", "cloudnetpy/cloudnetarray.py" ]
[ "tests/unit/test_categorize.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 124, "func_end_lineno": 140, "func_code": "def binvec(x: np.ndarray | list) -> np.ndarray:\n \"\"\"Converts 1-D center points to bins with even spacing.\n\n Args:\n x: 1-D array of N real values.\n\n Returns:\n ndarray: N + 1 edge values.\n\n Examples:\n >>> binvec([1, 2, 3])\n [0.5, 1.5, 2.5, 3.5]\n\n \"\"\"\n edge1 = x[0] - (x[1] - x[0]) / 2\n edge2 = x[-1] + (x[-1] - x[-2]) / 2\n return np.linspace(edge1, edge2, len(x) + 1)" }, { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 195, "func_end_lineno": 231, "func_code": "def rebin_1d(\n x_in: np.ndarray,\n array: np.ndarray | ma.MaskedArray,\n x_new: np.ndarray,\n statistic: str = \"mean\",\n *,\n mask_zeros: bool = True,\n) -> ma.MaskedArray:\n \"\"\"Rebins 1D array.\n\n Args:\n x_in: 1-D array with shape (n,).\n array: 1-D input data with shape (m,).\n x_new: 1-D target vector (center points) with shape (N,).\n statistic: Statistic to be calculated. Possible statistics are 'mean', 'std'.\n Default is 'mean'.\n mask_zeros: Whether to mask 0 values in the returned array. Default is True.\n\n Returns:\n Re-binned data with shape (N,).\n\n \"\"\"\n edges = binvec(x_new)\n result = np.zeros(len(x_new))\n array_screened = ma.masked_invalid(array, copy=True) # data may contain nan-values\n mask = ~array_screened.mask\n if ma.any(array_screened[mask]):\n result, _, _ = stats.binned_statistic(\n x_in[mask],\n array_screened[mask],\n statistic=statistic,\n bins=edges,\n )\n result[~np.isfinite(result)] = 0\n if mask_zeros:\n return ma.masked_equal(result, 0)\n return ma.array(result)" }, { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 143, "func_end_lineno": 192, "func_code": "def rebin_2d(\n x_in: np.ndarray,\n array: ma.MaskedArray,\n x_new: np.ndarray,\n statistic: Literal[\"mean\", \"std\"] = \"mean\",\n n_min: int = 1,\n *,\n mask_zeros: bool = True,\n) -> tuple[ma.MaskedArray, list]:\n \"\"\"Rebins 2-D data in one dimension.\n\n Args:\n x_in: 1-D array with shape (n,).\n array: 2-D input data with shape (n, m).\n x_new: 1-D target vector (center points) with shape (N,).\n statistic: Statistic to be calculated. Possible statistics are 'mean', 'std'.\n Default is 'mean'.\n n_min: Minimum number of points to have good statistics in a bin. Default is 1.\n mask_zeros: Whether to mask 0 values in the returned array. Default is True.\n\n Returns:\n tuple: Rebinned data with shape (N, m) and indices of bins without enough data.\n \"\"\"\n edges = binvec(x_new)\n result = np.zeros((len(x_new), array.shape[1]))\n array_screened = ma.masked_invalid(array, copy=True) # data may contain nan-values\n for ind, values in enumerate(array_screened.T):\n mask = ~values.mask\n if ma.any(values[mask]):\n result[:, ind], _, _ = stats.binned_statistic(\n x_in[mask],\n values[mask],\n statistic=statistic,\n bins=edges,\n )\n result[~np.isfinite(result)] = 0\n if mask_zeros is True:\n masked_result = ma.masked_equal(result, 0)\n else:\n masked_result = ma.array(result)\n\n # Fill bins with not enough profiles\n x_hist, _ = np.histogram(x_in, bins=edges)\n empty_mask = x_hist < n_min\n masked_result[empty_mask, :] = ma.masked\n empty_indices = list(np.nonzero(empty_mask)[0])\n if len(empty_indices) > 0:\n logging.debug(\"No data in %s bins\", len(empty_indices))\n\n return masked_result, empty_indices" }, { "class_start_lineno": 14, "class_end_lineno": 211, "func_start_lineno": 61, "func_end_lineno": 84, "func_code": " def rebin_data(\n self, time: np.ndarray, time_new: np.ndarray, *, mask_zeros: bool = True\n ) -> list:\n \"\"\"Rebins `data` in time.\n\n Args:\n time: 1D time array.\n time_new: 1D new time array.\n mask_zeros: Whether to mask 0 values in the returned array. Default is True.\n\n Returns:\n Time indices without data.\n\n \"\"\"\n if self.data.ndim == 1:\n self.data = utils.rebin_1d(time, self.data, time_new, mask_zeros=mask_zeros)\n bad_indices = list(np.where(self.data == ma.masked)[0])\n else:\n if not isinstance(self.data, ma.MaskedArray):\n self.data = ma.masked_array(self.data)\n self.data, bad_indices = utils.rebin_2d(\n time, self.data, time_new, mask_zeros=mask_zeros\n )\n return bad_indices" } ]
[ "function_empty" ]
[ "cloudnetpy.utils.binvec", "cloudnetpy.utils.rebin_1d", "cloudnetpy.utils.rebin_2d", "cloudnetpy.cloudnetarray.CloudnetArray.rebin_data" ]
Python
4
4
{ "total_num": 4, "base_passed_num": 0 }
[ "cloudnetpy.cloudnetpy.utils.binvec", "cloudnetpy.cloudnetpy.utils.rebin_2d", "cloudnetpy.cloudnetpy.cloudnetarray.CloudnetArray::rebin_data" ]
cloudnetpy
[ "cloudnetpy/utils.py", "cloudnetpy/utils.py", "cloudnetpy/cloudnetarray.py" ]
[ "tests/unit/test_cloudnetarray.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 124, "func_end_lineno": 140, "func_code": "def binvec(x: np.ndarray | list) -> np.ndarray:\n \"\"\"Converts 1-D center points to bins with even spacing.\n\n Args:\n x: 1-D array of N real values.\n\n Returns:\n ndarray: N + 1 edge values.\n\n Examples:\n >>> binvec([1, 2, 3])\n [0.5, 1.5, 2.5, 3.5]\n\n \"\"\"\n edge1 = x[0] - (x[1] - x[0]) / 2\n edge2 = x[-1] + (x[-1] - x[-2]) / 2\n return np.linspace(edge1, edge2, len(x) + 1)" }, { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 143, "func_end_lineno": 192, "func_code": "def rebin_2d(\n x_in: np.ndarray,\n array: ma.MaskedArray,\n x_new: np.ndarray,\n statistic: Literal[\"mean\", \"std\"] = \"mean\",\n n_min: int = 1,\n *,\n mask_zeros: bool = True,\n) -> tuple[ma.MaskedArray, list]:\n \"\"\"Rebins 2-D data in one dimension.\n\n Args:\n x_in: 1-D array with shape (n,).\n array: 2-D input data with shape (n, m).\n x_new: 1-D target vector (center points) with shape (N,).\n statistic: Statistic to be calculated. Possible statistics are 'mean', 'std'.\n Default is 'mean'.\n n_min: Minimum number of points to have good statistics in a bin. Default is 1.\n mask_zeros: Whether to mask 0 values in the returned array. Default is True.\n\n Returns:\n tuple: Rebinned data with shape (N, m) and indices of bins without enough data.\n \"\"\"\n edges = binvec(x_new)\n result = np.zeros((len(x_new), array.shape[1]))\n array_screened = ma.masked_invalid(array, copy=True) # data may contain nan-values\n for ind, values in enumerate(array_screened.T):\n mask = ~values.mask\n if ma.any(values[mask]):\n result[:, ind], _, _ = stats.binned_statistic(\n x_in[mask],\n values[mask],\n statistic=statistic,\n bins=edges,\n )\n result[~np.isfinite(result)] = 0\n if mask_zeros is True:\n masked_result = ma.masked_equal(result, 0)\n else:\n masked_result = ma.array(result)\n\n # Fill bins with not enough profiles\n x_hist, _ = np.histogram(x_in, bins=edges)\n empty_mask = x_hist < n_min\n masked_result[empty_mask, :] = ma.masked\n empty_indices = list(np.nonzero(empty_mask)[0])\n if len(empty_indices) > 0:\n logging.debug(\"No data in %s bins\", len(empty_indices))\n\n return masked_result, empty_indices" }, { "class_start_lineno": 14, "class_end_lineno": 211, "func_start_lineno": 61, "func_end_lineno": 84, "func_code": " def rebin_data(\n self, time: np.ndarray, time_new: np.ndarray, *, mask_zeros: bool = True\n ) -> list:\n \"\"\"Rebins `data` in time.\n\n Args:\n time: 1D time array.\n time_new: 1D new time array.\n mask_zeros: Whether to mask 0 values in the returned array. Default is True.\n\n Returns:\n Time indices without data.\n\n \"\"\"\n if self.data.ndim == 1:\n self.data = utils.rebin_1d(time, self.data, time_new, mask_zeros=mask_zeros)\n bad_indices = list(np.where(self.data == ma.masked)[0])\n else:\n if not isinstance(self.data, ma.MaskedArray):\n self.data = ma.masked_array(self.data)\n self.data, bad_indices = utils.rebin_2d(\n time, self.data, time_new, mask_zeros=mask_zeros\n )\n return bad_indices" } ]
[ "function_empty" ]
[ "cloudnetpy.utils.binvec", "cloudnetpy.utils.rebin_2d", "cloudnetpy.cloudnetarray.CloudnetArray.rebin_data" ]
Python
3
3
{ "total_num": 17, "base_passed_num": 15 }
[ "cloudnetpy.cloudnetpy.concat_lib._Concat::_write_initial_data", "cloudnetpy.cloudnetpy.concat_lib._Concat::concat_data" ]
cloudnetpy
[ "cloudnetpy/concat_lib.py", "cloudnetpy/concat_lib.py" ]
[ "tests/unit/test_concat_lib.py", "tests/unit/test_copernicus.py", "tests/unit/test_galileo.py", "tests/unit/test_mira.py" ]
[ { "class_start_lineno": 122, "class_end_lineno": 253, "func_start_lineno": 173, "func_end_lineno": 202, "func_code": " def _write_initial_data(self, variables: list | None, ignore: list | None) -> None:\n for key in self.first_file.variables:\n if (\n variables is not None\n and key not in variables\n and key not in self.common_variables\n and key != self.concat_dimension\n ):\n continue\n if ignore and key in ignore:\n continue\n\n auto_scale = False\n self.first_file[key].set_auto_scale(auto_scale)\n array = self.first_file[key][:]\n dimensions = self.first_file[key].dimensions\n fill_value = getattr(self.first_file[key], \"_FillValue\", None)\n var = self.concatenated_file.createVariable(\n key,\n array.dtype,\n dimensions,\n zlib=True,\n complevel=3,\n shuffle=False,\n fill_value=fill_value,\n )\n auto_scale = False\n var.set_auto_scale(auto_scale)\n var[:] = array\n _copy_attributes(self.first_file[key], var)" }, { "class_start_lineno": 122, "class_end_lineno": 253, "func_start_lineno": 151, "func_end_lineno": 171, "func_code": " def concat_data(\n self,\n variables: list | None,\n ignore: list | None,\n allow_vary: list | None,\n ) -> list:\n \"\"\"Concatenates data arrays.\"\"\"\n self._write_initial_data(variables, ignore)\n output = [self.first_filename]\n if len(self.filenames) > 1:\n for filename in self.filenames[1:]:\n try:\n self._append_data(filename, allow_vary)\n except RuntimeError as e:\n if \"NetCDF: HDF error\" in str(e):\n msg = f\"Caught a NetCDF HDF error. Skipping file '{filename}'.\"\n logging.exception(msg)\n continue\n raise\n output.append(filename)\n return output" } ]
[ "Development" ]
[ "cloudnetpy.concat_lib._Concat._write_initial_data", "cloudnetpy.concat_lib._Concat.concat_data" ]
Python
0
2
{ "total_num": 71, "base_passed_num": 32 }
[ "cloudnetpy.cloudnetpy.datasource.DataSource::getvar", "cloudnetpy.cloudnetpy.datasource.DataSource::_init_time" ]
cloudnetpy
[ "cloudnetpy/datasource.py", "cloudnetpy/datasource.py" ]
[ "tests/unit/test_datasource.py" ]
[ { "class_start_lineno": 16, "class_end_lineno": 236, "func_start_lineno": 60, "func_end_lineno": 80, "func_code": " def getvar(self, *args) -> np.ndarray:\n \"\"\"Returns data array from the source file variables.\n\n Returns just the data (and no attributes) from the original\n variables dictionary, fetched from the input netCDF file.\n\n Args:\n *args: possible names of the variable. The first match is returned.\n\n Returns:\n ndarray: The actual data.\n\n Raises:\n RuntimeError: The variable is not found.\n\n \"\"\"\n for arg in args:\n if arg in self.dataset.variables:\n return self.dataset.variables[arg][:]\n msg = f\"Missing variable {args[0]} in the input file.\"\n raise RuntimeError(msg)" }, { "class_start_lineno": 16, "class_end_lineno": 236, "func_start_lineno": 152, "func_end_lineno": 160, "func_code": " def _init_time(self) -> np.ndarray:\n time = self.getvar(\"time\")\n if len(time) == 0:\n msg = \"Empty time vector\"\n raise ValidTimeStampError(msg)\n if max(time) > 25:\n logging.debug(\"Assuming time as seconds, converting to fraction hour\")\n time = utils.seconds2hours(time)\n return time" } ]
[ "function_empty", "Development" ]
[ "cloudnetpy.datasource.DataSource.getvar", "cloudnetpy.datasource.DataSource._init_time" ]
Python
1
2
{ "total_num": 9, "base_passed_num": 6 }
[ "cloudnetpy.cloudnetpy.instruments.disdrometer.parsivel._read_fmi", "cloudnetpy.cloudnetpy.instruments.disdrometer.parsivel.parsivel2nc" ]
cloudnetpy
[ "cloudnetpy/instruments/disdrometer/parsivel.py", "cloudnetpy/instruments/disdrometer/parsivel.py" ]
[ "tests/unit/test_disdrometer.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 713, "func_start_lineno": 618, "func_end_lineno": 657, "func_code": "def _read_fmi(content: str):\n r\"\"\"Read format used by Finnish Meteorological Institute and University of\n Helsinki.\n\n Format consists of sequence of the following:\n - \"[YYYY-MM-DD HH:MM:SS\\n\"\n - output of \"CS/PA\" command without non-printable characters at the end\n - \"]\\n\"\n \"\"\"\n output: dict[str, list] = {\"_datetime\": []}\n for m in re.finditer(\n r\"\\[(?P<year>\\d+)-(?P<month>\\d+)-(?P<day>\\d+) \"\n r\"(?P<hour>\\d+):(?P<minute>\\d+):(?P<second>\\d+)\"\n r\"(?P<output>[^\\]]*)\\]\",\n content,\n ):\n try:\n record = _read_typ_op4a(m[\"output\"].splitlines())\n except ValueError:\n continue\n\n for key, value in record.items():\n if key not in output:\n output[key] = [None] * len(output[\"_datetime\"])\n output[key].append(value)\n for key in output:\n if key not in record and key != \"_datetime\":\n output[key].append(None)\n\n output[\"_datetime\"].append(\n datetime.datetime(\n int(m[\"year\"]),\n int(m[\"month\"]),\n int(m[\"day\"]),\n int(m[\"hour\"]),\n int(m[\"minute\"]),\n int(m[\"second\"]),\n )\n )\n return output" }, { "class_start_lineno": 1, "class_end_lineno": 713, "func_start_lineno": 23, "func_end_lineno": 77, "func_code": "def parsivel2nc(\n disdrometer_file: str | PathLike | Iterable[str | PathLike],\n output_file: str,\n site_meta: dict,\n uuid: str | None = None,\n date: str | datetime.date | None = None,\n telegram: Sequence[int | None] | None = None,\n timestamps: Sequence[datetime.datetime] | None = None,\n) -> str:\n \"\"\"Converts OTT Parsivel-2 disdrometer data into Cloudnet Level 1b netCDF\n file.\n\n Args:\n disdrometer_file: Filename of disdrometer file or list of filenames.\n output_file: Output filename.\n site_meta: Dictionary containing information about the site. Required key\n is `name`.\n uuid: Set specific UUID for the file.\n date: Expected date of the measurements as YYYY-MM-DD.\n telegram: List of measured value numbers as specified in section 11.2 of\n the instrument's operating instructions. Unknown values are indicated\n with None. Telegram is required if the input file doesn't contain a\n header.\n timestamps: Specify list of timestamps if they are missing in the input file.\n\n Returns:\n UUID of the generated file.\n\n Raises:\n DisdrometerDataError: Timestamps do not match the expected date, or unable\n to read the disdrometer file.\n\n Examples:\n >>> from cloudnetpy.instruments import parsivel2nc\n >>> site_meta = {'name': 'Lindenberg', 'altitude': 104, 'latitude': 52.2,\n 'longitude': 14.1}\n >>> uuid = parsivel2nc('parsivel.log', 'parsivel.nc', site_meta)\n\n \"\"\"\n if isinstance(date, str):\n date = datetime.date.fromisoformat(date)\n if isinstance(disdrometer_file, str | PathLike):\n disdrometer_file = [disdrometer_file]\n disdrometer = Parsivel(disdrometer_file, site_meta, telegram, date, timestamps)\n disdrometer.sort_timestamps()\n disdrometer.remove_duplicate_timestamps()\n disdrometer.mask_invalid_values()\n if len(disdrometer.data[\"time\"].data) < 2:\n msg = \"Too few data points\"\n raise DisdrometerDataError(msg)\n disdrometer.convert_units()\n disdrometer.add_meta()\n attributes = output.add_time_attribute(ATTRIBUTES, disdrometer.date)\n output.update_attributes(disdrometer.data, attributes)\n return output.save_level1b(disdrometer, output_file, uuid)" } ]
[ "function_empty" ]
[ "cloudnetpy.instruments.disdrometer.parsivel._read_fmi", "cloudnetpy.instruments.disdrometer.parsivel.parsivel2nc" ]
Python
2
2
{ "total_num": 54, "base_passed_num": 0 }
[ "cloudnetpy.cloudnetpy.utils.l2norm_weighted", "cloudnetpy.cloudnetpy.products.drizzle_error._calc_error" ]
cloudnetpy
[ "cloudnetpy/utils.py", "cloudnetpy/products/drizzle_error.py" ]
[ "tests/unit/test_drizzle.py", "tests/unit/test_drizzle_error.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 504, "func_end_lineno": 529, "func_code": "def l2norm_weighted(\n values: tuple,\n overall_scale: float,\n term_weights: tuple,\n) -> ma.MaskedArray:\n \"\"\"Calculates scaled and weighted Euclidean distance.\n\n Calculated distance is of form: scale * sqrt((a1*a)**2 + (b1*b)**2 + ...)\n where a, b, ... are terms to be summed and a1, a2, ... are optional weights\n for the terms.\n\n Args:\n values: Tuple containing the values.\n overall_scale: Scale factor for the calculated Euclidean distance.\n term_weights: Weights for the terms. Must be single float or a list of numbers\n (one per term).\n\n Returns:\n Scaled and weighted Euclidean distance.\n\n TODO: Use masked arrays instead of tuples.\n\n \"\"\"\n generic_values = ma.array(values, dtype=object)\n weighted_values = ma.multiply(generic_values, term_weights)\n return overall_scale * l2norm(*weighted_values)" }, { "class_start_lineno": 1, "class_end_lineno": 188, "func_start_lineno": 140, "func_end_lineno": 153, "func_code": "def _calc_error(\n scale: float,\n weights: tuple,\n error_input: tuple,\n *,\n add_mu: bool = False,\n add_mu_small: bool = False,\n) -> ma.MaskedArray:\n error = utils.l2norm_weighted(error_input, scale, weights)\n if add_mu is True:\n error = utils.l2norm(error, MU_ERROR)\n if add_mu_small is True:\n error = utils.l2norm(error, MU_ERROR_SMALL)\n return error" } ]
[ "function_empty", "Development" ]
[ "cloudnetpy.utils.l2norm_weighted", "cloudnetpy.products.drizzle_error._calc_error" ]
Python
1
2
{ "total_num": 103, "base_passed_num": 70 }
[ "cloudnetpy.cloudnetpy.categorize.droplet.interpolate_lwp", "cloudnetpy.cloudnetpy.categorize.droplet.find_liquid" ]
cloudnetpy
[ "cloudnetpy/categorize/droplet.py", "cloudnetpy/categorize/droplet.py" ]
[ "tests/unit/test_droplet.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 245, "func_start_lineno": 225, "func_end_lineno": 238, "func_code": "def interpolate_lwp(obs: ClassData) -> np.ndarray:\n \"\"\"Linear interpolation of liquid water path to fill masked values.\n\n Args:\n obs: The :class:`ClassData` instance.\n\n Returns:\n Liquid water path where the masked values are filled by interpolation.\n\n \"\"\"\n if obs.lwp.all() is ma.masked:\n return np.zeros(obs.time.shape)\n ind = ma.where(obs.lwp)\n return np.interp(obs.time, obs.time[ind], obs.lwp[ind])" }, { "class_start_lineno": 1, "class_end_lineno": 245, "func_start_lineno": 52, "func_end_lineno": 121, "func_code": "def find_liquid(\n obs: ClassData,\n peak_amp: float = 1e-6,\n max_width: float = 300,\n min_points: int = 3,\n min_top_der: float = 1e-7,\n min_lwp: float = 0,\n min_alt: float = 100,\n) -> np.ndarray:\n \"\"\"Estimate liquid layers from SNR-screened attenuated backscatter.\n\n Args:\n obs: The :class:`ClassData` instance.\n peak_amp: Minimum value of peak. Default is 1e-6.\n max_width: Maximum width of peak. Default is 300 (m).\n min_points: Minimum number of valid points in peak. Default is 3.\n min_top_der: Minimum derivative above peak, defined as\n (beta_peak-beta_top) / (alt_top-alt_peak). Default is 1e-7.\n min_lwp: Minimum value from linearly interpolated lwp (kg m-2)\n measured by the mwr. Default is 0.\n min_alt: Minimum altitude of the peak from the ground. Default is 100 (m).\n\n Returns:\n 2-D boolean array denoting liquid layers.\n\n References:\n The method is based on Tuononen, M. et.al, 2019,\n https://acp.copernicus.org/articles/19/1985/2019/.\n\n \"\"\"\n\n def _is_proper_peak() -> bool:\n conditions = (\n npoints >= min_points,\n peak_width < max_width,\n top_der > min_top_der,\n is_positive_lwp,\n peak_alt > min_alt,\n )\n return all(conditions)\n\n lwp_int = interpolate_lwp(obs)\n beta = ma.copy(obs.beta)\n height = obs.height\n\n is_liquid = np.zeros(beta.shape, dtype=bool)\n base_below_peak = utils.n_elements(height, 200)\n top_above_peak = utils.n_elements(height, 150)\n difference = ma.array(np.diff(beta, axis=1))\n beta_diff = difference.filled(0)\n beta = beta.filled(0)\n peak_indices = _find_strong_peaks(beta, peak_amp)\n\n for n, peak in zip(*peak_indices, strict=True):\n lprof = beta[n, :]\n dprof = beta_diff[n, :]\n try:\n base = ind_base(dprof, peak, base_below_peak, 4)\n top = ind_top(dprof, peak, height.shape[0], top_above_peak, 4)\n except IndexError:\n continue\n npoints = np.count_nonzero(lprof[base : top + 1])\n peak_width = height[top] - height[base]\n peak_alt = height[peak] - height[0]\n top_der = (lprof[peak] - lprof[top]) / (height[top] - height[peak])\n is_positive_lwp = lwp_int[n] >= min_lwp\n if _is_proper_peak():\n is_liquid[n, base : top + 1] = True\n\n return is_liquid" } ]
[ "function_empty", "Development" ]
[ "cloudnetpy.categorize.droplet.interpolate_lwp", "cloudnetpy.categorize.droplet.find_liquid" ]
Python
1
2
{ "total_num": 18, "base_passed_num": 15 }
[ "cloudnetpy.cloudnetpy.categorize.atmos_utils.fill_clouds_with_lwc_dz", "cloudnetpy.cloudnetpy.products.lwc.Lwc::_init_lwc_adiabatic", "cloudnetpy.cloudnetpy.categorize.atmos_utils.calc_saturation_vapor_pressure", "cloudnetpy.cloudnetpy.categorize.atmos_utils.calc_mixing_ratio", "cloudnetpy.cloudnetpy.categorize.atmos_utils.calc_lwc_change_rate" ]
cloudnetpy
[ "cloudnetpy/categorize/atmos_utils.py", "cloudnetpy/products/lwc.py", "cloudnetpy/products/lwc.py", "cloudnetpy/categorize/atmos_utils.py", "cloudnetpy/categorize/atmos_utils.py", "cloudnetpy/categorize/atmos_utils.py" ]
[ "tests/unit/test_lwc.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 357, "func_start_lineno": 154, "func_end_lineno": 172, "func_code": "def fill_clouds_with_lwc_dz(\n temperature: np.ndarray, pressure: np.ndarray, is_liquid: np.ndarray\n) -> np.ndarray:\n \"\"\"Fills liquid clouds with lwc change rate at the cloud bases.\n\n Args:\n temperature: 2D temperature array (K).\n pressure: 2D pressure array (Pa).\n is_liquid: Boolean array indicating presence of liquid clouds.\n\n Returns:\n Liquid water content change rate (kg m-3 m-1), so that for each cloud the base\n value is filled for the whole cloud.\n\n \"\"\"\n lwc_dz = get_lwc_change_rate_at_bases(temperature, pressure, is_liquid)\n lwc_dz_filled = ma.zeros(lwc_dz.shape)\n lwc_dz_filled[is_liquid] = utils.ffill(lwc_dz[is_liquid])\n return lwc_dz_filled" }, { "class_start_lineno": 120, "class_end_lineno": 167, "func_start_lineno": 146, "func_end_lineno": 152, "func_code": " def _init_lwc_adiabatic(self) -> np.ndarray:\n \"\"\"Returns theoretical adiabatic lwc in liquid clouds (kg/m3).\"\"\"\n lwc_dz = atmos_utils.fill_clouds_with_lwc_dz(\n *self.lwc_source.atmosphere,\n self.is_liquid,\n )\n return atmos_utils.calc_adiabatic_lwc(lwc_dz, self.height)" }, { "class_start_lineno": 120, "class_end_lineno": 167, "func_start_lineno": 134, "func_end_lineno": 140, "func_code": " def __init__(self, lwc_source: LwcSource):\n self.lwc_source = lwc_source\n self.height = lwc_source.getvar(\"height\")\n self.is_liquid = self._get_liquid()\n self.lwc_adiabatic = self._init_lwc_adiabatic()\n self.lwc = self._adiabatic_lwc_to_lwc()\n self._mask_rain()" }, { "class_start_lineno": 1, "class_end_lineno": 357, "func_start_lineno": 245, "func_end_lineno": 266, "func_code": "def calc_saturation_vapor_pressure(temperature: np.ndarray) -> np.ndarray:\n \"\"\"Goff-Gratch formula for saturation vapor pressure over water adopted by WMO.\n\n Args:\n temperature: Temperature (K).\n\n Returns:\n Saturation vapor pressure (Pa).\n\n \"\"\"\n ratio = con.T0 / temperature\n inv_ratio = ratio**-1\n return (\n 10\n ** (\n 10.79574 * (1 - ratio)\n - 5.028 * np.log10(inv_ratio)\n + 1.50475e-4 * (1 - (10 ** (-8.2969 * (inv_ratio - 1))))\n + 0.42873e-3 * (10 ** (4.76955 * (1 - ratio)) - 1)\n + 0.78614\n )\n ) * con.HPA_TO_PA" }, { "class_start_lineno": 1, "class_end_lineno": 357, "func_start_lineno": 269, "func_end_lineno": 280, "func_code": "def calc_mixing_ratio(vapor_pressure: np.ndarray, pressure: np.ndarray) -> np.ndarray:\n \"\"\"Calculates mixing ratio from partial vapor pressure and pressure.\n\n Args:\n vapor_pressure: Partial pressure of water vapor (Pa).\n pressure: Atmospheric pressure (Pa).\n\n Returns:\n Mixing ratio (kg kg-1).\n\n \"\"\"\n return con.MW_RATIO * vapor_pressure / (pressure - vapor_pressure)" }, { "class_start_lineno": 1, "class_end_lineno": 357, "func_start_lineno": 201, "func_end_lineno": 242, "func_code": "def calc_lwc_change_rate(temperature: np.ndarray, pressure: np.ndarray) -> np.ndarray:\n \"\"\"Returns rate of change of condensable water (LWC).\n\n Calculates the theoretical adiabatic rate of increase of LWC\n with height, given the cloud base temperature and pressure.\n\n Args:\n temperature: Temperature of cloud base (K).\n pressure: Pressure of cloud base (Pa).\n\n Returns:\n dlwc/dz (kg m-3 m-1)\n\n References:\n Brenguier, 1991, https://doi.org/10.1175/1520-0469(1991)048<0264:POTCPA>2.0.CO;2\n\n \"\"\"\n svp = calc_saturation_vapor_pressure(temperature)\n svp_mixing_ratio = calc_mixing_ratio(svp, pressure)\n air_density = calc_air_density(pressure, temperature, svp_mixing_ratio)\n\n e = 0.622\n Cp = 1004 # J kg-1 K-1\n Lv = 2.45e6 # J kg-1 = Pa m3 kg-1\n qs = svp_mixing_ratio # kg kg-1\n pa = air_density # kg m-3\n es = svp # Pa\n P = pressure # Pa\n T = temperature # K\n\n # See Appendix B in Brenguier (1991) for the derivation of the following equation\n dqs_dp = (\n -(1 - (Cp * T) / (e * Lv))\n * (((Cp * T) / (e * Lv)) + ((Lv * qs * pa) / (P - es))) ** -1\n * (e * es)\n * (P - es) ** -2\n )\n\n # Using hydrostatic equation to convert dqs_dp to dqs_dz\n dqs_dz = dqs_dp * air_density * -scipy.constants.g\n\n return dqs_dz * air_density" } ]
[ "function_empty", "Development" ]
[ "cloudnetpy.categorize.atmos_utils.fill_clouds_with_lwc_dz", "cloudnetpy.products.lwc.Lwc._init_lwc_adiabatic", "cloudnetpy.products.lwc.Lwc.__init__", "cloudnetpy.categorize.atmos_utils.calc_saturation_vapor_pressure", "cloudnetpy.categorize.atmos_utils.calc_mixing_ratio", "cloudnetpy.categorize.atmos_utils.calc_lwc_change_rate" ]
Python
3
5
{ "total_num": 37, "base_passed_num": 0 }
[ "cloudnetpy.cloudnetpy.utils.rebin_1d", "cloudnetpy.cloudnetpy.cloudnetarray.CloudnetArray::rebin_data", "cloudnetpy.cloudnetpy.utils.binvec" ]
cloudnetpy
[ "cloudnetpy/utils.py", "cloudnetpy/cloudnetarray.py", "cloudnetpy/categorize/mwr.py", "cloudnetpy/utils.py" ]
[ "tests/unit/test_mwr.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 195, "func_end_lineno": 231, "func_code": "def rebin_1d(\n x_in: np.ndarray,\n array: np.ndarray | ma.MaskedArray,\n x_new: np.ndarray,\n statistic: str = \"mean\",\n *,\n mask_zeros: bool = True,\n) -> ma.MaskedArray:\n \"\"\"Rebins 1D array.\n\n Args:\n x_in: 1-D array with shape (n,).\n array: 1-D input data with shape (m,).\n x_new: 1-D target vector (center points) with shape (N,).\n statistic: Statistic to be calculated. Possible statistics are 'mean', 'std'.\n Default is 'mean'.\n mask_zeros: Whether to mask 0 values in the returned array. Default is True.\n\n Returns:\n Re-binned data with shape (N,).\n\n \"\"\"\n edges = binvec(x_new)\n result = np.zeros(len(x_new))\n array_screened = ma.masked_invalid(array, copy=True) # data may contain nan-values\n mask = ~array_screened.mask\n if ma.any(array_screened[mask]):\n result, _, _ = stats.binned_statistic(\n x_in[mask],\n array_screened[mask],\n statistic=statistic,\n bins=edges,\n )\n result[~np.isfinite(result)] = 0\n if mask_zeros:\n return ma.masked_equal(result, 0)\n return ma.array(result)" }, { "class_start_lineno": 14, "class_end_lineno": 211, "func_start_lineno": 61, "func_end_lineno": 84, "func_code": " def rebin_data(\n self, time: np.ndarray, time_new: np.ndarray, *, mask_zeros: bool = True\n ) -> list:\n \"\"\"Rebins `data` in time.\n\n Args:\n time: 1D time array.\n time_new: 1D new time array.\n mask_zeros: Whether to mask 0 values in the returned array. Default is True.\n\n Returns:\n Time indices without data.\n\n \"\"\"\n if self.data.ndim == 1:\n self.data = utils.rebin_1d(time, self.data, time_new, mask_zeros=mask_zeros)\n bad_indices = list(np.where(self.data == ma.masked)[0])\n else:\n if not isinstance(self.data, ma.MaskedArray):\n self.data = ma.masked_array(self.data)\n self.data, bad_indices = utils.rebin_2d(\n time, self.data, time_new, mask_zeros=mask_zeros\n )\n return bad_indices" }, { "class_start_lineno": 11, "class_end_lineno": 50, "func_start_lineno": 24, "func_end_lineno": 32, "func_code": " def rebin_to_grid(self, time_grid: np.ndarray) -> None:\n \"\"\"Approximates lwp and its error in a grid using mean.\n\n Args:\n time_grid: 1D target time grid.\n\n \"\"\"\n for array in self.data.values():\n array.rebin_data(self.time, time_grid)" }, { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 124, "func_end_lineno": 140, "func_code": "def binvec(x: np.ndarray | list) -> np.ndarray:\n \"\"\"Converts 1-D center points to bins with even spacing.\n\n Args:\n x: 1-D array of N real values.\n\n Returns:\n ndarray: N + 1 edge values.\n\n Examples:\n >>> binvec([1, 2, 3])\n [0.5, 1.5, 2.5, 3.5]\n\n \"\"\"\n edge1 = x[0] - (x[1] - x[0]) / 2\n edge2 = x[-1] + (x[-1] - x[-2]) / 2\n return np.linspace(edge1, edge2, len(x) + 1)" } ]
[ "function_empty" ]
[ "cloudnetpy.utils.rebin_1d", "cloudnetpy.cloudnetarray.CloudnetArray.rebin_data", "cloudnetpy.categorize.mwr.Mwr.rebin_to_grid", "cloudnetpy.utils.binvec" ]
Python
3
3
{ "total_num": 4, "base_passed_num": 3 }
[ "cloudnetpy.cloudnetpy.utils.append_data", "cloudnetpy.cloudnetpy.instruments.radiometrics.RadiometricsCombined::__init__" ]
cloudnetpy
[ "cloudnetpy/utils.py", "cloudnetpy/instruments/radiometrics.py" ]
[ "tests/unit/test_radiometrics.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 911, "func_end_lineno": 925, "func_code": "def append_data(data_in: dict, key: str, array: np.ndarray) -> dict:\n \"\"\"Appends data to a dictionary field (creates the field if not yet present).\n\n Args:\n data_in: Dictionary where data will be appended.\n key: Key of the field.\n array: Numpy array to be appended to data_in[key].\n\n \"\"\"\n data = data_in.copy()\n if key not in data:\n data[key] = array\n else:\n data[key] = ma.concatenate((data[key], array))\n return data" }, { "class_start_lineno": 227, "class_end_lineno": 289, "func_start_lineno": 233, "func_end_lineno": 246, "func_code": " def __init__(self, objs: list[Radiometrics], site_meta: dict):\n self.site_meta = site_meta\n self.data = {}\n self.date = None\n for obj in objs:\n if obj.ranges != objs[0].ranges:\n msg = \"Inconsistent range between files\"\n raise InconsistentDataError(msg)\n for key in obj.data:\n self.data = utils.append_data(self.data, key, obj.data[key])\n ranges = [float(x) for x in objs[0].ranges]\n self.data[\"range\"] = np.array(ranges) * 1000 # m => km\n self.data[\"height\"] = self.data[\"range\"] + self.site_meta[\"altitude\"]\n self.instrument = instruments.RADIOMETRICS" } ]
[ "function_empty", "Development" ]
[ "cloudnetpy.utils.append_data", "cloudnetpy.instruments.radiometrics.RadiometricsCombined.__init__" ]
Python
1
2
{ "total_num": 17, "base_passed_num": 0 }
[ "cloudnetpy.cloudnetpy.utils.binvec", "cloudnetpy.cloudnetpy.utils.rebin_2d", "cloudnetpy.cloudnetpy.utils.rebin_1d" ]
cloudnetpy
[ "cloudnetpy/utils.py", "cloudnetpy/utils.py", "cloudnetpy/utils.py" ]
[ "tests/unit/test_utils.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 124, "func_end_lineno": 140, "func_code": "def binvec(x: np.ndarray | list) -> np.ndarray:\n \"\"\"Converts 1-D center points to bins with even spacing.\n\n Args:\n x: 1-D array of N real values.\n\n Returns:\n ndarray: N + 1 edge values.\n\n Examples:\n >>> binvec([1, 2, 3])\n [0.5, 1.5, 2.5, 3.5]\n\n \"\"\"\n edge1 = x[0] - (x[1] - x[0]) / 2\n edge2 = x[-1] + (x[-1] - x[-2]) / 2\n return np.linspace(edge1, edge2, len(x) + 1)" }, { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 143, "func_end_lineno": 192, "func_code": "def rebin_2d(\n x_in: np.ndarray,\n array: ma.MaskedArray,\n x_new: np.ndarray,\n statistic: Literal[\"mean\", \"std\"] = \"mean\",\n n_min: int = 1,\n *,\n mask_zeros: bool = True,\n) -> tuple[ma.MaskedArray, list]:\n \"\"\"Rebins 2-D data in one dimension.\n\n Args:\n x_in: 1-D array with shape (n,).\n array: 2-D input data with shape (n, m).\n x_new: 1-D target vector (center points) with shape (N,).\n statistic: Statistic to be calculated. Possible statistics are 'mean', 'std'.\n Default is 'mean'.\n n_min: Minimum number of points to have good statistics in a bin. Default is 1.\n mask_zeros: Whether to mask 0 values in the returned array. Default is True.\n\n Returns:\n tuple: Rebinned data with shape (N, m) and indices of bins without enough data.\n \"\"\"\n edges = binvec(x_new)\n result = np.zeros((len(x_new), array.shape[1]))\n array_screened = ma.masked_invalid(array, copy=True) # data may contain nan-values\n for ind, values in enumerate(array_screened.T):\n mask = ~values.mask\n if ma.any(values[mask]):\n result[:, ind], _, _ = stats.binned_statistic(\n x_in[mask],\n values[mask],\n statistic=statistic,\n bins=edges,\n )\n result[~np.isfinite(result)] = 0\n if mask_zeros is True:\n masked_result = ma.masked_equal(result, 0)\n else:\n masked_result = ma.array(result)\n\n # Fill bins with not enough profiles\n x_hist, _ = np.histogram(x_in, bins=edges)\n empty_mask = x_hist < n_min\n masked_result[empty_mask, :] = ma.masked\n empty_indices = list(np.nonzero(empty_mask)[0])\n if len(empty_indices) > 0:\n logging.debug(\"No data in %s bins\", len(empty_indices))\n\n return masked_result, empty_indices" }, { "class_start_lineno": 1, "class_end_lineno": 1151, "func_start_lineno": 195, "func_end_lineno": 231, "func_code": "def rebin_1d(\n x_in: np.ndarray,\n array: np.ndarray | ma.MaskedArray,\n x_new: np.ndarray,\n statistic: str = \"mean\",\n *,\n mask_zeros: bool = True,\n) -> ma.MaskedArray:\n \"\"\"Rebins 1D array.\n\n Args:\n x_in: 1-D array with shape (n,).\n array: 1-D input data with shape (m,).\n x_new: 1-D target vector (center points) with shape (N,).\n statistic: Statistic to be calculated. Possible statistics are 'mean', 'std'.\n Default is 'mean'.\n mask_zeros: Whether to mask 0 values in the returned array. Default is True.\n\n Returns:\n Re-binned data with shape (N,).\n\n \"\"\"\n edges = binvec(x_new)\n result = np.zeros(len(x_new))\n array_screened = ma.masked_invalid(array, copy=True) # data may contain nan-values\n mask = ~array_screened.mask\n if ma.any(array_screened[mask]):\n result, _, _ = stats.binned_statistic(\n x_in[mask],\n array_screened[mask],\n statistic=statistic,\n bins=edges,\n )\n result[~np.isfinite(result)] = 0\n if mask_zeros:\n return ma.masked_equal(result, 0)\n return ma.array(result)" } ]
[ "function_empty" ]
[ "cloudnetpy.utils.binvec", "cloudnetpy.utils.rebin_2d", "cloudnetpy.utils.rebin_1d" ]
Python
3
3
{ "total_num": 160, "base_passed_num": 151 }
[ "d3rlpy.d3rlpy.models.encoders.DefaultEncoderFactory::create", "d3rlpy.d3rlpy.models.builders.create_discrete_q_function" ]
d3rlpy
[ "d3rlpy/models/encoders.py", "d3rlpy/models/builders.py" ]
[ "tests_copy/envs/test_wrappers.py", "tests_copy/models/test_builders.py" ]
[ { "class_start_lineno": 209, "class_end_lineno": 265, "func_start_lineno": 224, "func_end_lineno": 238, "func_code": " def create(self, observation_shape: Shape) -> Encoder:\n factory: Union[PixelEncoderFactory, VectorEncoderFactory]\n if len(observation_shape) == 3:\n factory = PixelEncoderFactory(\n activation=self.activation,\n use_batch_norm=self.use_batch_norm,\n dropout_rate=self.dropout_rate,\n )\n else:\n factory = VectorEncoderFactory(\n activation=self.activation,\n use_batch_norm=self.use_batch_norm,\n dropout_rate=self.dropout_rate,\n )\n return factory.create(observation_shape)" }, { "class_start_lineno": 1, "class_end_lineno": 403, "func_start_lineno": 47, "func_end_lineno": 82, "func_code": "def create_discrete_q_function(\n observation_shape: Shape,\n action_size: int,\n encoder_factory: EncoderFactory,\n q_func_factory: QFunctionFactory,\n device: str,\n enable_ddp: bool,\n n_ensembles: int = 1,\n) -> tuple[nn.ModuleList, DiscreteEnsembleQFunctionForwarder]:\n if q_func_factory.share_encoder:\n encoder = encoder_factory.create(observation_shape)\n hidden_size = compute_output_size([observation_shape], encoder)\n # normalize gradient scale by ensemble size\n for p in cast(nn.Module, encoder).parameters():\n p.register_hook(lambda grad: grad / n_ensembles)\n\n q_funcs = []\n forwarders = []\n for _ in range(n_ensembles):\n if not q_func_factory.share_encoder:\n encoder = encoder_factory.create(observation_shape)\n hidden_size = compute_output_size([observation_shape], encoder)\n q_func, forwarder = q_func_factory.create_discrete(\n encoder, hidden_size, action_size\n )\n q_func.to(device)\n if enable_ddp:\n q_func = wrap_model_by_ddp(q_func)\n forwarder.set_q_func(q_func)\n q_funcs.append(q_func)\n forwarders.append(forwarder)\n q_func_modules = nn.ModuleList(q_funcs)\n ensemble_forwarder = DiscreteEnsembleQFunctionForwarder(\n forwarders, action_size\n )\n return q_func_modules, ensemble_forwarder" } ]
[ "Development" ]
[ "d3rlpy.models.encoders.DefaultEncoderFactory.create", "d3rlpy.models.builders.create_discrete_q_function" ]
Python
0
2
{ "total_num": 42, "base_passed_num": 15 }
[ "d3rlpy.d3rlpy.dataset.transition_pickers.BasicTransitionPicker::__call__", "d3rlpy.d3rlpy.metrics.evaluators.make_batches", "d3rlpy.d3rlpy.metrics.evaluators.TDErrorEvaluator::__call__", "d3rlpy.d3rlpy.models.encoders.DefaultEncoderFactory::create", "d3rlpy.d3rlpy.models.builders.create_discrete_q_function" ]
d3rlpy
[ "d3rlpy/dataset/transition_pickers.py", "d3rlpy/metrics/evaluators.py", "d3rlpy/metrics/evaluators.py", "d3rlpy/models/encoders.py", "d3rlpy/models/builders.py" ]
[ "tests_copy/metrics/test_evaluators.py" ]
[ { "class_start_lineno": 43, "class_end_lineno": 72, "func_start_lineno": 49, "func_end_lineno": 72, "func_code": " def __call__(self, episode: EpisodeBase, index: int) -> Transition:\n _validate_index(episode, index)\n\n observation = retrieve_observation(episode.observations, index)\n is_terminal = episode.terminated and index == episode.size() - 1\n if is_terminal:\n next_observation = create_zero_observation(observation)\n next_action = np.zeros_like(episode.actions[index])\n else:\n next_observation = retrieve_observation(\n episode.observations, index + 1\n )\n next_action = episode.actions[index + 1]\n\n return Transition(\n observation=observation,\n action=episode.actions[index],\n reward=episode.rewards[index],\n next_observation=next_observation,\n next_action=next_action,\n terminal=float(is_terminal),\n interval=1,\n rewards_to_go=episode.rewards[index:],\n )" }, { "class_start_lineno": 1, "class_end_lineno": 548, "func_start_lineno": 52, "func_end_lineno": 68, "func_code": "def make_batches(\n episode: EpisodeBase,\n window_size: int,\n transition_picker: TransitionPickerProtocol,\n) -> Iterator[TransitionMiniBatch]:\n n_batches = len(episode) // window_size\n if len(episode) % window_size != 0:\n n_batches += 1\n for i in range(n_batches):\n head_index = i * window_size\n last_index = min(head_index + window_size, episode.transition_count)\n transitions = [\n transition_picker(episode, index)\n for index in range(head_index, last_index)\n ]\n batch = TransitionMiniBatch.from_transitions(transitions)\n yield batch" }, { "class_start_lineno": 71, "class_end_lineno": 121, "func_start_lineno": 93, "func_end_lineno": 121, "func_code": " def __call__(\n self,\n algo: QLearningAlgoProtocol,\n dataset: ReplayBufferBase,\n ) -> float:\n total_errors = []\n episodes = self._episodes if self._episodes else dataset.episodes\n for episode in episodes:\n for batch in make_batches(\n episode, WINDOW_SIZE, dataset.transition_picker\n ):\n # estimate values for current observations\n values = algo.predict_value(batch.observations, batch.actions)\n\n # estimate values for next observations\n next_actions = algo.predict(batch.next_observations)\n next_values = algo.predict_value(\n batch.next_observations, next_actions\n )\n\n # calculate td errors\n mask = (1.0 - batch.terminals).reshape(-1)\n rewards = np.asarray(batch.rewards).reshape(-1)\n if algo.reward_scaler:\n rewards = algo.reward_scaler.transform_numpy(rewards)\n y = rewards + algo.gamma * next_values * mask\n total_errors += ((values - y) ** 2).tolist()\n\n return float(np.mean(total_errors))" }, { "class_start_lineno": 209, "class_end_lineno": 265, "func_start_lineno": 224, "func_end_lineno": 238, "func_code": " def create(self, observation_shape: Shape) -> Encoder:\n factory: Union[PixelEncoderFactory, VectorEncoderFactory]\n if len(observation_shape) == 3:\n factory = PixelEncoderFactory(\n activation=self.activation,\n use_batch_norm=self.use_batch_norm,\n dropout_rate=self.dropout_rate,\n )\n else:\n factory = VectorEncoderFactory(\n activation=self.activation,\n use_batch_norm=self.use_batch_norm,\n dropout_rate=self.dropout_rate,\n )\n return factory.create(observation_shape)" }, { "class_start_lineno": 1, "class_end_lineno": 403, "func_start_lineno": 47, "func_end_lineno": 82, "func_code": "def create_discrete_q_function(\n observation_shape: Shape,\n action_size: int,\n encoder_factory: EncoderFactory,\n q_func_factory: QFunctionFactory,\n device: str,\n enable_ddp: bool,\n n_ensembles: int = 1,\n) -> tuple[nn.ModuleList, DiscreteEnsembleQFunctionForwarder]:\n if q_func_factory.share_encoder:\n encoder = encoder_factory.create(observation_shape)\n hidden_size = compute_output_size([observation_shape], encoder)\n # normalize gradient scale by ensemble size\n for p in cast(nn.Module, encoder).parameters():\n p.register_hook(lambda grad: grad / n_ensembles)\n\n q_funcs = []\n forwarders = []\n for _ in range(n_ensembles):\n if not q_func_factory.share_encoder:\n encoder = encoder_factory.create(observation_shape)\n hidden_size = compute_output_size([observation_shape], encoder)\n q_func, forwarder = q_func_factory.create_discrete(\n encoder, hidden_size, action_size\n )\n q_func.to(device)\n if enable_ddp:\n q_func = wrap_model_by_ddp(q_func)\n forwarder.set_q_func(q_func)\n q_funcs.append(q_func)\n forwarders.append(forwarder)\n q_func_modules = nn.ModuleList(q_funcs)\n ensemble_forwarder = DiscreteEnsembleQFunctionForwarder(\n forwarders, action_size\n )\n return q_func_modules, ensemble_forwarder" } ]
[ "Development" ]
[ "d3rlpy.dataset.transition_pickers.BasicTransitionPicker.__call__", "d3rlpy.metrics.evaluators.make_batches", "d3rlpy.metrics.evaluators.TDErrorEvaluator.__call__", "d3rlpy.models.encoders.DefaultEncoderFactory.create", "d3rlpy.models.builders.create_discrete_q_function" ]
Python
0
5
{ "total_num": 19, "base_passed_num": 0 }
[ "d3rlpy.d3rlpy.models.torch.q_functions.ensemble_q_function._gather_quantiles_by_indices", "d3rlpy.d3rlpy.models.torch.q_functions.ensemble_q_function._reduce_quantile_ensemble", "d3rlpy.d3rlpy.models.torch.q_functions.mean_q_function.DiscreteMeanQFunctionForwarder::compute_error", "d3rlpy.d3rlpy.models.torch.q_functions.iqn_q_function.DiscreteIQNQFunctionForwarder::compute_error", "d3rlpy.d3rlpy.models.torch.q_functions.ensemble_q_function.compute_ensemble_q_function_error" ]
d3rlpy
[ "d3rlpy/models/torch/q_functions/ensemble_q_function.py", "d3rlpy/models/torch/q_functions/ensemble_q_function.py", "d3rlpy/models/torch/q_functions/mean_q_function.py", "d3rlpy/models/torch/q_functions/iqn_q_function.py", "d3rlpy/models/torch/q_functions/ensemble_q_function.py", "d3rlpy/models/torch/q_functions/ensemble_q_function.py" ]
[ "tests_copy/models/torch/q_functions/test_ensemble_q_function.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 367, "func_start_lineno": 35, "func_end_lineno": 52, "func_code": "def _gather_quantiles_by_indices(\n y: torch.Tensor, indices: torch.Tensor\n) -> torch.Tensor:\n # TODO: implement this in general case\n if y.dim() == 3:\n # (N, batch, n_quantiles) -> (batch, n_quantiles)\n return y.transpose(0, 1)[torch.arange(y.shape[1]), indices]\n elif y.dim() == 4:\n # (N, batch, action, n_quantiles) -> (batch, action, N, n_quantiles)\n transposed_y = y.transpose(0, 1).transpose(1, 2)\n # (batch, action, N, n_quantiles) -> (batch * action, N, n_quantiles)\n flat_y = transposed_y.reshape(-1, y.shape[0], y.shape[3])\n head_indices = torch.arange(y.shape[1] * y.shape[2])\n # (batch * action, N, n_quantiles) -> (batch * action, n_quantiles)\n gathered_y = flat_y[head_indices, indices.view(-1)]\n # (batch * action, n_quantiles) -> (batch, action, n_quantiles)\n return gathered_y.view(y.shape[1], y.shape[2], -1)\n raise ValueError" }, { "class_start_lineno": 1, "class_end_lineno": 367, "func_start_lineno": 55, "func_end_lineno": 74, "func_code": "def _reduce_quantile_ensemble(\n y: torch.Tensor, reduction: str = \"min\", dim: int = 0, lam: float = 0.75\n) -> torch.Tensor:\n # reduction beased on expectation\n mean = y.mean(dim=-1)\n if reduction == \"min\":\n indices = mean.min(dim=dim).indices\n return _gather_quantiles_by_indices(y, indices)\n elif reduction == \"max\":\n indices = mean.max(dim=dim).indices\n return _gather_quantiles_by_indices(y, indices)\n elif reduction == \"none\":\n return y\n elif reduction == \"mix\":\n min_indices = mean.min(dim=dim).indices\n max_indices = mean.max(dim=dim).indices\n min_values = _gather_quantiles_by_indices(y, min_indices)\n max_values = _gather_quantiles_by_indices(y, max_indices)\n return lam * min_values + (1.0 - lam) * max_values\n raise ValueError" }, { "class_start_lineno": 47, "class_end_lineno": 86, "func_start_lineno": 58, "func_end_lineno": 74, "func_code": " def compute_error(\n self,\n observations: TorchObservation,\n actions: torch.Tensor,\n rewards: torch.Tensor,\n target: torch.Tensor,\n terminals: torch.Tensor,\n gamma: Union[float, torch.Tensor] = 0.99,\n reduction: str = \"mean\",\n ) -> torch.Tensor:\n one_hot = F.one_hot(actions.view(-1), num_classes=self._action_size)\n value = (self._q_func(observations).q_value * one_hot.float()).sum(\n dim=1, keepdim=True\n )\n y = rewards + gamma * target * (1 - terminals)\n loss = compute_huber_loss(value, y)\n return compute_reduce(loss, reduction)" }, { "class_start_lineno": 122, "class_end_lineno": 174, "func_start_lineno": 133, "func_end_lineno": 162, "func_code": " def compute_error(\n self,\n observations: TorchObservation,\n actions: torch.Tensor,\n rewards: torch.Tensor,\n target: torch.Tensor,\n terminals: torch.Tensor,\n gamma: Union[float, torch.Tensor] = 0.99,\n reduction: str = \"mean\",\n ) -> torch.Tensor:\n batch_size = get_batch_size(observations)\n assert target.shape == (batch_size, self._n_quantiles)\n\n # extraect quantiles corresponding to act_t\n output = self._q_func(observations)\n taus = output.taus\n all_quantiles = output.quantiles\n assert taus is not None and all_quantiles is not None\n quantiles = pick_quantile_value_by_action(all_quantiles, actions)\n\n loss = compute_quantile_loss(\n quantiles=quantiles,\n rewards=rewards,\n target=target,\n terminals=terminals,\n taus=taus,\n gamma=gamma,\n )\n\n return compute_reduce(loss, reduction)" }, { "class_start_lineno": 1, "class_end_lineno": 367, "func_start_lineno": 77, "func_end_lineno": 109, "func_code": "def compute_ensemble_q_function_error(\n forwarders: Union[\n Sequence[DiscreteQFunctionForwarder],\n Sequence[ContinuousQFunctionForwarder],\n ],\n observations: TorchObservation,\n actions: torch.Tensor,\n rewards: torch.Tensor,\n target: torch.Tensor,\n terminals: torch.Tensor,\n gamma: Union[float, torch.Tensor] = 0.99,\n masks: Optional[torch.Tensor] = None,\n) -> torch.Tensor:\n assert target.ndim == 2\n td_sum = torch.tensor(\n 0.0,\n dtype=torch.float32,\n device=get_device(observations),\n )\n for forwarder in forwarders:\n loss = forwarder.compute_error(\n observations=observations,\n actions=actions,\n rewards=rewards,\n target=target,\n terminals=terminals,\n gamma=gamma,\n reduction=\"none\",\n )\n if masks is not None:\n loss = loss * masks\n td_sum += loss.mean()\n return td_sum" }, { "class_start_lineno": 150, "class_end_lineno": 218, "func_start_lineno": 179, "func_end_lineno": 198, "func_code": " def compute_error(\n self,\n observations: TorchObservation,\n actions: torch.Tensor,\n rewards: torch.Tensor,\n target: torch.Tensor,\n terminals: torch.Tensor,\n gamma: Union[float, torch.Tensor] = 0.99,\n masks: Optional[torch.Tensor] = None,\n ) -> torch.Tensor:\n return compute_ensemble_q_function_error(\n forwarders=self._forwarders,\n observations=observations,\n actions=actions,\n rewards=rewards,\n target=target,\n terminals=terminals,\n gamma=gamma,\n masks=masks,\n )" } ]
[ "Development" ]
[ "d3rlpy.models.torch.q_functions.ensemble_q_function._gather_quantiles_by_indices", "d3rlpy.models.torch.q_functions.ensemble_q_function._reduce_quantile_ensemble", "d3rlpy.models.torch.q_functions.mean_q_function.DiscreteMeanQFunctionForwarder.compute_error", "d3rlpy.models.torch.q_functions.iqn_q_function.DiscreteIQNQFunctionForwarder.compute_error", "d3rlpy.models.torch.q_functions.ensemble_q_function.compute_ensemble_q_function_error", "d3rlpy.models.torch.q_functions.ensemble_q_function.DiscreteEnsembleQFunctionForwarder.compute_error" ]
Python
0
5
{ "total_num": 30, "base_passed_num": 4 }
[ "d3rlpy.d3rlpy.models.torch.q_functions.iqn_q_function.DiscreteIQNQFunction::forward", "d3rlpy.d3rlpy.models.torch.q_functions.iqn_q_function.DiscreteIQNQFunctionForwarder::compute_error" ]
d3rlpy
[ "d3rlpy/models/torch/q_functions/iqn_q_function.py", "d3rlpy/models/torch/q_functions/base.py", "d3rlpy/models/torch/q_functions/iqn_q_function.py" ]
[ "tests_copy/models/torch/q_functions/test_iqn_q_function.py" ]
[ { "class_start_lineno": 65, "class_end_lineno": 119, "func_start_lineno": 92, "func_end_lineno": 115, "func_code": " def forward(self, x: TorchObservation) -> QFunctionOutput:\n h = self._encoder(x)\n\n if self.training:\n n_quantiles = self._n_quantiles\n else:\n n_quantiles = self._n_greedy_quantiles\n taus = _make_taus(\n batch_size=get_batch_size(x),\n n_quantiles=n_quantiles,\n training=self.training,\n device=torch.device(get_device(x)),\n )\n\n # (batch, quantile, feature)\n prod = compute_iqn_feature(h, taus, self._embed, self._embed_size)\n # (batch, quantile, action) -> (batch, action, quantile)\n quantiles = self._fc(prod).transpose(1, 2)\n\n return QFunctionOutput(\n q_value=quantiles.mean(dim=2),\n quantiles=quantiles,\n taus=taus,\n )" }, { "class_start_lineno": null, "class_end_lineno": null, "func_start_lineno": null, "func_end_lineno": null, "func_code": "未找到 DiscreteIQNQFunction::__call__" }, { "class_start_lineno": 122, "class_end_lineno": 174, "func_start_lineno": 133, "func_end_lineno": 162, "func_code": " def compute_error(\n self,\n observations: TorchObservation,\n actions: torch.Tensor,\n rewards: torch.Tensor,\n target: torch.Tensor,\n terminals: torch.Tensor,\n gamma: Union[float, torch.Tensor] = 0.99,\n reduction: str = \"mean\",\n ) -> torch.Tensor:\n batch_size = get_batch_size(observations)\n assert target.shape == (batch_size, self._n_quantiles)\n\n # extraect quantiles corresponding to act_t\n output = self._q_func(observations)\n taus = output.taus\n all_quantiles = output.quantiles\n assert taus is not None and all_quantiles is not None\n quantiles = pick_quantile_value_by_action(all_quantiles, actions)\n\n loss = compute_quantile_loss(\n quantiles=quantiles,\n rewards=rewards,\n target=target,\n terminals=terminals,\n taus=taus,\n gamma=gamma,\n )\n\n return compute_reduce(loss, reduction)" } ]
[ "Development" ]
[ "d3rlpy.models.torch.q_functions.iqn_q_function.DiscreteIQNQFunction.forward", "d3rlpy.models.torch.q_functions.base.DiscreteIQNQFunction.__call__", "d3rlpy.models.torch.q_functions.iqn_q_function.DiscreteIQNQFunctionForwarder.compute_error" ]
Python
0
2
{ "total_num": 8, "base_passed_num": 4 }
[ "datachain.src.datachain.lib.convert.python_to_sql.python_to_sql", "datachain.src.datachain.lib.signal_schema.SignalSchema::get_column_type", "datachain.src.datachain.lib.dc.DataChain::mutate", "datachain.src.datachain.lib.signal_schema.SignalSchema::mutate" ]
datachain
[ "datachain/lib/convert/python_to_sql.py", "datachain/func/func.py", "datachain/lib/signal_schema.py", "datachain/func/func.py", "datachain/lib/dc.py", "datachain/lib/signal_schema.py" ]
[ "tests/unit/test_func.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 117, "func_start_lineno": 37, "func_end_lineno": 82, "func_code": "def python_to_sql(typ): # noqa: PLR0911\n if inspect.isclass(typ):\n if issubclass(typ, SQLType):\n return typ\n if issubclass(typ, Enum):\n return str\n\n res = PYTHON_TO_SQL.get(typ)\n if res:\n return res\n\n orig = get_origin(typ)\n\n if orig in (Literal, LiteralEx):\n return String\n\n args = get_args(typ)\n if inspect.isclass(orig) and (issubclass(list, orig) or issubclass(tuple, orig)):\n if args is None:\n raise TypeError(f\"Cannot resolve type '{typ}' for flattening features\")\n\n args0 = args[0]\n if ModelStore.is_pydantic(args0):\n return Array(JSON())\n\n list_type = list_of_args_to_type(args)\n return Array(list_type)\n\n if orig is Annotated:\n # Ignoring annotations\n return python_to_sql(args[0])\n\n if inspect.isclass(orig) and issubclass(dict, orig):\n return JSON\n\n if orig == Union:\n if len(args) == 2 and (type(None) in args):\n return python_to_sql(args[0])\n\n if _is_union_str_literal(orig, args):\n return String\n\n if _is_json_inside_union(orig, args):\n return JSON\n\n raise TypeError(f\"Cannot recognize type {typ}\")" }, { "class_start_lineno": 29, "class_end_lineno": 422, "func_start_lineno": 375, "func_end_lineno": 422, "func_code": " def get_column(\n self,\n signals_schema: Optional[\"SignalSchema\"] = None,\n label: Optional[str] = None,\n table: Optional[\"TableClause\"] = None,\n ) -> Column:\n col_type = self.get_result_type(signals_schema)\n sql_type = python_to_sql(col_type)\n\n def get_col(col: ColT, string_as_literal=False) -> ColT:\n # string_as_literal is used only for conditionals like `case()` where\n # literals are nested inside ColT as we have tuples of condition - values\n # and if user wants to set some case value as column, explicit `C(\"col\")`\n # syntax must be used to distinguish from literals\n if isinstance(col, tuple):\n return tuple(get_col(x, string_as_literal=True) for x in col)\n if isinstance(col, Func):\n return col.get_column(signals_schema, table=table)\n if isinstance(col, str) and not string_as_literal:\n column = Column(col, sql_type)\n column.table = table\n return column\n return col\n\n cols = [get_col(col) for col in self._db_cols]\n kwargs = {k: get_col(v, string_as_literal=True) for k, v in self.kwargs.items()}\n func_col = self.inner(*cols, *self.args, **kwargs)\n\n if self.is_window:\n if not self.window:\n raise DataChainParamsError(\n f\"Window function {self} requires over() clause with a window spec\",\n )\n func_col = func_col.over(\n partition_by=self.window.partition_by,\n order_by=(\n desc(self.window.order_by)\n if self.window.desc\n else self.window.order_by\n ),\n )\n\n func_col.type = sql_type() if inspect.isclass(sql_type) else sql_type\n\n if col_name := self.get_col_name(label):\n func_col = func_col.label(col_name)\n\n return func_col" }, { "class_start_lineno": 135, "class_end_lineno": 751, "func_start_lineno": 464, "func_end_lineno": 479, "func_code": " def get_column_type(self, col_name: str, with_subtree: bool = False) -> DataType:\n \"\"\"\n Returns column type by column name.\n\n If `with_subtree` is True, then it will return the type of the column\n even if it has a subtree (e.g. model with nested fields), otherwise it will\n return the type of the column (standard type field, not the model).\n\n If column is not found, raises `SignalResolvingError`.\n \"\"\"\n for path, _type, has_subtree, _ in self.get_flat_tree():\n if (with_subtree or not has_subtree) and DEFAULT_DELIMITER.join(\n path\n ) == col_name:\n return _type\n raise SignalResolvingError([col_name], \"is not found\")" }, { "class_start_lineno": 1, "class_end_lineno": 449, "func_start_lineno": 425, "func_end_lineno": 438, "func_code": "def get_db_col_type(signals_schema: \"SignalSchema\", col: ColT) -> \"DataType\":\n if isinstance(col, tuple):\n # we can only get tuple from case statement where the first tuple item\n # is condition, and second one is value which type is important\n col = col[1]\n if isinstance(col, Func):\n return col.get_result_type(signals_schema)\n\n if isinstance(col, ColumnElement) and not hasattr(col, \"name\"):\n return sql_to_python(col)\n\n return signals_schema.get_column_type(\n col.name if isinstance(col, ColumnElement) else col # type: ignore[arg-type]\n )" }, { "class_start_lineno": 174, "class_end_lineno": 2625, "func_start_lineno": 1136, "func_end_lineno": 1215, "func_code": " def mutate(self, **kwargs) -> \"Self\":\n \"\"\"Create new signals based on existing signals.\n\n This method cannot modify existing columns. If you need to modify an\n existing column, use a different name for the new column and then use\n `select()` to choose which columns to keep.\n\n This method is vectorized and more efficient compared to map(), and it does not\n extract or download any data from the internal database. However, it can only\n utilize predefined built-in functions and their combinations.\n\n The supported functions:\n Numerical: +, -, *, /, rand(), avg(), count(), func(),\n greatest(), least(), max(), min(), sum()\n String: length(), split(), replace(), regexp_replace()\n Filename: name(), parent(), file_stem(), file_ext()\n Array: length(), sip_hash_64(), euclidean_distance(),\n cosine_distance()\n Window: row_number(), rank(), dense_rank(), first()\n\n Example:\n ```py\n dc.mutate(\n area=Column(\"image.height\") * Column(\"image.width\"),\n extension=file_ext(Column(\"file.name\")),\n dist=cosine_distance(embedding_text, embedding_image)\n )\n ```\n\n Window function example:\n ```py\n window = func.window(partition_by=\"file.parent\", order_by=\"file.size\")\n dc.mutate(\n row_number=func.row_number().over(window),\n )\n ```\n\n This method can be also used to rename signals. If the Column(\"name\") provided\n as value for the new signal - the old column will be dropped. Otherwise a new\n column is created.\n\n Example:\n ```py\n dc.mutate(\n newkey=Column(\"oldkey\")\n )\n ```\n \"\"\"\n primitives = (bool, str, int, float)\n\n for col_name, expr in kwargs.items():\n if not isinstance(expr, (*primitives, Column, Func)) and isinstance(\n expr.type, NullType\n ):\n raise DataChainColumnError(\n col_name, f\"Cannot infer type with expression {expr}\"\n )\n\n mutated = {}\n schema = self.signals_schema\n for name, value in kwargs.items():\n if isinstance(value, Column):\n # renaming existing column\n for signal in schema.db_signals(name=value.name, as_columns=True):\n mutated[signal.name.replace(value.name, name, 1)] = signal # type: ignore[union-attr]\n elif isinstance(value, Func):\n # adding new signal\n mutated[name] = value.get_column(schema)\n elif isinstance(value, primitives):\n # adding simple python constant primitives like str, int, float, bool\n val = literal(value)\n val.type = python_to_sql(type(value))()\n mutated[name] = val # type: ignore[assignment]\n else:\n # adding new signal\n mutated[name] = value\n\n return self._evolve(\n query=self._query.mutate(**mutated), signal_schema=schema.mutate(kwargs)\n )" }, { "class_start_lineno": 135, "class_end_lineno": 751, "func_start_lineno": 557, "func_end_lineno": 585, "func_code": " def mutate(self, args_map: dict) -> \"SignalSchema\":\n new_values = self.values.copy()\n\n for name, value in args_map.items():\n if isinstance(value, Column) and value.name in self.values:\n # renaming existing signal\n del new_values[value.name]\n new_values[name] = self.values[value.name]\n continue\n if isinstance(value, Column):\n # adding new signal from existing signal field\n try:\n new_values[name] = self.get_column_type(\n value.name, with_subtree=True\n )\n continue\n except SignalResolvingError:\n pass\n if isinstance(value, Func):\n # adding new signal with function\n new_values[name] = value.get_result_type(self)\n continue\n if isinstance(value, ColumnElement):\n # adding new signal\n new_values[name] = sql_to_python(value)\n continue\n new_values[name] = value\n\n return SignalSchema(new_values)" } ]
[ "function_empty", "Development" ]
[ "datachain.lib.convert.python_to_sql.python_to_sql", "datachain.func.func.Func.get_column", "datachain.lib.signal_schema.SignalSchema.get_column_type", "datachain.func.func.get_db_col_type", "datachain.lib.dc.DataChain.mutate", "datachain.lib.signal_schema.SignalSchema.mutate" ]
Python
1
4
{ "total_num": 94, "base_passed_num": 39 }
[ "datachain.src.datachain.query.session.Session::_cleanup_temp_datasets", "datachain.src.datachain.query.session.Session::__exit__" ]
datachain
[ "datachain/query/session.py", "datachain/query/session.py" ]
[ "tests/unit/test_listing.py", "tests/unit/test_session.py" ]
[ { "class_start_lineno": 19, "class_end_lineno": 195, "func_start_lineno": 103, "func_end_lineno": 110, "func_code": " def _cleanup_temp_datasets(self) -> None:\n prefix = self.get_temp_prefix()\n try:\n for dataset in list(self.catalog.metastore.list_datasets_by_prefix(prefix)):\n self.catalog.remove_dataset(dataset.name, force=True)\n # suppress error when metastore has been reset during testing\n except TableMissingError:\n pass" }, { "class_start_lineno": 19, "class_end_lineno": 195, "func_start_lineno": 80, "func_end_lineno": 90, "func_code": " def __exit__(self, exc_type, exc_val, exc_tb):\n if exc_type:\n self._cleanup_created_versions()\n\n self._cleanup_temp_datasets()\n if self.is_new_catalog:\n self.catalog.metastore.close_on_exit()\n self.catalog.warehouse.close_on_exit()\n\n if Session.SESSION_CONTEXTS:\n Session.SESSION_CONTEXTS.pop()" } ]
[ "Development" ]
[ "datachain.query.session.Session._cleanup_temp_datasets", "datachain.query.session.Session.__exit__" ]
Python
0
2
{ "total_num": 24, "base_passed_num": 7 }
[ "datachain.src.datachain.lib.signal_schema.SignalSchema::_get_flat_tree", "datachain.src.datachain.lib.convert.python_to_sql.python_to_sql", "datachain.src.datachain.lib.signal_schema.SignalSchema::db_signals", "datachain.src.datachain.query.session.Session::_cleanup_temp_datasets", "datachain.src.datachain.query.session.Session::__exit__" ]
datachain
[ "datachain/lib/signal_schema.py", "datachain/lib/signal_schema.py", "datachain/lib/convert/python_to_sql.py", "datachain/lib/signal_schema.py", "datachain/query/session.py", "datachain/query/session.py" ]
[ "tests/unit/lib/test_arrow.py" ]
[ { "class_start_lineno": 135, "class_end_lineno": 751, "func_start_lineno": 630, "func_end_lineno": 639, "func_code": " def _get_flat_tree(\n self, tree: dict, prefix: list[str], depth: int\n ) -> Iterator[tuple[list[str], DataType, bool, int]]:\n for name, (type_, substree) in tree.items():\n suffix = name.split(\".\")\n new_prefix = prefix + suffix\n has_subtree = substree is not None\n yield new_prefix, type_, has_subtree, depth\n if substree is not None:\n yield from self._get_flat_tree(substree, new_prefix, depth + 1)" }, { "class_start_lineno": 135, "class_end_lineno": 751, "func_start_lineno": 627, "func_end_lineno": 628, "func_code": " def get_flat_tree(self) -> Iterator[tuple[list[str], DataType, bool, int]]:\n yield from self._get_flat_tree(self.tree, [], 0)" }, { "class_start_lineno": 1, "class_end_lineno": 117, "func_start_lineno": 37, "func_end_lineno": 82, "func_code": "def python_to_sql(typ): # noqa: PLR0911\n if inspect.isclass(typ):\n if issubclass(typ, SQLType):\n return typ\n if issubclass(typ, Enum):\n return str\n\n res = PYTHON_TO_SQL.get(typ)\n if res:\n return res\n\n orig = get_origin(typ)\n\n if orig in (Literal, LiteralEx):\n return String\n\n args = get_args(typ)\n if inspect.isclass(orig) and (issubclass(list, orig) or issubclass(tuple, orig)):\n if args is None:\n raise TypeError(f\"Cannot resolve type '{typ}' for flattening features\")\n\n args0 = args[0]\n if ModelStore.is_pydantic(args0):\n return Array(JSON())\n\n list_type = list_of_args_to_type(args)\n return Array(list_type)\n\n if orig is Annotated:\n # Ignoring annotations\n return python_to_sql(args[0])\n\n if inspect.isclass(orig) and issubclass(dict, orig):\n return JSON\n\n if orig == Union:\n if len(args) == 2 and (type(None) in args):\n return python_to_sql(args[0])\n\n if _is_union_str_literal(orig, args):\n return String\n\n if _is_json_inside_union(orig, args):\n return JSON\n\n raise TypeError(f\"Cannot recognize type {typ}\")" }, { "class_start_lineno": 135, "class_end_lineno": 751, "func_start_lineno": 481, "func_end_lineno": 503, "func_code": " def db_signals(\n self, name: Optional[str] = None, as_columns=False\n ) -> Union[list[str], list[Column]]:\n \"\"\"\n Returns DB columns as strings or Column objects with proper types\n Optionally, it can filter results by specific object, returning only his signals\n \"\"\"\n signals = [\n DEFAULT_DELIMITER.join(path)\n if not as_columns\n else Column(DEFAULT_DELIMITER.join(path), python_to_sql(_type))\n for path, _type, has_subtree, _ in self.get_flat_tree()\n if not has_subtree\n ]\n\n if name:\n signals = [\n s\n for s in signals\n if str(s) == name or str(s).startswith(f\"{name}{DEFAULT_DELIMITER}\")\n ]\n\n return signals # type: ignore[return-value]" }, { "class_start_lineno": 19, "class_end_lineno": 195, "func_start_lineno": 103, "func_end_lineno": 110, "func_code": " def _cleanup_temp_datasets(self) -> None:\n prefix = self.get_temp_prefix()\n try:\n for dataset in list(self.catalog.metastore.list_datasets_by_prefix(prefix)):\n self.catalog.remove_dataset(dataset.name, force=True)\n # suppress error when metastore has been reset during testing\n except TableMissingError:\n pass" }, { "class_start_lineno": 19, "class_end_lineno": 195, "func_start_lineno": 80, "func_end_lineno": 90, "func_code": " def __exit__(self, exc_type, exc_val, exc_tb):\n if exc_type:\n self._cleanup_created_versions()\n\n self._cleanup_temp_datasets()\n if self.is_new_catalog:\n self.catalog.metastore.close_on_exit()\n self.catalog.warehouse.close_on_exit()\n\n if Session.SESSION_CONTEXTS:\n Session.SESSION_CONTEXTS.pop()" } ]
[ "function_empty", "Development" ]
[ "datachain.lib.signal_schema.SignalSchema._get_flat_tree", "datachain.lib.signal_schema.SignalSchema.get_flat_tree", "datachain.lib.convert.python_to_sql.python_to_sql", "datachain.lib.signal_schema.SignalSchema.db_signals", "datachain.query.session.Session._cleanup_temp_datasets", "datachain.query.session.Session.__exit__" ]
Python
1
5
{ "total_num": 32, "base_passed_num": 31 }
[ "datachain.src.datachain.lib.image.convert_image", "datachain.src.datachain.lib.image.convert_images" ]
datachain
[ "datachain/lib/image.py", "datachain/lib/image.py" ]
[ "tests/unit/lib/test_clip.py", "tests/unit/lib/test_image.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 81, "func_start_lineno": 7, "func_end_lineno": 46, "func_code": "def convert_image(\n img: Image.Image,\n mode: str = \"RGB\",\n size: Optional[tuple[int, int]] = None,\n transform: Optional[Callable] = None,\n encoder: Optional[Callable] = None,\n device: Optional[Union[str, torch.device]] = None,\n) -> Union[Image.Image, torch.Tensor]:\n \"\"\"\n Resize, transform, and otherwise convert an image.\n\n Args:\n img (Image): PIL.Image object.\n mode (str): PIL.Image mode.\n size (tuple[int, int]): Size in (width, height) pixels for resizing.\n transform (Callable): Torchvision transform or huggingface processor to apply.\n encoder (Callable): Encode image using model.\n device (str or torch.device): Device to use.\n \"\"\"\n if mode:\n img = img.convert(mode)\n if size:\n img = img.resize(size)\n if transform:\n img = transform(img)\n\n try:\n from transformers.image_processing_utils import BaseImageProcessor\n\n if isinstance(transform, BaseImageProcessor):\n img = torch.as_tensor(img.pixel_values[0]).clone().detach() # type: ignore[assignment,attr-defined]\n except ImportError:\n pass\n if device:\n img = img.to(device) # type: ignore[attr-defined]\n if encoder:\n img = img.unsqueeze(0) # type: ignore[attr-defined]\n if encoder:\n img = encoder(img)\n return img" }, { "class_start_lineno": 1, "class_end_lineno": 81, "func_start_lineno": 49, "func_end_lineno": 81, "func_code": "def convert_images(\n images: Union[Image.Image, list[Image.Image]],\n mode: str = \"RGB\",\n size: Optional[tuple[int, int]] = None,\n transform: Optional[Callable] = None,\n encoder: Optional[Callable] = None,\n device: Optional[Union[str, torch.device]] = None,\n) -> Union[list[Image.Image], torch.Tensor]:\n \"\"\"\n Resize, transform, and otherwise convert one or more images.\n\n Args:\n images (Image, list[Image]): PIL.Image object or list of objects.\n mode (str): PIL.Image mode.\n size (tuple[int, int]): Size in (width, height) pixels for resizing.\n transform (Callable): Torchvision transform or huggingface processor to apply.\n encoder (Callable): Encode image using model.\n device (str or torch.device): Device to use.\n \"\"\"\n if isinstance(images, Image.Image):\n images = [images]\n\n converted = [\n convert_image(img, mode, size, transform, device=device) for img in images\n ]\n\n if isinstance(converted[0], torch.Tensor):\n converted = torch.stack(converted) # type: ignore[assignment,arg-type]\n\n if encoder:\n converted = encoder(converted)\n\n return converted # type: ignore[return-value]" } ]
[ "function_empty" ]
[ "datachain.lib.image.convert_image", "datachain.lib.image.convert_images" ]
Python
2
2
{ "total_num": 41, "base_passed_num": 13 }
[ "datachain.src.datachain.lib.file.File::get_destination_path", "datachain.src.datachain.lib.file.File::export", "datachain.src.datachain.lib.file.File::ensure_cached", "datachain.src.datachain.lib.file.File::_symlink_to" ]
datachain
[ "datachain/lib/file.py", "datachain/lib/file.py", "datachain/lib/file.py", "datachain/lib/file.py" ]
[ "tests/unit/lib/test_file.py" ]
[ { "class_start_lineno": 125, "class_end_lineno": 468, "func_start_lineno": 396, "func_end_lineno": 414, "func_code": " def get_destination_path(self, output: str, placement: ExportPlacement) -> str:\n \"\"\"\n Returns full destination path of a file for exporting to some output\n based on export placement\n \"\"\"\n if placement == \"filename\":\n path = unquote(self.name)\n elif placement == \"etag\":\n path = f\"{self.etag}{self.get_file_suffix()}\"\n elif placement == \"fullpath\":\n path = unquote(self.get_full_name())\n source = urlparse(self.source)\n if source.scheme and source.scheme != \"file\":\n path = posixpath.join(source.netloc, path)\n elif placement == \"checksum\":\n raise NotImplementedError(\"Checksum placement not implemented yet\")\n else:\n raise ValueError(f\"Unsupported file export placement: {placement}\")\n return posixpath.join(output, path) # type: ignore[union-attr]" }, { "class_start_lineno": 125, "class_end_lineno": 468, "func_start_lineno": 297, "func_end_lineno": 319, "func_code": " def export(\n self,\n output: str,\n placement: ExportPlacement = \"fullpath\",\n use_cache: bool = True,\n link_type: Literal[\"copy\", \"symlink\"] = \"copy\",\n ) -> None:\n \"\"\"Export file to new location.\"\"\"\n if use_cache:\n self._caching_enabled = use_cache\n dst = self.get_destination_path(output, placement)\n dst_dir = os.path.dirname(dst)\n client: Client = self._catalog.get_client(dst_dir)\n client.fs.makedirs(dst_dir, exist_ok=True)\n\n if link_type == \"symlink\":\n try:\n return self._symlink_to(dst)\n except OSError as exc:\n if exc.errno not in (errno.ENOTSUP, errno.EXDEV, errno.ENOSYS):\n raise\n\n self.save(dst)" }, { "class_start_lineno": 125, "class_end_lineno": 468, "func_start_lineno": 331, "func_end_lineno": 337, "func_code": " def ensure_cached(self) -> None:\n if self._catalog is None:\n raise RuntimeError(\n \"cannot download file to cache because catalog is not setup\"\n )\n client = self._catalog.get_client(self.source)\n client.download(self, callback=self._download_cb)" }, { "class_start_lineno": 125, "class_end_lineno": 468, "func_start_lineno": 282, "func_end_lineno": 295, "func_code": " def _symlink_to(self, destination: str):\n if self.location:\n raise OSError(errno.ENOTSUP, \"Symlinking virtual file is not supported\")\n\n if self._caching_enabled:\n self.ensure_cached()\n source = self.get_local_path()\n assert source, \"File was not cached\"\n elif self.source.startswith(\"file://\"):\n source = self.get_path()\n else:\n raise OSError(errno.EXDEV, \"can't link across filesystems\")\n\n return os.symlink(source, destination)" } ]
[ "function_empty", "Development" ]
[ "datachain.lib.file.File.get_destination_path", "datachain.lib.file.File.export", "datachain.lib.file.File.ensure_cached", "datachain.lib.file.File._symlink_to" ]
Python
1
4
{ "total_num": 33, "base_passed_num": 14 }
[ "datachain.src.datachain.query.session.Session::_cleanup_temp_datasets", "datachain.src.datachain.query.session.Session::__exit__", "datachain.src.datachain.lib.signal_schema.SignalSchema::_get_flat_tree", "datachain.src.datachain.lib.convert.python_to_sql.python_to_sql", "datachain.src.datachain.lib.signal_schema.SignalSchema::db_signals" ]
datachain
[ "datachain/query/session.py", "datachain/query/session.py", "datachain/lib/signal_schema.py", "datachain/lib/signal_schema.py", "datachain/lib/convert/python_to_sql.py", "datachain/lib/signal_schema.py" ]
[ "tests/unit/lib/test_signal_schema.py" ]
[ { "class_start_lineno": 19, "class_end_lineno": 195, "func_start_lineno": 103, "func_end_lineno": 110, "func_code": " def _cleanup_temp_datasets(self) -> None:\n prefix = self.get_temp_prefix()\n try:\n for dataset in list(self.catalog.metastore.list_datasets_by_prefix(prefix)):\n self.catalog.remove_dataset(dataset.name, force=True)\n # suppress error when metastore has been reset during testing\n except TableMissingError:\n pass" }, { "class_start_lineno": 19, "class_end_lineno": 195, "func_start_lineno": 80, "func_end_lineno": 90, "func_code": " def __exit__(self, exc_type, exc_val, exc_tb):\n if exc_type:\n self._cleanup_created_versions()\n\n self._cleanup_temp_datasets()\n if self.is_new_catalog:\n self.catalog.metastore.close_on_exit()\n self.catalog.warehouse.close_on_exit()\n\n if Session.SESSION_CONTEXTS:\n Session.SESSION_CONTEXTS.pop()" }, { "class_start_lineno": 135, "class_end_lineno": 751, "func_start_lineno": 630, "func_end_lineno": 639, "func_code": " def _get_flat_tree(\n self, tree: dict, prefix: list[str], depth: int\n ) -> Iterator[tuple[list[str], DataType, bool, int]]:\n for name, (type_, substree) in tree.items():\n suffix = name.split(\".\")\n new_prefix = prefix + suffix\n has_subtree = substree is not None\n yield new_prefix, type_, has_subtree, depth\n if substree is not None:\n yield from self._get_flat_tree(substree, new_prefix, depth + 1)" }, { "class_start_lineno": 135, "class_end_lineno": 751, "func_start_lineno": 627, "func_end_lineno": 628, "func_code": " def get_flat_tree(self) -> Iterator[tuple[list[str], DataType, bool, int]]:\n yield from self._get_flat_tree(self.tree, [], 0)" }, { "class_start_lineno": 1, "class_end_lineno": 117, "func_start_lineno": 37, "func_end_lineno": 82, "func_code": "def python_to_sql(typ): # noqa: PLR0911\n if inspect.isclass(typ):\n if issubclass(typ, SQLType):\n return typ\n if issubclass(typ, Enum):\n return str\n\n res = PYTHON_TO_SQL.get(typ)\n if res:\n return res\n\n orig = get_origin(typ)\n\n if orig in (Literal, LiteralEx):\n return String\n\n args = get_args(typ)\n if inspect.isclass(orig) and (issubclass(list, orig) or issubclass(tuple, orig)):\n if args is None:\n raise TypeError(f\"Cannot resolve type '{typ}' for flattening features\")\n\n args0 = args[0]\n if ModelStore.is_pydantic(args0):\n return Array(JSON())\n\n list_type = list_of_args_to_type(args)\n return Array(list_type)\n\n if orig is Annotated:\n # Ignoring annotations\n return python_to_sql(args[0])\n\n if inspect.isclass(orig) and issubclass(dict, orig):\n return JSON\n\n if orig == Union:\n if len(args) == 2 and (type(None) in args):\n return python_to_sql(args[0])\n\n if _is_union_str_literal(orig, args):\n return String\n\n if _is_json_inside_union(orig, args):\n return JSON\n\n raise TypeError(f\"Cannot recognize type {typ}\")" }, { "class_start_lineno": 135, "class_end_lineno": 751, "func_start_lineno": 481, "func_end_lineno": 503, "func_code": " def db_signals(\n self, name: Optional[str] = None, as_columns=False\n ) -> Union[list[str], list[Column]]:\n \"\"\"\n Returns DB columns as strings or Column objects with proper types\n Optionally, it can filter results by specific object, returning only his signals\n \"\"\"\n signals = [\n DEFAULT_DELIMITER.join(path)\n if not as_columns\n else Column(DEFAULT_DELIMITER.join(path), python_to_sql(_type))\n for path, _type, has_subtree, _ in self.get_flat_tree()\n if not has_subtree\n ]\n\n if name:\n signals = [\n s\n for s in signals\n if str(s) == name or str(s).startswith(f\"{name}{DEFAULT_DELIMITER}\")\n ]\n\n return signals # type: ignore[return-value]" } ]
[ "function_empty", "Development" ]
[ "datachain.query.session.Session._cleanup_temp_datasets", "datachain.query.session.Session.__exit__", "datachain.lib.signal_schema.SignalSchema._get_flat_tree", "datachain.lib.signal_schema.SignalSchema.get_flat_tree", "datachain.lib.convert.python_to_sql.python_to_sql", "datachain.lib.signal_schema.SignalSchema.db_signals" ]
Python
1
5
{ "total_num": 58, "base_passed_num": 36 }
[ "datachain.src.datachain.lib.webdataset.Builder::add", "datachain.src.datachain.lib.webdataset.get_tar_groups" ]
datachain
[ "datachain/lib/webdataset.py", "datachain/lib/webdataset.py" ]
[ "tests/unit/lib/test_webdataset.py" ]
[ { "class_start_lineno": 104, "class_end_lineno": 194, "func_start_lineno": 134, "func_end_lineno": 171, "func_code": " def add(self, file: tarfile.TarInfo):\n fstream = File(path=file.name)\n ext = fstream.get_file_ext()\n stem = fstream.get_file_stem()\n\n if self.state.stem is not None and self.state.stem != stem:\n raise StopIteration\n\n if self.state.stem is None:\n self.state.stem = stem\n\n if ext in self._core_extensions:\n if self.state.core_file is not None:\n raise CoreFileDuplicationError(\n self._tar_stream, file.name, self.state.core_file.name\n )\n self.state.core_file = file\n elif ext in self.state.data:\n raise WDSError(\n self._tar_stream,\n f\"file with extension '.{ext}' already exists in the archive\",\n )\n else:\n type_ = self._get_type(ext)\n if type_ is None:\n raise UnknownFileExtensionError(self._tar_stream, fstream.name, ext)\n\n if issubclass(type_, WDSReadableSubclass):\n reader = type_._reader\n else:\n reader = self.DEFAULT_TYPES_READERS.get(type_, None)\n\n if reader is None:\n raise WDSError(\n self._tar_stream,\n f\"unable to find a reader for type {type_}, extension .{ext}\",\n )\n self.state.data[ext] = reader(self, file)" }, { "class_start_lineno": 1, "class_end_lineno": 220, "func_start_lineno": 197, "func_end_lineno": 209, "func_code": "def get_tar_groups(stream, tar, core_extensions, spec, encoding=\"utf-8\"):\n builder = Builder(stream, core_extensions, spec, tar, encoding)\n\n for item in sorted(tar.getmembers(), key=lambda m: Path(m.name).stem):\n if not item.isfile():\n continue\n try:\n builder.add(item)\n except StopIteration:\n yield builder.produce()\n builder.add(item)\n if builder.state.stem is not None:\n yield builder.produce()" } ]
[ "Development" ]
[ "datachain.lib.webdataset.Builder.add", "datachain.lib.webdataset.get_tar_groups" ]
Python
0
2
{ "total_num": 7, "base_passed_num": 0 }
[ "datachain.src.datachain.func.conditional.case", "datachain.src.datachain.func.conditional.ifelse" ]
datachain
[ "datachain/func/conditional.py", "datachain/func/conditional.py" ]
[ "tests/unit/sql/test_conditional.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 270, "func_start_lineno": 93, "func_end_lineno": 158, "func_code": "def case(\n *args: tuple[Union[ColumnElement, Func, bool], CaseT], else_: Optional[CaseT] = None\n) -> Func:\n \"\"\"\n Returns the case function that produces case expression which has a list of\n conditions and corresponding results. Results can be python primitives like string,\n numbers or booleans but can also be other nested functions (including case function)\n or columns.\n Result type is inferred from condition results.\n\n Args:\n args tuple((ColumnElement | Func | bool),(str | int | float | complex | bool, Func, ColumnElement)):\n Tuple of condition and values pair.\n else_ (str | int | float | complex | bool, Func): optional else value in case\n expression. If omitted, and no case conditions are satisfied, the result\n will be None (NULL in DB).\n\n Returns:\n Func: A Func object that represents the case function.\n\n Example:\n ```py\n dc.mutate(\n res=func.case((C(\"num\") > 0, \"P\"), (C(\"num\") < 0, \"N\"), else_=\"Z\"),\n )\n ```\n \"\"\" # noqa: E501\n supported_types = [int, float, complex, str, bool]\n\n def _get_type(val):\n from enum import Enum\n\n if isinstance(val, Func):\n # nested functions\n return val.result_type\n if isinstance(val, Column):\n # at this point we cannot know what is the type of a column\n return None\n if isinstance(val, Enum):\n return type(val.value)\n return type(val)\n\n if not args:\n raise DataChainParamsError(\"Missing statements\")\n\n type_ = _get_type(else_) if else_ is not None else None\n\n for arg in args:\n arg_type = _get_type(arg[1])\n if arg_type is None:\n # we couldn't figure out the type of case value\n continue\n if type_ and arg_type != type_:\n raise DataChainParamsError(\n f\"Statement values must be of the same type, got {type_} and {arg_type}\"\n )\n type_ = arg_type\n\n if type_ is not None and type_ not in supported_types:\n raise DataChainParamsError(\n f\"Only python literals ({supported_types}) are supported for values\"\n )\n\n kwargs = {\"else_\": else_}\n\n return Func(\"case\", inner=sql_case, cols=args, kwargs=kwargs, result_type=type_)" }, { "class_start_lineno": 1, "class_end_lineno": 270, "func_start_lineno": 161, "func_end_lineno": 187, "func_code": "def ifelse(\n condition: Union[ColumnElement, Func], if_val: CaseT, else_val: CaseT\n) -> Func:\n \"\"\"\n Returns the ifelse function that produces if expression which has a condition\n and values for true and false outcome. Results can be one of python primitives\n like string, numbers or booleans, but can also be nested functions or columns.\n Result type is inferred from the values.\n\n Args:\n condition (ColumnElement, Func): Condition which is evaluated.\n if_val (str | int | float | complex | bool, Func, ColumnElement): Value for true\n condition outcome.\n else_val (str | int | float | complex | bool, Func, ColumnElement): Value for\n false condition outcome.\n\n Returns:\n Func: A Func object that represents the ifelse function.\n\n Example:\n ```py\n dc.mutate(\n res=func.ifelse(isnone(\"col\"), \"EMPTY\", \"NOT_EMPTY\")\n )\n ```\n \"\"\"\n return case((condition, if_val), else_=else_val)" } ]
[ "function_empty", "Development" ]
[ "datachain.func.conditional.case", "datachain.func.conditional.ifelse" ]
Python
1
2
{ "total_num": 34, "base_passed_num": 2 }
[ "haystack.haystack.components.builders.prompt_builder.PromptBuilder::_validate_variables", "haystack.haystack.components.builders.prompt_builder.PromptBuilder::run", "haystack.haystack.core.type_utils._strict_types_are_compatible", "haystack.haystack.core.type_utils._types_are_compatible" ]
haystack
[ "haystack/components/builders/prompt_builder.py", "haystack/components/builders/prompt_builder.py", "haystack/core/type_utils.py", "haystack/core/type_utils.py" ]
[ "test/components/builders/test_prompt_builder.py" ]
[ { "class_start_lineno": 17, "class_end_lineno": 266, "func_start_lineno": 247, "func_end_lineno": 266, "func_code": " def _validate_variables(self, provided_variables: Set[str]):\n \"\"\"\n Checks if all the required template variables are provided.\n\n :param provided_variables:\n A set of provided template variables.\n :raises ValueError:\n If any of the required template variables is not provided.\n \"\"\"\n if self.required_variables == \"*\":\n required_variables = sorted(self.variables)\n else:\n required_variables = self.required_variables\n missing_variables = [var for var in required_variables if var not in provided_variables]\n if missing_variables:\n missing_vars_str = \", \".join(missing_variables)\n raise ValueError(\n f\"Missing required input variables in PromptBuilder: {missing_vars_str}. \"\n f\"Required variables: {required_variables}. Provided variables: {provided_variables}.\"\n )" }, { "class_start_lineno": 17, "class_end_lineno": 266, "func_start_lineno": 213, "func_end_lineno": 245, "func_code": " def run(self, template: Optional[str] = None, template_variables: Optional[Dict[str, Any]] = None, **kwargs):\n \"\"\"\n Renders the prompt template with the provided variables.\n\n It applies the template variables to render the final prompt. You can provide variables via pipeline kwargs.\n In order to overwrite the default template, you can set the `template` parameter.\n In order to overwrite pipeline kwargs, you can set the `template_variables` parameter.\n\n :param template:\n An optional string template to overwrite PromptBuilder's default template. If None, the default template\n provided at initialization is used.\n :param template_variables:\n An optional dictionary of template variables to overwrite the pipeline variables.\n :param kwargs:\n Pipeline variables used for rendering the prompt.\n\n :returns: A dictionary with the following keys:\n - `prompt`: The updated prompt text after rendering the prompt template.\n\n :raises ValueError:\n If any of the required template variables is not provided.\n \"\"\"\n kwargs = kwargs or {}\n template_variables = template_variables or {}\n template_variables_combined = {**kwargs, **template_variables}\n self._validate_variables(set(template_variables_combined.keys()))\n\n compiled_template = self.template\n if template is not None:\n compiled_template = self._env.from_string(template)\n\n result = compiled_template.render(template_variables_combined)\n return {\"prompt\": result}" }, { "class_start_lineno": 1, "class_end_lineno": 105, "func_start_lineno": 29, "func_end_lineno": 76, "func_code": "def _strict_types_are_compatible(sender, receiver): # pylint: disable=too-many-return-statements\n \"\"\"\n Checks whether the sender type is equal to or a subtype of the receiver type under strict validation.\n\n Note: this method has no pretense to perform proper type matching. It especially does not deal with aliasing of\n typing classes such as `List` or `Dict` to their runtime counterparts `list` and `dict`. It also does not deal well\n with \"bare\" types, so `List` is treated differently from `List[Any]`, even though they should be the same.\n Consider simplifying the typing of your components if you observe unexpected errors during component connection.\n\n :param sender: The sender type.\n :param receiver: The receiver type.\n :return: True if the sender type is strictly compatible with the receiver type, False otherwise.\n \"\"\"\n if sender == receiver or receiver is Any:\n return True\n\n if sender is Any:\n return False\n\n try:\n if issubclass(sender, receiver):\n return True\n except TypeError: # typing classes can't be used with issubclass, so we deal with them below\n pass\n\n sender_origin = get_origin(sender)\n receiver_origin = get_origin(receiver)\n\n if sender_origin is not Union and receiver_origin is Union:\n return any(_strict_types_are_compatible(sender, union_arg) for union_arg in get_args(receiver))\n\n # Both must have origins and they must be equal\n if not (sender_origin and receiver_origin and sender_origin == receiver_origin):\n return False\n\n # Compare generic type arguments\n sender_args = get_args(sender)\n receiver_args = get_args(receiver)\n\n # Handle bare types\n if not sender_args and sender_origin:\n sender_args = (Any,)\n if not receiver_args and receiver_origin:\n receiver_args = (Any,) * (len(sender_args) if sender_args else 1)\n if len(sender_args) > len(receiver_args):\n return False\n\n return all(_strict_types_are_compatible(*args) for args in zip(sender_args, receiver_args))" }, { "class_start_lineno": 1, "class_end_lineno": 105, "func_start_lineno": 14, "func_end_lineno": 26, "func_code": "def _types_are_compatible(sender, receiver, type_validation: bool = True) -> bool:\n \"\"\"\n Determines if two types are compatible based on the specified validation mode.\n\n :param sender: The sender type.\n :param receiver: The receiver type.\n :param type_validation: Whether to perform strict type validation.\n :return: True if the types are compatible, False otherwise.\n \"\"\"\n if type_validation:\n return _strict_types_are_compatible(sender, receiver)\n else:\n return True" } ]
[ "function_empty" ]
[ "haystack.components.builders.prompt_builder.PromptBuilder._validate_variables", "haystack.components.builders.prompt_builder.PromptBuilder.run", "haystack.core.type_utils._strict_types_are_compatible", "haystack.core.type_utils._types_are_compatible" ]
Python
4
4
{ "total_num": 29, "base_passed_num": 7 }
[ "haystack.haystack.core.type_utils._strict_types_are_compatible", "haystack.haystack.core.type_utils._types_are_compatible", "haystack.haystack.document_stores.in_memory.document_store.InMemoryDocumentStore::to_dict", "haystack.haystack.components.retrievers.in_memory.bm25_retriever.InMemoryBM25Retriever::to_dict", "haystack.haystack.core.serialization.component_to_dict" ]
haystack
[ "haystack/core/type_utils.py", "haystack/core/type_utils.py", "haystack/document_stores/in_memory/document_store.py", "haystack/components/retrievers/in_memory/bm25_retriever.py", "haystack/core/serialization.py" ]
[ "test/components/classifiers/test_zero_shot_document_classifier.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 105, "func_start_lineno": 29, "func_end_lineno": 76, "func_code": "def _strict_types_are_compatible(sender, receiver): # pylint: disable=too-many-return-statements\n \"\"\"\n Checks whether the sender type is equal to or a subtype of the receiver type under strict validation.\n\n Note: this method has no pretense to perform proper type matching. It especially does not deal with aliasing of\n typing classes such as `List` or `Dict` to their runtime counterparts `list` and `dict`. It also does not deal well\n with \"bare\" types, so `List` is treated differently from `List[Any]`, even though they should be the same.\n Consider simplifying the typing of your components if you observe unexpected errors during component connection.\n\n :param sender: The sender type.\n :param receiver: The receiver type.\n :return: True if the sender type is strictly compatible with the receiver type, False otherwise.\n \"\"\"\n if sender == receiver or receiver is Any:\n return True\n\n if sender is Any:\n return False\n\n try:\n if issubclass(sender, receiver):\n return True\n except TypeError: # typing classes can't be used with issubclass, so we deal with them below\n pass\n\n sender_origin = get_origin(sender)\n receiver_origin = get_origin(receiver)\n\n if sender_origin is not Union and receiver_origin is Union:\n return any(_strict_types_are_compatible(sender, union_arg) for union_arg in get_args(receiver))\n\n # Both must have origins and they must be equal\n if not (sender_origin and receiver_origin and sender_origin == receiver_origin):\n return False\n\n # Compare generic type arguments\n sender_args = get_args(sender)\n receiver_args = get_args(receiver)\n\n # Handle bare types\n if not sender_args and sender_origin:\n sender_args = (Any,)\n if not receiver_args and receiver_origin:\n receiver_args = (Any,) * (len(sender_args) if sender_args else 1)\n if len(sender_args) > len(receiver_args):\n return False\n\n return all(_strict_types_are_compatible(*args) for args in zip(sender_args, receiver_args))" }, { "class_start_lineno": 1, "class_end_lineno": 105, "func_start_lineno": 14, "func_end_lineno": 26, "func_code": "def _types_are_compatible(sender, receiver, type_validation: bool = True) -> bool:\n \"\"\"\n Determines if two types are compatible based on the specified validation mode.\n\n :param sender: The sender type.\n :param receiver: The receiver type.\n :param type_validation: Whether to perform strict type validation.\n :return: True if the types are compatible, False otherwise.\n \"\"\"\n if type_validation:\n return _strict_types_are_compatible(sender, receiver)\n else:\n return True" }, { "class_start_lineno": 58, "class_end_lineno": 738, "func_start_lineno": 344, "func_end_lineno": 358, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serializes the component to a dictionary.\n\n :returns:\n Dictionary with serialized data.\n \"\"\"\n return default_to_dict(\n self,\n bm25_tokenization_regex=self.bm25_tokenization_regex,\n bm25_algorithm=self.bm25_algorithm,\n bm25_parameters=self.bm25_parameters,\n embedding_similarity_function=self.embedding_similarity_function,\n index=self.index,\n )" }, { "class_start_lineno": 13, "class_end_lineno": 203, "func_start_lineno": 88, "func_end_lineno": 103, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serializes the component to a dictionary.\n\n :returns:\n Dictionary with serialized data.\n \"\"\"\n docstore = self.document_store.to_dict()\n return default_to_dict(\n self,\n document_store=docstore,\n filters=self.filters,\n top_k=self.top_k,\n scale_score=self.scale_score,\n filter_policy=self.filter_policy.value,\n )" }, { "class_start_lineno": 1, "class_end_lineno": 264, "func_start_lineno": 36, "func_end_lineno": 82, "func_code": "def component_to_dict(obj: Any, name: str) -> Dict[str, Any]:\n \"\"\"\n Converts a component instance into a dictionary.\n\n If a `to_dict` method is present in the component instance, that will be used instead of the default method.\n\n :param obj:\n The component to be serialized.\n :param name:\n The name of the component.\n :returns:\n A dictionary representation of the component.\n\n :raises SerializationError:\n If the component doesn't have a `to_dict` method.\n If the values of the init parameters can't be determined.\n If a non-basic Python type is used in the serialized data.\n \"\"\"\n if hasattr(obj, \"to_dict\"):\n data = obj.to_dict()\n else:\n init_parameters = {}\n for param_name, param in inspect.signature(obj.__init__).parameters.items():\n # Ignore `args` and `kwargs`, used by the default constructor\n if param_name in (\"args\", \"kwargs\"):\n continue\n try:\n # This only works if the Component constructor assigns the init\n # parameter to an instance variable or property with the same name\n param_value = getattr(obj, param_name)\n except AttributeError as e:\n # If the parameter doesn't have a default value, raise an error\n if param.default is param.empty:\n raise SerializationError(\n f\"Cannot determine the value of the init parameter '{param_name}' \"\n f\"for the class {obj.__class__.__name__}.\"\n f\"You can fix this error by assigning 'self.{param_name} = {param_name}' or adding a \"\n f\"custom serialization method 'to_dict' to the class.\"\n ) from e\n # In case the init parameter was not assigned, we use the default value\n param_value = param.default\n init_parameters[param_name] = param_value\n\n data = default_to_dict(obj, **init_parameters)\n\n _validate_component_to_dict_output(obj, name, data)\n return data" } ]
[ "function_empty", "Development" ]
[ "haystack.core.type_utils._strict_types_are_compatible", "haystack.core.type_utils._types_are_compatible", "haystack.document_stores.in_memory.document_store.InMemoryDocumentStore.to_dict", "haystack.components.retrievers.in_memory.bm25_retriever.InMemoryBM25Retriever.to_dict", "haystack.core.serialization.component_to_dict" ]
Python
4
5
{ "total_num": 10, "base_passed_num": 9 }
[ "haystack.haystack.core.serialization.default_to_dict", "haystack.haystack.core.serialization.component_to_dict" ]
haystack
[ "haystack/core/serialization.py", "haystack/components/connectors/openapi_service.py", "haystack/core/serialization.py" ]
[ "test/components/connectors/test_openapi_service.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 264, "func_start_lineno": 172, "func_end_lineno": 210, "func_code": "def default_to_dict(obj: Any, **init_parameters) -> Dict[str, Any]:\n \"\"\"\n Utility function to serialize an object to a dictionary.\n\n This is mostly necessary for components but can be used by any object.\n `init_parameters` are parameters passed to the object class `__init__`.\n They must be defined explicitly as they'll be used when creating a new\n instance of `obj` with `from_dict`. Omitting them might cause deserialisation\n errors or unexpected behaviours later, when calling `from_dict`.\n\n An example usage:\n\n ```python\n class MyClass:\n def __init__(self, my_param: int = 10):\n self.my_param = my_param\n\n def to_dict(self):\n return default_to_dict(self, my_param=self.my_param)\n\n\n obj = MyClass(my_param=1000)\n data = obj.to_dict()\n assert data == {\n \"type\": \"MyClass\",\n \"init_parameters\": {\n \"my_param\": 1000,\n },\n }\n ```\n\n :param obj:\n The object to be serialized.\n :param init_parameters:\n The parameters used to create a new instance of the class.\n :returns:\n A dictionary representation of the instance.\n \"\"\"\n return {\"type\": generate_qualified_class_name(type(obj)), \"init_parameters\": init_parameters}" }, { "class_start_lineno": 149, "class_end_lineno": 399, "func_start_lineno": 265, "func_end_lineno": 272, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serializes the component to a dictionary.\n\n :returns:\n Dictionary with serialized data.\n \"\"\"\n return default_to_dict(self, ssl_verify=self.ssl_verify)" }, { "class_start_lineno": 1, "class_end_lineno": 264, "func_start_lineno": 36, "func_end_lineno": 82, "func_code": "def component_to_dict(obj: Any, name: str) -> Dict[str, Any]:\n \"\"\"\n Converts a component instance into a dictionary.\n\n If a `to_dict` method is present in the component instance, that will be used instead of the default method.\n\n :param obj:\n The component to be serialized.\n :param name:\n The name of the component.\n :returns:\n A dictionary representation of the component.\n\n :raises SerializationError:\n If the component doesn't have a `to_dict` method.\n If the values of the init parameters can't be determined.\n If a non-basic Python type is used in the serialized data.\n \"\"\"\n if hasattr(obj, \"to_dict\"):\n data = obj.to_dict()\n else:\n init_parameters = {}\n for param_name, param in inspect.signature(obj.__init__).parameters.items():\n # Ignore `args` and `kwargs`, used by the default constructor\n if param_name in (\"args\", \"kwargs\"):\n continue\n try:\n # This only works if the Component constructor assigns the init\n # parameter to an instance variable or property with the same name\n param_value = getattr(obj, param_name)\n except AttributeError as e:\n # If the parameter doesn't have a default value, raise an error\n if param.default is param.empty:\n raise SerializationError(\n f\"Cannot determine the value of the init parameter '{param_name}' \"\n f\"for the class {obj.__class__.__name__}.\"\n f\"You can fix this error by assigning 'self.{param_name} = {param_name}' or adding a \"\n f\"custom serialization method 'to_dict' to the class.\"\n ) from e\n # In case the init parameter was not assigned, we use the default value\n param_value = param.default\n init_parameters[param_name] = param_value\n\n data = default_to_dict(obj, **init_parameters)\n\n _validate_component_to_dict_output(obj, name, data)\n return data" } ]
[ "function_empty", "Development" ]
[ "haystack.core.serialization.default_to_dict", "haystack.components.connectors.openapi_service.OpenAPIServiceConnector.to_dict", "haystack.core.serialization.component_to_dict" ]
Python
1
2
{ "total_num": 12, "base_passed_num": 10 }
[ "haystack.haystack.core.serialization.default_to_dict", "haystack.haystack.components.converters.json.JSONConverter::to_dict", "haystack.haystack.components.converters.utils.normalize_metadata", "haystack.haystack.components.converters.json.JSONConverter::run" ]
haystack
[ "haystack/core/serialization.py", "haystack/components/converters/json.py", "haystack/components/converters/utils.py", "haystack/components/converters/json.py" ]
[ "test/components/converters/test_json.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 264, "func_start_lineno": 172, "func_end_lineno": 210, "func_code": "def default_to_dict(obj: Any, **init_parameters) -> Dict[str, Any]:\n \"\"\"\n Utility function to serialize an object to a dictionary.\n\n This is mostly necessary for components but can be used by any object.\n `init_parameters` are parameters passed to the object class `__init__`.\n They must be defined explicitly as they'll be used when creating a new\n instance of `obj` with `from_dict`. Omitting them might cause deserialisation\n errors or unexpected behaviours later, when calling `from_dict`.\n\n An example usage:\n\n ```python\n class MyClass:\n def __init__(self, my_param: int = 10):\n self.my_param = my_param\n\n def to_dict(self):\n return default_to_dict(self, my_param=self.my_param)\n\n\n obj = MyClass(my_param=1000)\n data = obj.to_dict()\n assert data == {\n \"type\": \"MyClass\",\n \"init_parameters\": {\n \"my_param\": 1000,\n },\n }\n ```\n\n :param obj:\n The object to be serialized.\n :param init_parameters:\n The parameters used to create a new instance of the class.\n :returns:\n A dictionary representation of the instance.\n \"\"\"\n return {\"type\": generate_qualified_class_name(type(obj)), \"init_parameters\": init_parameters}" }, { "class_start_lineno": 22, "class_end_lineno": 291, "func_start_lineno": 152, "func_end_lineno": 165, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serializes the component to a dictionary.\n\n :returns:\n Dictionary with serialized data.\n \"\"\"\n return default_to_dict(\n self,\n jq_schema=self._jq_schema,\n content_key=self._content_key,\n extra_meta_fields=self._meta_fields,\n store_full_path=self._store_full_path,\n )" }, { "class_start_lineno": 1, "class_end_lineno": 51, "func_start_lineno": 30, "func_end_lineno": 51, "func_code": "def normalize_metadata(\n meta: Optional[Union[Dict[str, Any], List[Dict[str, Any]]]], sources_count: int\n) -> List[Dict[str, Any]]:\n \"\"\"\n Normalize the metadata input for a converter.\n\n Given all the possible value of the meta input for a converter (None, dictionary or list of dicts),\n makes sure to return a list of dictionaries of the correct length for the converter to use.\n\n :param meta: the meta input of the converter, as-is\n :param sources_count: the number of sources the converter received\n :returns: a list of dictionaries of the make length as the sources list\n \"\"\"\n if meta is None:\n return [{}] * sources_count\n if isinstance(meta, dict):\n return [meta] * sources_count\n if isinstance(meta, list):\n if sources_count != len(meta):\n raise ValueError(\"The length of the metadata list must match the number of sources.\")\n return meta\n raise ValueError(\"meta must be either None, a dictionary or a list of dictionaries.\")" }, { "class_start_lineno": 22, "class_end_lineno": 291, "func_start_lineno": 250, "func_end_lineno": 291, "func_code": " def run(\n self,\n sources: List[Union[str, Path, ByteStream]],\n meta: Optional[Union[Dict[str, Any], List[Dict[str, Any]]]] = None,\n ):\n \"\"\"\n Converts a list of JSON files to documents.\n\n :param sources:\n A list of file paths or ByteStream objects.\n :param meta:\n Optional metadata to attach to the documents.\n This value can be either a list of dictionaries or a single dictionary.\n If it's a single dictionary, its content is added to the metadata of all produced documents.\n If it's a list, the length of the list must match the number of sources.\n If `sources` contain ByteStream objects, their `meta` will be added to the output documents.\n\n :returns:\n A dictionary with the following keys:\n - `documents`: A list of created documents.\n \"\"\"\n documents = []\n meta_list = normalize_metadata(meta=meta, sources_count=len(sources))\n\n for source, metadata in zip(sources, meta_list):\n try:\n bytestream = get_bytestream_from_source(source)\n except Exception as exc:\n logger.warning(\"Could not read {source}. Skipping it. Error: {error}\", source=source, error=exc)\n continue\n\n data = self._get_content_and_meta(bytestream)\n\n for text, extra_meta in data:\n merged_metadata = {**bytestream.meta, **metadata, **extra_meta}\n\n if not self._store_full_path and (file_path := bytestream.meta.get(\"file_path\")):\n merged_metadata[\"file_path\"] = os.path.basename(file_path)\n document = Document(content=text, meta=merged_metadata)\n documents.append(document)\n\n return {\"documents\": documents}" } ]
[ "function_empty", "Development" ]
[ "haystack.core.serialization.default_to_dict", "haystack.components.converters.json.JSONConverter.to_dict", "haystack.components.converters.utils.normalize_metadata", "haystack.components.converters.json.JSONConverter.run" ]
Python
3
4
{ "total_num": 19, "base_passed_num": 5 }
[ "haystack.haystack.components.converters.openapi_functions.OpenAPIServiceToFunctions::_parse_openapi_spec", "haystack.haystack.components.converters.openapi_functions.OpenAPIServiceToFunctions::run", "haystack.haystack.components.converters.openapi_functions.OpenAPIServiceToFunctions::_parse_property_attributes", "haystack.haystack.components.converters.openapi_functions.OpenAPIServiceToFunctions::_parse_endpoint_spec", "haystack.haystack.components.converters.openapi_functions.OpenAPIServiceToFunctions::_openapi_to_functions" ]
haystack
[ "haystack/components/converters/openapi_functions.py", "haystack/components/converters/openapi_functions.py", "haystack/components/converters/openapi_functions.py", "haystack/components/converters/openapi_functions.py", "haystack/components/converters/openapi_functions.py" ]
[ "test/components/converters/test_openapi_functions.py" ]
[ { "class_start_lineno": 23, "class_end_lineno": 257, "func_start_lineno": 232, "func_end_lineno": 257, "func_code": " def _parse_openapi_spec(self, content: str) -> Dict[str, Any]:\n \"\"\"\n Parses OpenAPI specification content, supporting both JSON and YAML formats.\n\n :param content: The content of the OpenAPI specification.\n :return: The parsed OpenAPI specification.\n \"\"\"\n open_api_spec_content = None\n try:\n open_api_spec_content = json.loads(content)\n return jsonref.replace_refs(open_api_spec_content)\n except json.JSONDecodeError as json_error:\n # heuristic to confirm that the content is likely malformed JSON\n if content.strip().startswith((\"{\", \"[\")):\n raise json_error\n\n try:\n open_api_spec_content = yaml.safe_load(content)\n except yaml.YAMLError:\n error_message = (\n \"Failed to parse the OpenAPI specification. The content does not appear to be valid JSON or YAML.\\n\\n\"\n )\n raise RuntimeError(error_message, content)\n\n # Replace references in the object with their resolved values, if any\n return jsonref.replace_refs(open_api_spec_content)" }, { "class_start_lineno": 23, "class_end_lineno": 257, "func_start_lineno": 56, "func_end_lineno": 115, "func_code": " def run(self, sources: List[Union[str, Path, ByteStream]]) -> Dict[str, Any]:\n \"\"\"\n Converts OpenAPI definitions in OpenAI function calling format.\n\n :param sources:\n File paths or ByteStream objects of OpenAPI definitions (in JSON or YAML format).\n\n :returns:\n A dictionary with the following keys:\n - functions: Function definitions in JSON object format\n - openapi_specs: OpenAPI specs in JSON/YAML object format with resolved references\n\n :raises RuntimeError:\n If the OpenAPI definitions cannot be downloaded or processed.\n :raises ValueError:\n If the source type is not recognized or no functions are found in the OpenAPI definitions.\n \"\"\"\n all_extracted_fc_definitions: List[Dict[str, Any]] = []\n all_openapi_specs = []\n for source in sources:\n openapi_spec_content = None\n if isinstance(source, (str, Path)):\n if os.path.exists(source):\n try:\n with open(source, \"r\") as f:\n openapi_spec_content = f.read()\n except IOError as e:\n logger.warning(\n \"IO error reading OpenAPI specification file: {source}. Error: {e}\", source=source, e=e\n )\n else:\n logger.warning(f\"OpenAPI specification file not found: {source}\")\n elif isinstance(source, ByteStream):\n openapi_spec_content = source.data.decode(\"utf-8\")\n if not openapi_spec_content:\n logger.warning(\n \"Invalid OpenAPI specification content provided: {openapi_spec_content}\",\n openapi_spec_content=openapi_spec_content,\n )\n else:\n logger.warning(\n \"Invalid source type {source}. Only str, Path, and ByteStream are supported.\", source=type(source)\n )\n continue\n\n if openapi_spec_content:\n try:\n service_openapi_spec = self._parse_openapi_spec(openapi_spec_content)\n functions: List[Dict[str, Any]] = self._openapi_to_functions(service_openapi_spec)\n all_extracted_fc_definitions.extend(functions)\n all_openapi_specs.append(service_openapi_spec)\n except Exception as e:\n logger.error(\n \"Error processing OpenAPI specification from source {source}: {error}\", source=source, error=e\n )\n\n if not all_extracted_fc_definitions:\n logger.warning(\"No OpenAI function definitions extracted from the provided OpenAPI specification sources.\")\n\n return {\"functions\": all_extracted_fc_definitions, \"openapi_specs\": all_openapi_specs}" }, { "class_start_lineno": 23, "class_end_lineno": 257, "func_start_lineno": 193, "func_end_lineno": 230, "func_code": " def _parse_property_attributes(\n self, property_schema: Dict[str, Any], include_attributes: Optional[List[str]] = None\n ) -> Dict[str, Any]:\n \"\"\"\n Parses the attributes of a property schema.\n\n Recursively parses the attributes of a property schema, including nested objects and arrays,\n and includes specified attributes like description, pattern, etc.\n\n :param property_schema: The schema of the property to parse.\n :param include_attributes: The list of attributes to include in the parsed schema.\n :return: The parsed schema of the property including the specified attributes.\n \"\"\"\n include_attributes = include_attributes or [\"description\", \"pattern\", \"enum\"]\n\n schema_type = property_schema.get(\"type\")\n\n parsed_schema = {\"type\": schema_type} if schema_type else {}\n for attr in include_attributes:\n if attr in property_schema:\n parsed_schema[attr] = property_schema[attr]\n\n if schema_type == \"object\":\n properties = property_schema.get(\"properties\", {})\n parsed_properties = {\n prop_name: self._parse_property_attributes(prop, include_attributes)\n for prop_name, prop in properties.items()\n }\n parsed_schema[\"properties\"] = parsed_properties\n\n if \"required\" in property_schema:\n parsed_schema[\"required\"] = property_schema[\"required\"]\n\n elif schema_type == \"array\":\n items = property_schema.get(\"items\", {})\n parsed_schema[\"items\"] = self._parse_property_attributes(items, include_attributes)\n\n return parsed_schema" }, { "class_start_lineno": 23, "class_end_lineno": 257, "func_start_lineno": 153, "func_end_lineno": 191, "func_code": " def _parse_endpoint_spec(self, resolved_spec: Dict[str, Any]) -> Optional[Dict[str, Any]]:\n if not isinstance(resolved_spec, dict):\n logger.warning(\"Invalid OpenAPI spec format provided. Could not extract function.\")\n return {}\n\n function_name = resolved_spec.get(\"operationId\")\n description = resolved_spec.get(\"description\") or resolved_spec.get(\"summary\", \"\")\n\n schema: Dict[str, Any] = {\"type\": \"object\", \"properties\": {}}\n\n # requestBody section\n req_body_schema = (\n resolved_spec.get(\"requestBody\", {}).get(\"content\", {}).get(\"application/json\", {}).get(\"schema\", {})\n )\n if \"properties\" in req_body_schema:\n for prop_name, prop_schema in req_body_schema[\"properties\"].items():\n schema[\"properties\"][prop_name] = self._parse_property_attributes(prop_schema)\n\n if \"required\" in req_body_schema:\n schema.setdefault(\"required\", []).extend(req_body_schema[\"required\"])\n\n # parameters section\n for param in resolved_spec.get(\"parameters\", []):\n if \"schema\" in param:\n schema_dict = self._parse_property_attributes(param[\"schema\"])\n # these attributes are not in param[schema] level but on param level\n useful_attributes = [\"description\", \"pattern\", \"enum\"]\n schema_dict.update({key: param[key] for key in useful_attributes if param.get(key)})\n schema[\"properties\"][param[\"name\"]] = schema_dict\n if param.get(\"required\", False):\n schema.setdefault(\"required\", []).append(param[\"name\"])\n\n if function_name and description and schema[\"properties\"]:\n return {\"name\": function_name, \"description\": description, \"parameters\": schema}\n else:\n logger.warning(\n \"Invalid OpenAPI spec format provided. Could not extract function from {spec}\", spec=resolved_spec\n )\n return {}" }, { "class_start_lineno": 23, "class_end_lineno": 257, "func_start_lineno": 117, "func_end_lineno": 151, "func_code": " def _openapi_to_functions(self, service_openapi_spec: Dict[str, Any]) -> List[Dict[str, Any]]:\n \"\"\"\n OpenAPI to OpenAI function conversion.\n\n Extracts functions from the OpenAPI specification of the service and converts them into a format\n suitable for OpenAI function calling.\n\n :param service_openapi_spec: The OpenAPI specification from which functions are to be extracted.\n :type service_openapi_spec: Dict[str, Any]\n :return: A list of dictionaries, each representing a function. Each dictionary includes the function's\n name, description, and a schema of its parameters.\n :rtype: List[Dict[str, Any]]\n \"\"\"\n\n # Doesn't enforce rigid spec validation because that would require a lot of dependencies\n # We check the version and require minimal fields to be present, so we can extract functions\n spec_version = service_openapi_spec.get(\"openapi\")\n if not spec_version:\n raise ValueError(f\"Invalid OpenAPI spec provided. Could not extract version from {service_openapi_spec}\")\n service_openapi_spec_version = int(spec_version.split(\".\")[0])\n\n # Compare the versions\n if service_openapi_spec_version < OpenAPIServiceToFunctions.MIN_REQUIRED_OPENAPI_SPEC_VERSION:\n raise ValueError(\n f\"Invalid OpenAPI spec version {service_openapi_spec_version}. Must be \"\n f\"at least {OpenAPIServiceToFunctions.MIN_REQUIRED_OPENAPI_SPEC_VERSION}.\"\n )\n\n functions: List[Dict[str, Any]] = []\n for paths in service_openapi_spec[\"paths\"].values():\n for path_spec in paths.values():\n function_dict = self._parse_endpoint_spec(path_spec)\n if function_dict:\n functions.append(function_dict)\n return functions" } ]
[ "function_empty", "Development" ]
[ "haystack.components.converters.openapi_functions.OpenAPIServiceToFunctions._parse_openapi_spec", "haystack.components.converters.openapi_functions.OpenAPIServiceToFunctions.run", "haystack.components.converters.openapi_functions.OpenAPIServiceToFunctions._parse_property_attributes", "haystack.components.converters.openapi_functions.OpenAPIServiceToFunctions._parse_endpoint_spec", "haystack.components.converters.openapi_functions.OpenAPIServiceToFunctions._openapi_to_functions" ]
Python
3
5
{ "total_num": 8, "base_passed_num": 0 }
[ "haystack.haystack.utils.callable_serialization.serialize_callable", "haystack.haystack.components.converters.output_adapter.OutputAdapter::to_dict", "haystack.haystack.utils.type_serialization.thread_safe_import", "haystack.haystack.utils.callable_serialization.deserialize_callable" ]
haystack
[ "haystack/utils/callable_serialization.py", "haystack/components/converters/output_adapter.py", "haystack/utils/type_serialization.py", "haystack/utils/callable_serialization.py" ]
[ "test/components/converters/test_output_adapter.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 80, "func_start_lineno": 12, "func_end_lineno": 42, "func_code": "def serialize_callable(callable_handle: Callable) -> str:\n \"\"\"\n Serializes a callable to its full path.\n\n :param callable_handle: The callable to serialize\n :return: The full path of the callable\n \"\"\"\n try:\n full_arg_spec = inspect.getfullargspec(callable_handle)\n is_instance_method = bool(full_arg_spec.args and full_arg_spec.args[0] == \"self\")\n except TypeError:\n is_instance_method = False\n if is_instance_method:\n raise SerializationError(\"Serialization of instance methods is not supported.\")\n\n # __qualname__ contains the fully qualified path we need for classmethods and staticmethods\n qualname = getattr(callable_handle, \"__qualname__\", \"\")\n if \"<lambda>\" in qualname:\n raise SerializationError(\"Serialization of lambdas is not supported.\")\n if \"<locals>\" in qualname:\n raise SerializationError(\"Serialization of nested functions is not supported.\")\n\n name = qualname or callable_handle.__name__\n\n # Get the full package path of the function\n module = inspect.getmodule(callable_handle)\n if module is not None:\n full_path = f\"{module.__name__}.{name}\"\n else:\n full_path = name\n return full_path" }, { "class_start_lineno": 25, "class_end_lineno": 184, "func_start_lineno": 139, "func_end_lineno": 153, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serializes the component to a dictionary.\n\n :returns:\n Dictionary with serialized data.\n \"\"\"\n se_filters = {name: serialize_callable(filter_func) for name, filter_func in self.custom_filters.items()}\n return default_to_dict(\n self,\n template=self.template,\n output_type=serialize_type(self.output_type),\n custom_filters=se_filters,\n unsafe=self._unsafe,\n )" }, { "class_start_lineno": 1, "class_end_lineno": 170, "func_start_lineno": 159, "func_end_lineno": 170, "func_code": "def thread_safe_import(module_name: str) -> ModuleType:\n \"\"\"\n Import a module in a thread-safe manner.\n\n Importing modules in a multi-threaded environment can lead to race conditions.\n This function ensures that the module is imported in a thread-safe manner without having impact\n on the performance of the import for single-threaded environments.\n\n :param module_name: the module to import\n \"\"\"\n with _import_lock:\n return importlib.import_module(module_name)" }, { "class_start_lineno": 1, "class_end_lineno": 80, "func_start_lineno": 45, "func_end_lineno": 80, "func_code": "def deserialize_callable(callable_handle: str) -> Callable:\n \"\"\"\n Deserializes a callable given its full import path as a string.\n\n :param callable_handle: The full path of the callable_handle\n :return: The callable\n :raises DeserializationError: If the callable cannot be found\n \"\"\"\n parts = callable_handle.split(\".\")\n\n for i in range(len(parts), 0, -1):\n module_name = \".\".join(parts[:i])\n try:\n mod: Any = thread_safe_import(module_name)\n except Exception:\n # keep reducing i until we find a valid module import\n continue\n\n attr_value = mod\n for part in parts[i:]:\n try:\n attr_value = getattr(attr_value, part)\n except AttributeError as e:\n raise DeserializationError(f\"Could not find attribute '{part}' in {attr_value.__name__}\") from e\n\n # when the attribute is a classmethod, we need the underlying function\n if isinstance(attr_value, (classmethod, staticmethod)):\n attr_value = attr_value.__func__\n\n if not callable(attr_value):\n raise DeserializationError(f\"The final attribute is not callable: {attr_value}\")\n\n return attr_value\n\n # Fallback if we never find anything\n raise DeserializationError(f\"Could not import '{callable_handle}' as a module or callable.\")" } ]
[ "function_empty", "Development" ]
[ "haystack.utils.callable_serialization.serialize_callable", "haystack.components.converters.output_adapter.OutputAdapter.to_dict", "haystack.utils.type_serialization.thread_safe_import", "haystack.utils.callable_serialization.deserialize_callable" ]
Python
3
4
{ "total_num": 14, "base_passed_num": 9 }
[ "haystack.haystack.utils.callable_serialization.serialize_callable", "haystack.haystack.components.embedders.azure_document_embedder.AzureOpenAIDocumentEmbedder::to_dict", "haystack.haystack.utils.type_serialization.thread_safe_import", "haystack.haystack.utils.callable_serialization.deserialize_callable" ]
haystack
[ "haystack/utils/callable_serialization.py", "haystack/components/embedders/azure_document_embedder.py", "haystack/utils/type_serialization.py", "haystack/utils/callable_serialization.py" ]
[ "test/components/embedders/test_azure_document_embedder.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 80, "func_start_lineno": 12, "func_end_lineno": 42, "func_code": "def serialize_callable(callable_handle: Callable) -> str:\n \"\"\"\n Serializes a callable to its full path.\n\n :param callable_handle: The callable to serialize\n :return: The full path of the callable\n \"\"\"\n try:\n full_arg_spec = inspect.getfullargspec(callable_handle)\n is_instance_method = bool(full_arg_spec.args and full_arg_spec.args[0] == \"self\")\n except TypeError:\n is_instance_method = False\n if is_instance_method:\n raise SerializationError(\"Serialization of instance methods is not supported.\")\n\n # __qualname__ contains the fully qualified path we need for classmethods and staticmethods\n qualname = getattr(callable_handle, \"__qualname__\", \"\")\n if \"<lambda>\" in qualname:\n raise SerializationError(\"Serialization of lambdas is not supported.\")\n if \"<locals>\" in qualname:\n raise SerializationError(\"Serialization of nested functions is not supported.\")\n\n name = qualname or callable_handle.__name__\n\n # Get the full package path of the function\n module = inspect.getmodule(callable_handle)\n if module is not None:\n full_path = f\"{module.__name__}.{name}\"\n else:\n full_path = name\n return full_path" }, { "class_start_lineno": 20, "class_end_lineno": 281, "func_start_lineno": 154, "func_end_lineno": 183, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serializes the component to a dictionary.\n\n :returns:\n Dictionary with serialized data.\n \"\"\"\n azure_ad_token_provider_name = None\n if self.azure_ad_token_provider:\n azure_ad_token_provider_name = serialize_callable(self.azure_ad_token_provider)\n return default_to_dict(\n self,\n azure_endpoint=self.azure_endpoint,\n azure_deployment=self.azure_deployment,\n dimensions=self.dimensions,\n organization=self.organization,\n api_version=self.api_version,\n prefix=self.prefix,\n suffix=self.suffix,\n batch_size=self.batch_size,\n progress_bar=self.progress_bar,\n meta_fields_to_embed=self.meta_fields_to_embed,\n embedding_separator=self.embedding_separator,\n api_key=self.api_key.to_dict() if self.api_key is not None else None,\n azure_ad_token=self.azure_ad_token.to_dict() if self.azure_ad_token is not None else None,\n timeout=self.timeout,\n max_retries=self.max_retries,\n default_headers=self.default_headers,\n azure_ad_token_provider=azure_ad_token_provider_name,\n )" }, { "class_start_lineno": 1, "class_end_lineno": 170, "func_start_lineno": 159, "func_end_lineno": 170, "func_code": "def thread_safe_import(module_name: str) -> ModuleType:\n \"\"\"\n Import a module in a thread-safe manner.\n\n Importing modules in a multi-threaded environment can lead to race conditions.\n This function ensures that the module is imported in a thread-safe manner without having impact\n on the performance of the import for single-threaded environments.\n\n :param module_name: the module to import\n \"\"\"\n with _import_lock:\n return importlib.import_module(module_name)" }, { "class_start_lineno": 1, "class_end_lineno": 80, "func_start_lineno": 45, "func_end_lineno": 80, "func_code": "def deserialize_callable(callable_handle: str) -> Callable:\n \"\"\"\n Deserializes a callable given its full import path as a string.\n\n :param callable_handle: The full path of the callable_handle\n :return: The callable\n :raises DeserializationError: If the callable cannot be found\n \"\"\"\n parts = callable_handle.split(\".\")\n\n for i in range(len(parts), 0, -1):\n module_name = \".\".join(parts[:i])\n try:\n mod: Any = thread_safe_import(module_name)\n except Exception:\n # keep reducing i until we find a valid module import\n continue\n\n attr_value = mod\n for part in parts[i:]:\n try:\n attr_value = getattr(attr_value, part)\n except AttributeError as e:\n raise DeserializationError(f\"Could not find attribute '{part}' in {attr_value.__name__}\") from e\n\n # when the attribute is a classmethod, we need the underlying function\n if isinstance(attr_value, (classmethod, staticmethod)):\n attr_value = attr_value.__func__\n\n if not callable(attr_value):\n raise DeserializationError(f\"The final attribute is not callable: {attr_value}\")\n\n return attr_value\n\n # Fallback if we never find anything\n raise DeserializationError(f\"Could not import '{callable_handle}' as a module or callable.\")" } ]
[ "function_empty", "Development" ]
[ "haystack.utils.callable_serialization.serialize_callable", "haystack.components.embedders.azure_document_embedder.AzureOpenAIDocumentEmbedder.to_dict", "haystack.utils.type_serialization.thread_safe_import", "haystack.utils.callable_serialization.deserialize_callable" ]
Python
3
4
{ "total_num": 6, "base_passed_num": 3 }
[ "haystack.haystack.utils.callable_serialization.serialize_callable", "haystack.haystack.components.embedders.azure_text_embedder.AzureOpenAITextEmbedder::to_dict", "haystack.haystack.utils.type_serialization.thread_safe_import", "haystack.haystack.utils.callable_serialization.deserialize_callable" ]
haystack
[ "haystack/utils/callable_serialization.py", "haystack/components/embedders/azure_text_embedder.py", "haystack/utils/type_serialization.py", "haystack/utils/callable_serialization.py" ]
[ "test/components/embedders/test_azure_text_embedder.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 80, "func_start_lineno": 12, "func_end_lineno": 42, "func_code": "def serialize_callable(callable_handle: Callable) -> str:\n \"\"\"\n Serializes a callable to its full path.\n\n :param callable_handle: The callable to serialize\n :return: The full path of the callable\n \"\"\"\n try:\n full_arg_spec = inspect.getfullargspec(callable_handle)\n is_instance_method = bool(full_arg_spec.args and full_arg_spec.args[0] == \"self\")\n except TypeError:\n is_instance_method = False\n if is_instance_method:\n raise SerializationError(\"Serialization of instance methods is not supported.\")\n\n # __qualname__ contains the fully qualified path we need for classmethods and staticmethods\n qualname = getattr(callable_handle, \"__qualname__\", \"\")\n if \"<lambda>\" in qualname:\n raise SerializationError(\"Serialization of lambdas is not supported.\")\n if \"<locals>\" in qualname:\n raise SerializationError(\"Serialization of nested functions is not supported.\")\n\n name = qualname or callable_handle.__name__\n\n # Get the full package path of the function\n module = inspect.getmodule(callable_handle)\n if module is not None:\n full_path = f\"{module.__name__}.{name}\"\n else:\n full_path = name\n return full_path" }, { "class_start_lineno": 15, "class_end_lineno": 216, "func_start_lineno": 136, "func_end_lineno": 161, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serializes the component to a dictionary.\n\n :returns:\n Dictionary with serialized data.\n \"\"\"\n azure_ad_token_provider_name = None\n if self.azure_ad_token_provider:\n azure_ad_token_provider_name = serialize_callable(self.azure_ad_token_provider)\n return default_to_dict(\n self,\n azure_endpoint=self.azure_endpoint,\n azure_deployment=self.azure_deployment,\n dimensions=self.dimensions,\n organization=self.organization,\n api_version=self.api_version,\n prefix=self.prefix,\n suffix=self.suffix,\n api_key=self.api_key.to_dict() if self.api_key is not None else None,\n azure_ad_token=self.azure_ad_token.to_dict() if self.azure_ad_token is not None else None,\n timeout=self.timeout,\n max_retries=self.max_retries,\n default_headers=self.default_headers,\n azure_ad_token_provider=azure_ad_token_provider_name,\n )" }, { "class_start_lineno": 1, "class_end_lineno": 170, "func_start_lineno": 159, "func_end_lineno": 170, "func_code": "def thread_safe_import(module_name: str) -> ModuleType:\n \"\"\"\n Import a module in a thread-safe manner.\n\n Importing modules in a multi-threaded environment can lead to race conditions.\n This function ensures that the module is imported in a thread-safe manner without having impact\n on the performance of the import for single-threaded environments.\n\n :param module_name: the module to import\n \"\"\"\n with _import_lock:\n return importlib.import_module(module_name)" }, { "class_start_lineno": 1, "class_end_lineno": 80, "func_start_lineno": 45, "func_end_lineno": 80, "func_code": "def deserialize_callable(callable_handle: str) -> Callable:\n \"\"\"\n Deserializes a callable given its full import path as a string.\n\n :param callable_handle: The full path of the callable_handle\n :return: The callable\n :raises DeserializationError: If the callable cannot be found\n \"\"\"\n parts = callable_handle.split(\".\")\n\n for i in range(len(parts), 0, -1):\n module_name = \".\".join(parts[:i])\n try:\n mod: Any = thread_safe_import(module_name)\n except Exception:\n # keep reducing i until we find a valid module import\n continue\n\n attr_value = mod\n for part in parts[i:]:\n try:\n attr_value = getattr(attr_value, part)\n except AttributeError as e:\n raise DeserializationError(f\"Could not find attribute '{part}' in {attr_value.__name__}\") from e\n\n # when the attribute is a classmethod, we need the underlying function\n if isinstance(attr_value, (classmethod, staticmethod)):\n attr_value = attr_value.__func__\n\n if not callable(attr_value):\n raise DeserializationError(f\"The final attribute is not callable: {attr_value}\")\n\n return attr_value\n\n # Fallback if we never find anything\n raise DeserializationError(f\"Could not import '{callable_handle}' as a module or callable.\")" } ]
[ "function_empty", "Development" ]
[ "haystack.utils.callable_serialization.serialize_callable", "haystack.components.embedders.azure_text_embedder.AzureOpenAITextEmbedder.to_dict", "haystack.utils.type_serialization.thread_safe_import", "haystack.utils.callable_serialization.deserialize_callable" ]
Python
3
4
{ "total_num": 5, "base_passed_num": 2 }
[ "haystack.haystack.components.embedders.hugging_face_api_document_embedder.HuggingFaceAPIDocumentEmbedder::_embed_batch", "haystack.haystack.components.embedders.hugging_face_api_document_embedder.HuggingFaceAPIDocumentEmbedder::run" ]
haystack
[ "haystack/components/embedders/hugging_face_api_document_embedder.py", "haystack/components/embedders/hugging_face_api_document_embedder.py" ]
[ "test/components/embedders/test_hugging_face_api_document_embedder.py" ]
[ { "class_start_lineno": 24, "class_end_lineno": 298, "func_start_lineno": 236, "func_end_lineno": 271, "func_code": " def _embed_batch(self, texts_to_embed: List[str], batch_size: int) -> List[List[float]]:\n \"\"\"\n Embed a list of texts in batches.\n \"\"\"\n truncate = self.truncate\n normalize = self.normalize\n\n if self.api_type == HFEmbeddingAPIType.SERVERLESS_INFERENCE_API:\n if truncate is not None:\n msg = \"`truncate` parameter is not supported for Serverless Inference API. It will be ignored.\"\n warnings.warn(msg)\n truncate = None\n if normalize is not None:\n msg = \"`normalize` parameter is not supported for Serverless Inference API. It will be ignored.\"\n warnings.warn(msg)\n normalize = None\n\n all_embeddings = []\n for i in tqdm(\n range(0, len(texts_to_embed), batch_size), disable=not self.progress_bar, desc=\"Calculating embeddings\"\n ):\n batch = texts_to_embed[i : i + batch_size]\n\n np_embeddings = self._client.feature_extraction(\n # this method does not officially support list of strings, but works as expected\n text=batch, # type: ignore[arg-type]\n truncate=truncate,\n normalize=normalize,\n )\n\n if np_embeddings.ndim != 2 or np_embeddings.shape[0] != len(batch):\n raise ValueError(f\"Expected embedding shape ({batch_size}, embedding_dim), got {np_embeddings.shape}\")\n\n all_embeddings.extend(np_embeddings.tolist())\n\n return all_embeddings" }, { "class_start_lineno": 24, "class_end_lineno": 298, "func_start_lineno": 274, "func_end_lineno": 298, "func_code": " def run(self, documents: List[Document]):\n \"\"\"\n Embeds a list of documents.\n\n :param documents:\n Documents to embed.\n\n :returns:\n A dictionary with the following keys:\n - `documents`: A list of documents with embeddings.\n \"\"\"\n if not isinstance(documents, list) or documents and not isinstance(documents[0], Document):\n raise TypeError(\n \"HuggingFaceAPIDocumentEmbedder expects a list of Documents as input.\"\n \" In case you want to embed a string, please use the HuggingFaceAPITextEmbedder.\"\n )\n\n texts_to_embed = self._prepare_texts_to_embed(documents=documents)\n\n embeddings = self._embed_batch(texts_to_embed=texts_to_embed, batch_size=self.batch_size)\n\n for doc, emb in zip(documents, embeddings):\n doc.embedding = emb\n\n return {\"documents\": documents}" } ]
[ "function_empty", "Development" ]
[ "haystack.components.embedders.hugging_face_api_document_embedder.HuggingFaceAPIDocumentEmbedder._embed_batch", "haystack.components.embedders.hugging_face_api_document_embedder.HuggingFaceAPIDocumentEmbedder.run" ]
Python
1
2
{ "total_num": 17, "base_passed_num": 12 }
[ "haystack.haystack.components.embedders.openai_document_embedder.OpenAIDocumentEmbedder::_prepare_texts_to_embed", "haystack.haystack.components.embedders.openai_document_embedder.OpenAIDocumentEmbedder::_embed_batch", "haystack.haystack.components.embedders.openai_document_embedder.OpenAIDocumentEmbedder::run" ]
haystack
[ "haystack/components/embedders/openai_document_embedder.py", "haystack/components/embedders/openai_document_embedder.py", "haystack/components/embedders/openai_document_embedder.py" ]
[ "test/components/embedders/test_openai_document_embedder.py" ]
[ { "class_start_lineno": 19, "class_end_lineno": 245, "func_start_lineno": 164, "func_end_lineno": 181, "func_code": " def _prepare_texts_to_embed(self, documents: List[Document]) -> Dict[str, str]:\n \"\"\"\n Prepare the texts to embed by concatenating the Document text with the metadata fields to embed.\n \"\"\"\n texts_to_embed = {}\n for doc in documents:\n meta_values_to_embed = [\n str(doc.meta[key]) for key in self.meta_fields_to_embed if key in doc.meta and doc.meta[key] is not None\n ]\n\n text_to_embed = (\n self.prefix + self.embedding_separator.join(meta_values_to_embed + [doc.content or \"\"]) + self.suffix\n )\n\n # copied from OpenAI embedding_utils (https://github.com/openai/openai-python/blob/main/openai/embeddings_utils.py)\n # replace newlines, which can negatively affect performance.\n texts_to_embed[doc.id] = text_to_embed.replace(\"\\n\", \" \")\n return texts_to_embed" }, { "class_start_lineno": 19, "class_end_lineno": 245, "func_start_lineno": 183, "func_end_lineno": 217, "func_code": " def _embed_batch(self, texts_to_embed: Dict[str, str], batch_size: int) -> Tuple[List[List[float]], Dict[str, Any]]:\n \"\"\"\n Embed a list of texts in batches.\n \"\"\"\n\n all_embeddings = []\n meta: Dict[str, Any] = {}\n for batch in tqdm(\n batched(texts_to_embed.items(), batch_size), disable=not self.progress_bar, desc=\"Calculating embeddings\"\n ):\n args: Dict[str, Any] = {\"model\": self.model, \"input\": [b[1] for b in batch]}\n\n if self.dimensions is not None:\n args[\"dimensions\"] = self.dimensions\n\n try:\n response = self.client.embeddings.create(**args)\n except APIError as exc:\n ids = \", \".join(b[0] for b in batch)\n msg = \"Failed embedding of documents {ids} caused by {exc}\"\n logger.exception(msg, ids=ids, exc=exc)\n continue\n\n embeddings = [el.embedding for el in response.data]\n all_embeddings.extend(embeddings)\n\n if \"model\" not in meta:\n meta[\"model\"] = response.model\n if \"usage\" not in meta:\n meta[\"usage\"] = dict(response.usage)\n else:\n meta[\"usage\"][\"prompt_tokens\"] += response.usage.prompt_tokens\n meta[\"usage\"][\"total_tokens\"] += response.usage.total_tokens\n\n return all_embeddings, meta" }, { "class_start_lineno": 19, "class_end_lineno": 245, "func_start_lineno": 220, "func_end_lineno": 245, "func_code": " def run(self, documents: List[Document]):\n \"\"\"\n Embeds a list of documents.\n\n :param documents:\n A list of documents to embed.\n\n :returns:\n A dictionary with the following keys:\n - `documents`: A list of documents with embeddings.\n - `meta`: Information about the usage of the model.\n \"\"\"\n if not isinstance(documents, list) or documents and not isinstance(documents[0], Document):\n raise TypeError(\n \"OpenAIDocumentEmbedder expects a list of Documents as input.\"\n \"In case you want to embed a string, please use the OpenAITextEmbedder.\"\n )\n\n texts_to_embed = self._prepare_texts_to_embed(documents=documents)\n\n embeddings, meta = self._embed_batch(texts_to_embed=texts_to_embed, batch_size=self.batch_size)\n\n for doc, emb in zip(documents, embeddings):\n doc.embedding = emb\n\n return {\"documents\": documents, \"meta\": meta}" } ]
[ "function_empty", "Development" ]
[ "haystack.components.embedders.openai_document_embedder.OpenAIDocumentEmbedder._prepare_texts_to_embed", "haystack.components.embedders.openai_document_embedder.OpenAIDocumentEmbedder._embed_batch", "haystack.components.embedders.openai_document_embedder.OpenAIDocumentEmbedder.run" ]
Python
2
3
{ "total_num": 11, "base_passed_num": 4 }
[ "haystack.haystack.utils.device.ComponentDevice::to_dict", "haystack.haystack.components.embedders.sentence_transformers_document_embedder.SentenceTransformersDocumentEmbedder::to_dict" ]
haystack
[ "haystack/utils/device.py", "haystack/components/embedders/sentence_transformers_document_embedder.py" ]
[ "test/components/embedders/test_sentence_transformers_document_embedder.py" ]
[ { "class_start_lineno": 240, "class_end_lineno": 480, "func_start_lineno": 450, "func_end_lineno": 463, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Convert the component device representation to a JSON-serializable dictionary.\n\n :returns:\n The dictionary representation.\n \"\"\"\n if self._single_device is not None:\n return {\"type\": \"single\", \"device\": str(self._single_device)}\n elif self._multiple_devices is not None:\n return {\"type\": \"multiple\", \"device_map\": self._multiple_devices.to_dict()}\n else:\n # Unreachable\n assert False" }, { "class_start_lineno": 16, "class_end_lineno": 256, "func_start_lineno": 145, "func_end_lineno": 175, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serializes the component to a dictionary.\n\n :returns:\n Dictionary with serialized data.\n \"\"\"\n serialization_dict = default_to_dict(\n self,\n model=self.model,\n device=self.device.to_dict(),\n token=self.token.to_dict() if self.token else None,\n prefix=self.prefix,\n suffix=self.suffix,\n batch_size=self.batch_size,\n progress_bar=self.progress_bar,\n normalize_embeddings=self.normalize_embeddings,\n meta_fields_to_embed=self.meta_fields_to_embed,\n embedding_separator=self.embedding_separator,\n trust_remote_code=self.trust_remote_code,\n truncate_dim=self.truncate_dim,\n model_kwargs=self.model_kwargs,\n tokenizer_kwargs=self.tokenizer_kwargs,\n config_kwargs=self.config_kwargs,\n precision=self.precision,\n encode_kwargs=self.encode_kwargs,\n backend=self.backend,\n )\n if serialization_dict[\"init_parameters\"].get(\"model_kwargs\") is not None:\n serialize_hf_model_kwargs(serialization_dict[\"init_parameters\"][\"model_kwargs\"])\n return serialization_dict" } ]
[ "function_empty" ]
[ "haystack.utils.device.ComponentDevice.to_dict", "haystack.components.embedders.sentence_transformers_document_embedder.SentenceTransformersDocumentEmbedder.to_dict" ]
Python
2
2
{ "total_num": 18, "base_passed_num": 15 }
[ "haystack.haystack.utils.device.ComponentDevice::to_dict", "haystack.haystack.components.embedders.sentence_transformers_text_embedder.SentenceTransformersTextEmbedder::to_dict", "haystack.haystack.utils.device.ComponentDevice::to_torch_str", "haystack.haystack.components.embedders.sentence_transformers_text_embedder.SentenceTransformersTextEmbedder::warm_up" ]
haystack
[ "haystack/utils/device.py", "haystack/components/embedders/sentence_transformers_text_embedder.py", "haystack/utils/device.py", "haystack/components/embedders/sentence_transformers_text_embedder.py" ]
[ "test/components/embedders/test_sentence_transformers_text_embedder.py" ]
[ { "class_start_lineno": 240, "class_end_lineno": 480, "func_start_lineno": 450, "func_end_lineno": 463, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Convert the component device representation to a JSON-serializable dictionary.\n\n :returns:\n The dictionary representation.\n \"\"\"\n if self._single_device is not None:\n return {\"type\": \"single\", \"device\": str(self._single_device)}\n elif self._multiple_devices is not None:\n return {\"type\": \"multiple\", \"device_map\": self._multiple_devices.to_dict()}\n else:\n # Unreachable\n assert False" }, { "class_start_lineno": 16, "class_end_lineno": 229, "func_start_lineno": 133, "func_end_lineno": 161, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serializes the component to a dictionary.\n\n :returns:\n Dictionary with serialized data.\n \"\"\"\n serialization_dict = default_to_dict(\n self,\n model=self.model,\n device=self.device.to_dict(),\n token=self.token.to_dict() if self.token else None,\n prefix=self.prefix,\n suffix=self.suffix,\n batch_size=self.batch_size,\n progress_bar=self.progress_bar,\n normalize_embeddings=self.normalize_embeddings,\n trust_remote_code=self.trust_remote_code,\n truncate_dim=self.truncate_dim,\n model_kwargs=self.model_kwargs,\n tokenizer_kwargs=self.tokenizer_kwargs,\n config_kwargs=self.config_kwargs,\n precision=self.precision,\n encode_kwargs=self.encode_kwargs,\n backend=self.backend,\n )\n if serialization_dict[\"init_parameters\"].get(\"model_kwargs\") is not None:\n serialize_hf_model_kwargs(serialization_dict[\"init_parameters\"][\"model_kwargs\"])\n return serialization_dict" }, { "class_start_lineno": 240, "class_end_lineno": 480, "func_start_lineno": 321, "func_end_lineno": 336, "func_code": " def to_torch_str(self) -> str:\n \"\"\"\n Convert the component device representation to PyTorch string format.\n\n Device maps are not supported.\n\n :returns:\n The PyTorch device string representation.\n \"\"\"\n self._validate()\n\n if self._single_device is None:\n raise ValueError(\"Only single devices can be converted to PyTorch format\")\n\n assert self._single_device is not None\n return str(self._single_device)" }, { "class_start_lineno": 16, "class_end_lineno": 229, "func_start_lineno": 181, "func_end_lineno": 198, "func_code": " def warm_up(self):\n \"\"\"\n Initializes the component.\n \"\"\"\n if self.embedding_backend is None:\n self.embedding_backend = _SentenceTransformersEmbeddingBackendFactory.get_embedding_backend(\n model=self.model,\n device=self.device.to_torch_str(),\n auth_token=self.token,\n trust_remote_code=self.trust_remote_code,\n truncate_dim=self.truncate_dim,\n model_kwargs=self.model_kwargs,\n tokenizer_kwargs=self.tokenizer_kwargs,\n config_kwargs=self.config_kwargs,\n backend=self.backend,\n )\n if self.tokenizer_kwargs and self.tokenizer_kwargs.get(\"model_max_length\"):\n self.embedding_backend.model.max_seq_length = self.tokenizer_kwargs[\"model_max_length\"]" } ]
[ "function_empty", "Development" ]
[ "haystack.utils.device.ComponentDevice.to_dict", "haystack.components.embedders.sentence_transformers_text_embedder.SentenceTransformersTextEmbedder.to_dict", "haystack.utils.device.ComponentDevice.to_torch_str", "haystack.components.embedders.sentence_transformers_text_embedder.SentenceTransformersTextEmbedder.warm_up" ]
Python
3
4
{ "total_num": 19, "base_passed_num": 8 }
[ "haystack.haystack.utils.type_serialization.serialize_type", "haystack.haystack.components.evaluators.llm_evaluator.LLMEvaluator::to_dict", "haystack.haystack.core.serialization.component_to_dict", "haystack.haystack.utils.type_serialization.deserialize_type", "haystack.haystack.core.serialization.component_from_dict" ]
haystack
[ "haystack/utils/type_serialization.py", "haystack/components/evaluators/llm_evaluator.py", "haystack/core/serialization.py", "haystack/utils/type_serialization.py", "haystack/core/serialization.py" ]
[ "test/components/evaluators/test_llm_evaluator.py" ]
[ { "class_start_lineno": 1, "class_end_lineno": 170, "func_start_lineno": 19, "func_end_lineno": 52, "func_code": "def serialize_type(target: Any) -> str:\n \"\"\"\n Serializes a type or an instance to its string representation, including the module name.\n\n This function handles types, instances of types, and special typing objects.\n It assumes that non-typing objects will have a '__name__' attribute.\n\n :param target:\n The object to serialize, can be an instance or a type.\n :return:\n The string representation of the type.\n \"\"\"\n name = getattr(target, \"__name__\", str(target))\n\n # Remove the 'typing.' prefix when using python <3.9\n if name.startswith(\"typing.\"):\n name = name[7:]\n # Remove the arguments from the name when using python <3.9\n if \"[\" in name:\n name = name.split(\"[\")[0]\n\n # Get module name\n module = inspect.getmodule(target)\n module_name = \"\"\n # We omit the module name for builtins to not clutter the output\n if module and hasattr(module, \"__name__\") and module.__name__ != \"builtins\":\n module_name = f\"{module.__name__}\"\n\n args = get_args(target)\n if args:\n args_str = \", \".join([serialize_type(a) for a in args if a is not type(None)])\n return f\"{module_name}.{name}[{args_str}]\" if module_name else f\"{name}[{args_str}]\"\n\n return f\"{module_name}.{name}\" if module_name else f\"{name}\"" }, { "class_start_lineno": 18, "class_end_lineno": 387, "func_start_lineno": 278, "func_end_lineno": 297, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serialize this component to a dictionary.\n\n :returns:\n The serialized component as a dictionary.\n \"\"\"\n # Since we cannot currently serialize tuples, convert the inputs to a list.\n inputs = [[name, serialize_type(type_)] for name, type_ in self.inputs]\n return default_to_dict(\n self,\n instructions=self.instructions,\n inputs=inputs,\n outputs=self.outputs,\n examples=self.examples,\n api=self.api,\n api_key=self.api_key and self.api_key.to_dict(),\n api_params=self.api_params,\n progress_bar=self.progress_bar,\n )" }, { "class_start_lineno": 1, "class_end_lineno": 264, "func_start_lineno": 36, "func_end_lineno": 82, "func_code": "def component_to_dict(obj: Any, name: str) -> Dict[str, Any]:\n \"\"\"\n Converts a component instance into a dictionary.\n\n If a `to_dict` method is present in the component instance, that will be used instead of the default method.\n\n :param obj:\n The component to be serialized.\n :param name:\n The name of the component.\n :returns:\n A dictionary representation of the component.\n\n :raises SerializationError:\n If the component doesn't have a `to_dict` method.\n If the values of the init parameters can't be determined.\n If a non-basic Python type is used in the serialized data.\n \"\"\"\n if hasattr(obj, \"to_dict\"):\n data = obj.to_dict()\n else:\n init_parameters = {}\n for param_name, param in inspect.signature(obj.__init__).parameters.items():\n # Ignore `args` and `kwargs`, used by the default constructor\n if param_name in (\"args\", \"kwargs\"):\n continue\n try:\n # This only works if the Component constructor assigns the init\n # parameter to an instance variable or property with the same name\n param_value = getattr(obj, param_name)\n except AttributeError as e:\n # If the parameter doesn't have a default value, raise an error\n if param.default is param.empty:\n raise SerializationError(\n f\"Cannot determine the value of the init parameter '{param_name}' \"\n f\"for the class {obj.__class__.__name__}.\"\n f\"You can fix this error by assigning 'self.{param_name} = {param_name}' or adding a \"\n f\"custom serialization method 'to_dict' to the class.\"\n ) from e\n # In case the init parameter was not assigned, we use the default value\n param_value = param.default\n init_parameters[param_name] = param_value\n\n data = default_to_dict(obj, **init_parameters)\n\n _validate_component_to_dict_output(obj, name, data)\n return data" }, { "class_start_lineno": 1, "class_end_lineno": 170, "func_start_lineno": 78, "func_end_lineno": 156, "func_code": "def deserialize_type(type_str: str) -> Any: # pylint: disable=too-many-return-statements\n \"\"\"\n Deserializes a type given its full import path as a string, including nested generic types.\n\n This function will dynamically import the module if it's not already imported\n and then retrieve the type object from it. It also handles nested generic types like\n `typing.List[typing.Dict[int, str]]`.\n\n :param type_str:\n The string representation of the type's full import path.\n :returns:\n The deserialized type object.\n :raises DeserializationError:\n If the type cannot be deserialized due to missing module or type.\n \"\"\"\n\n type_mapping = {\n list: typing.List,\n dict: typing.Dict,\n set: typing.Set,\n tuple: typing.Tuple,\n frozenset: typing.FrozenSet,\n }\n\n # Handle generics\n if \"[\" in type_str and type_str.endswith(\"]\"):\n main_type_str, generics_str = type_str.split(\"[\", 1)\n generics_str = generics_str[:-1]\n\n main_type = deserialize_type(main_type_str)\n generic_args = [deserialize_type(arg) for arg in _parse_generic_args(generics_str)]\n\n # Reconstruct\n try:\n if sys.version_info >= (3, 9) or repr(main_type).startswith(\"typing.\"):\n return main_type[tuple(generic_args) if len(generic_args) > 1 else generic_args[0]]\n else:\n return type_mapping[main_type][tuple(generic_args) if len(generic_args) > 1 else generic_args[0]]\n except (TypeError, AttributeError) as e:\n raise DeserializationError(f\"Could not apply arguments {generic_args} to type {main_type}\") from e\n\n # Handle non-generic types\n # First, check if there's a module prefix\n if \".\" in type_str:\n parts = type_str.split(\".\")\n module_name = \".\".join(parts[:-1])\n type_name = parts[-1]\n\n module = sys.modules.get(module_name)\n if module is None:\n try:\n module = thread_safe_import(module_name)\n except ImportError as e:\n raise DeserializationError(f\"Could not import the module: {module_name}\") from e\n\n # Get the class from the module\n if hasattr(module, type_name):\n return getattr(module, type_name)\n\n raise DeserializationError(f\"Could not locate the type: {type_name} in the module: {module_name}\")\n\n # No module prefix, check builtins and typing\n # First check builtins\n if hasattr(builtins, type_str):\n return getattr(builtins, type_str)\n\n # Then check typing\n if hasattr(typing, type_str):\n return getattr(typing, type_str)\n\n # Special case for NoneType\n if type_str == \"NoneType\":\n return type(None)\n\n # Special case for None\n if type_str == \"None\":\n return None\n\n raise DeserializationError(f\"Could not deserialize type: {type_str}\")" }, { "class_start_lineno": 1, "class_end_lineno": 264, "func_start_lineno": 134, "func_end_lineno": 169, "func_code": "def component_from_dict(\n cls: Type[object], data: Dict[str, Any], name: str, callbacks: Optional[DeserializationCallbacks] = None\n) -> Any:\n \"\"\"\n Creates a component instance from a dictionary.\n\n If a `from_dict` method is present in the component class, that will be used instead of the default method.\n\n :param cls:\n The class to be used for deserialization.\n :param data:\n The serialized data.\n :param name:\n The name of the component.\n :param callbacks:\n Callbacks to invoke during deserialization.\n :returns:\n The deserialized component.\n \"\"\"\n\n def component_pre_init_callback(component_cls, init_params):\n assert callbacks is not None\n assert callbacks.component_pre_init is not None\n callbacks.component_pre_init(name, component_cls, init_params)\n\n def do_from_dict():\n if hasattr(cls, \"from_dict\"):\n return cls.from_dict(data)\n\n return default_from_dict(cls, data)\n\n if callbacks is None or callbacks.component_pre_init is None:\n return do_from_dict()\n\n with _hook_component_init(component_pre_init_callback):\n return do_from_dict()" } ]
[ "function_empty", "Development" ]
[ "haystack.utils.type_serialization.serialize_type", "haystack.components.evaluators.llm_evaluator.LLMEvaluator.to_dict", "haystack.core.serialization.component_to_dict", "haystack.utils.type_serialization.deserialize_type", "haystack.core.serialization.component_from_dict" ]
Python
2
5
{ "total_num": 17, "base_passed_num": 12 }
[ "haystack.haystack.utils.device.ComponentDevice::to_dict", "haystack.haystack.components.evaluators.sas_evaluator.SASEvaluator::to_dict" ]
haystack
[ "haystack/utils/device.py", "haystack/components/evaluators/sas_evaluator.py" ]
[ "test/components/evaluators/test_sas_evaluator.py" ]
[ { "class_start_lineno": 240, "class_end_lineno": 480, "func_start_lineno": 450, "func_end_lineno": 463, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Convert the component device representation to a JSON-serializable dictionary.\n\n :returns:\n The dictionary representation.\n \"\"\"\n if self._single_device is not None:\n return {\"type\": \"single\", \"device\": str(self._single_device)}\n elif self._multiple_devices is not None:\n return {\"type\": \"multiple\", \"device_map\": self._multiple_devices.to_dict()}\n else:\n # Unreachable\n assert False" }, { "class_start_lineno": 20, "class_end_lineno": 201, "func_start_lineno": 85, "func_end_lineno": 98, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serialize this component to a dictionary.\n\n :returns:\n The serialized component as a dictionary.\n \"\"\"\n return default_to_dict(\n self,\n model=self._model,\n batch_size=self._batch_size,\n device=self._device.to_dict() if self._device else None,\n token=self._token.to_dict() if self._token else None,\n )" } ]
[ "function_empty" ]
[ "haystack.utils.device.ComponentDevice.to_dict", "haystack.components.evaluators.sas_evaluator.SASEvaluator.to_dict" ]
Python
2
2
{ "total_num": 12, "base_passed_num": 11 }
[ "haystack.haystack.components.generators.chat.openai.OpenAIChatGenerator::to_dict", "haystack.haystack.components.extractors.llm_metadata_extractor.LLMMetadataExtractor::to_dict", "haystack.haystack.components.builders.prompt_builder.PromptBuilder::_validate_variables", "haystack.haystack.components.builders.prompt_builder.PromptBuilder::run", "haystack.haystack.components.extractors.llm_metadata_extractor.LLMMetadataExtractor::_prepare_prompts" ]
haystack
[ "haystack/components/generators/chat/openai.py", "haystack/components/extractors/llm_metadata_extractor.py", "haystack/components/builders/prompt_builder.py", "haystack/components/builders/prompt_builder.py", "haystack/components/extractors/llm_metadata_extractor.py" ]
[ "test/components/extractors/test_llm_metadata_extractor.py" ]
[ { "class_start_lineno": 32, "class_end_lineno": 571, "func_start_lineno": 170, "func_end_lineno": 190, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serialize this component to a dictionary.\n\n :returns:\n The serialized component as a dictionary.\n \"\"\"\n callback_name = serialize_callable(self.streaming_callback) if self.streaming_callback else None\n return default_to_dict(\n self,\n model=self.model,\n streaming_callback=callback_name,\n api_base_url=self.api_base_url,\n organization=self.organization,\n generation_kwargs=self.generation_kwargs,\n api_key=self.api_key.to_dict(),\n timeout=self.timeout,\n max_retries=self.max_retries,\n tools=[tool.to_dict() for tool in self.tools] if self.tools else None,\n tools_strict=self.tools_strict,\n )" }, { "class_start_lineno": 61, "class_end_lineno": 442, "func_start_lineno": 239, "func_end_lineno": 258, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serializes the component to a dictionary.\n\n :returns:\n Dictionary with serialized data.\n \"\"\"\n\n llm_provider = self.llm_provider.to_dict()\n\n return default_to_dict(\n self,\n prompt=self.prompt,\n generator_api=self.generator_api.value,\n generator_api_params=llm_provider[\"init_parameters\"],\n expected_keys=self.expected_keys,\n page_range=self.expanded_range,\n raise_on_failure=self.raise_on_failure,\n max_workers=self.max_workers,\n )" }, { "class_start_lineno": 17, "class_end_lineno": 266, "func_start_lineno": 247, "func_end_lineno": 266, "func_code": " def _validate_variables(self, provided_variables: Set[str]):\n \"\"\"\n Checks if all the required template variables are provided.\n\n :param provided_variables:\n A set of provided template variables.\n :raises ValueError:\n If any of the required template variables is not provided.\n \"\"\"\n if self.required_variables == \"*\":\n required_variables = sorted(self.variables)\n else:\n required_variables = self.required_variables\n missing_variables = [var for var in required_variables if var not in provided_variables]\n if missing_variables:\n missing_vars_str = \", \".join(missing_variables)\n raise ValueError(\n f\"Missing required input variables in PromptBuilder: {missing_vars_str}. \"\n f\"Required variables: {required_variables}. Provided variables: {provided_variables}.\"\n )" }, { "class_start_lineno": 17, "class_end_lineno": 266, "func_start_lineno": 213, "func_end_lineno": 245, "func_code": " def run(self, template: Optional[str] = None, template_variables: Optional[Dict[str, Any]] = None, **kwargs):\n \"\"\"\n Renders the prompt template with the provided variables.\n\n It applies the template variables to render the final prompt. You can provide variables via pipeline kwargs.\n In order to overwrite the default template, you can set the `template` parameter.\n In order to overwrite pipeline kwargs, you can set the `template_variables` parameter.\n\n :param template:\n An optional string template to overwrite PromptBuilder's default template. If None, the default template\n provided at initialization is used.\n :param template_variables:\n An optional dictionary of template variables to overwrite the pipeline variables.\n :param kwargs:\n Pipeline variables used for rendering the prompt.\n\n :returns: A dictionary with the following keys:\n - `prompt`: The updated prompt text after rendering the prompt template.\n\n :raises ValueError:\n If any of the required template variables is not provided.\n \"\"\"\n kwargs = kwargs or {}\n template_variables = template_variables or {}\n template_variables_combined = {**kwargs, **template_variables}\n self._validate_variables(set(template_variables_combined.keys()))\n\n compiled_template = self.template\n if template is not None:\n compiled_template = self._env.from_string(template)\n\n result = compiled_template.render(template_variables_combined)\n return {\"prompt\": result}" }, { "class_start_lineno": 61, "class_end_lineno": 442, "func_start_lineno": 332, "func_end_lineno": 359, "func_code": " def _prepare_prompts(\n self, documents: List[Document], expanded_range: Optional[List[int]] = None\n ) -> List[Union[ChatMessage, None]]:\n all_prompts: List[Union[ChatMessage, None]] = []\n for document in documents:\n if not document.content:\n logger.warning(\"Document {doc_id} has no content. Skipping metadata extraction.\", doc_id=document.id)\n all_prompts.append(None)\n continue\n\n if expanded_range:\n doc_copy = copy.deepcopy(document)\n pages = self.splitter.run(documents=[doc_copy])\n content = \"\"\n for idx, page in enumerate(pages[\"documents\"]):\n if idx + 1 in expanded_range:\n content += page.content\n doc_copy.content = content\n else:\n doc_copy = document\n\n prompt_with_doc = self.builder.run(template=self.prompt, template_variables={\"document\": doc_copy})\n\n # build a ChatMessage with the prompt\n message = ChatMessage.from_user(prompt_with_doc[\"prompt\"])\n all_prompts.append(message)\n\n return all_prompts" } ]
[ "function_empty", "Development" ]
[ "haystack.components.generators.chat.openai.OpenAIChatGenerator.to_dict", "haystack.components.extractors.llm_metadata_extractor.LLMMetadataExtractor.to_dict", "haystack.components.builders.prompt_builder.PromptBuilder._validate_variables", "haystack.components.builders.prompt_builder.PromptBuilder.run", "haystack.components.extractors.llm_metadata_extractor.LLMMetadataExtractor._prepare_prompts" ]
Python
4
5
{ "total_num": 13, "base_passed_num": 9 }
[ "haystack.haystack.components.extractors.named_entity_extractor.NamedEntityExtractor::to_dict", "haystack.haystack.core.serialization.component_to_dict" ]
haystack
[ "haystack/components/extractors/named_entity_extractor.py", "haystack/core/serialization.py" ]
[ "test/components/extractors/test_named_entity_extractor.py" ]
[ { "class_start_lineno": 78, "class_end_lineno": 275, "func_start_lineno": 212, "func_end_lineno": 232, "func_code": " def to_dict(self) -> Dict[str, Any]:\n \"\"\"\n Serializes the component to a dictionary.\n\n :returns:\n Dictionary with serialized data.\n \"\"\"\n serialization_dict = default_to_dict(\n self,\n backend=self._backend.type.name,\n model=self._backend.model_name,\n device=self._backend.device.to_dict(),\n pipeline_kwargs=self._backend._pipeline_kwargs,\n token=self.token.to_dict() if self.token else None,\n )\n\n hf_pipeline_kwargs = serialization_dict[\"init_parameters\"][\"pipeline_kwargs\"]\n hf_pipeline_kwargs.pop(\"token\", None)\n\n serialize_hf_model_kwargs(hf_pipeline_kwargs)\n return serialization_dict" }, { "class_start_lineno": 1, "class_end_lineno": 264, "func_start_lineno": 36, "func_end_lineno": 82, "func_code": "def component_to_dict(obj: Any, name: str) -> Dict[str, Any]:\n \"\"\"\n Converts a component instance into a dictionary.\n\n If a `to_dict` method is present in the component instance, that will be used instead of the default method.\n\n :param obj:\n The component to be serialized.\n :param name:\n The name of the component.\n :returns:\n A dictionary representation of the component.\n\n :raises SerializationError:\n If the component doesn't have a `to_dict` method.\n If the values of the init parameters can't be determined.\n If a non-basic Python type is used in the serialized data.\n \"\"\"\n if hasattr(obj, \"to_dict\"):\n data = obj.to_dict()\n else:\n init_parameters = {}\n for param_name, param in inspect.signature(obj.__init__).parameters.items():\n # Ignore `args` and `kwargs`, used by the default constructor\n if param_name in (\"args\", \"kwargs\"):\n continue\n try:\n # This only works if the Component constructor assigns the init\n # parameter to an instance variable or property with the same name\n param_value = getattr(obj, param_name)\n except AttributeError as e:\n # If the parameter doesn't have a default value, raise an error\n if param.default is param.empty:\n raise SerializationError(\n f\"Cannot determine the value of the init parameter '{param_name}' \"\n f\"for the class {obj.__class__.__name__}.\"\n f\"You can fix this error by assigning 'self.{param_name} = {param_name}' or adding a \"\n f\"custom serialization method 'to_dict' to the class.\"\n ) from e\n # In case the init parameter was not assigned, we use the default value\n param_value = param.default\n init_parameters[param_name] = param_value\n\n data = default_to_dict(obj, **init_parameters)\n\n _validate_component_to_dict_output(obj, name, data)\n return data" } ]
[ "function_empty", "Development" ]
[ "haystack.components.extractors.named_entity_extractor.NamedEntityExtractor.to_dict", "haystack.core.serialization.component_to_dict" ]
Python
1
2
{ "total_num": 7, "base_passed_num": 2 }
End of preview. Expand in Data Studio

Multi Testcases for CoreCodeBench

File Explanation

  • CoreCodeBench_Multi.jsonl Multi test cases for CoreCodeBench.

  • CoreCodeBench_Difficult.jsonl More difficult version for CoreCodeBench multi test cases .

Key Explanation

Key Meaning/Description
id A list of unique identifiers for the functions to be completed, typically in the format module.path.Class::function.
project The name of the project this data is associated with.
origin_file A list of file paths indicating where each function or method is defined in the source code.
test_list A list of file paths for test scripts that are related to or used for testing the functions/methods.
prob_info A list of dictionaries, each containing detailed information about a function or method, such as its starting and ending line numbers, and the actual code (func_code). Each dictionary includes: class_start_lineno/class_end_lineno: Line numbers where the class begins and ends. func_start_lineno/func_end_lineno: Line numbers where the function begins and ends. func_code: The source code of the function as a string.
type A list indicating the type or category of the functions/methods (e.g., "function_empty").
node A list of fully qualified names (with module and class) for each function/method.
language The programming language used.
toolfunc_count The number of tool-related functions in the data.
func_count The total number of atomic functions in the data.
pytest_info A dictionary with information about pytest test results: total_num is the total number of unit tests, while base_passed_num is the number of base tests that passed.
Downloads last month
10